Interactive Design.

We imaginate, corroborate, elaborate, amalgamate, create, and then...


01 Journey Back A Multi-user Room-Scale VR Museum Experience

Created for the Illinois Holocaust Museum and Education Center, this room-scale VR piece details the stories of Fritzie Fritzshall and George Brent, two survivors of the Holocaust.

Journey Back uniquely combines 360º stereoscopic video sequences with animated 3D environments created from photogrammetric scans of historical spaces, taking full advantage of the capabilities of room-scale VR. Produced at stereoscopic 8K @ 59.94 frames per second, Journey Back proved to be a technical challenge.

Aside from the challenges of coordinating with a team in Ukraine shooting on location in Poland, and of shooting 360º stereo over green screen in the US for the 3D environments, we had the monumental task of finding the most tasteful approach to an incredibly sensitive subject. Once we had captured the footage for the show, we needed to set aside roughly 50TB of storage for manipulating it, including VFX, stitching, and cleanup. Working with Winikur Productions, we were given an offline edit that needed to be brought online at full resolution with final editorial, VFX, color correction, and six-track ambisonic sound.

As the nuts and bolts of the frames came together, we simultaneously built the delivery system: a Unity application viewed on Vive Pro headsets in the museum space. Our partners at Gallagher Design had a bold vision to build the surrounding space with empty picture frames, and our task was to have those frames shown and aligned in VR the moment the headset is put on.

We needed a method of starting all 8 headsets at the same time, so we also built a show controller using our VFS backend to ensure that every seat has the correct experience, with the correct timing and framing on the walls. This is run from a simple-to-use interface on an iPad. Showtime displays at the entrance of the exhibit indicate which show is up next and when. Because staff needed a simple way to administer schedule changes, we linked Google Calendar into our show controller. This allows staff to trigger a series of events which simultaneously change the video content playing on the projectors, the lighting, and the audio source, all from a mobile device. The VFS communicates with all of these devices and applications to orchestrate the show.
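As a rough sketch of how a calendar-driven show controller can work, the loop below polls a parsed schedule and broadcasts a start message to every headset at once. The headset addresses, message format, and function names here are illustrative assumptions, not the actual VFS protocol.

```python
import datetime
import socket

# Illustrative only: in production the headsets are reached through the VFS.
HEADSET_ADDRS = [("192.168.1.%d" % (10 + i), 9000) for i in range(8)]

def next_show(events, now):
    """Return the earliest (start_time, title) event still in the future.

    `events` would be parsed from the linked Google Calendar feed.
    """
    upcoming = [e for e in events if e[0] > now]
    return min(upcoming, default=None)

def trigger_show(title, addrs=HEADSET_ADDRS):
    """Broadcast a start message so every seat begins at the same moment."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = ("start:%s" % title).encode("utf-8")
    for addr in addrs:
        sock.sendto(payload, addr)
    sock.close()
```

A scheduling loop would call `next_show` on each tick and fire `trigger_show` when the start time arrives.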

A 360º video-only version was created for devices without room-scale capability, including Google Cardboard and a variety of other devices, for use when traveling to schools or events. It required a massive amount of tech to get there, and it tells a beautiful message: We will survive.

A Promise Kept is a gripping virtual reality experience that takes you on a journey back to the notorious Auschwitz killing center with Survivor Fritzie Fritzshall, as she fulfills her promise to the 599 women who helped save her life.

Don't Forget Me combines storytelling and cutting-edge VR technology, allowing audiences to journey from a happy childhood through the descent into the Holocaust with Survivor George Brent, whose father heartbreakingly told him “don’t forget me” as they arrived at Auschwitz-Birkenau.

Eyelash provided technical advisement, 8K 360 stitching, creative, compositing, graphics, software development, Unity app development, creative experience, installation, and the iPad control system using our VFS.

Created by The Illinois Holocaust Museum and Education Center
Directed by Ken Winikur and Ariel Efron
Produced by Gallagher & Associates and 30 Ninjas
Ambisonic Sound by SilVR
Mix and Audio Design by Robin Shore
Music by Luke Allen
Photogrammetry by Konstanty Kulik

Stereoscopic Room-scale VR 8K @ 59.94 fps
Running Length: 14:51 (Fritzie), 12:44 (George)
Ambisonic Spatial Audio
Unity Application

02 ArtWorld A creative arts metaverse.

Under development with 3Beep and Toppan, ArtWorld is a VR platform for consuming and creating art in a massive multiplayer online VR environment.

ArtWorld takes advantage of the limitless possibilities of VR space to create art and share it on an international social platform. We are developing the tools needed to allow people to come together and express themselves regardless of language or culture.

The platform is written in Unity, and we are collaborating with multiple contributing entities from across the world to make it robust and extensible.

Our roadmap includes creative spaces for music, painting, gardening, sculpture, and graffiti, as well as gallery spaces and concert halls. We are designing worlds that are both pleasant and intuitive to use. This is no easy task, especially when weighed against our goals of fostering a creative and fun educational medium in a growing industry.

One of the most integral pieces of the puzzle is the social aspect that VR offers. So much of the fun and enjoyment of VR is realized in small moments of discovery shared with other people: sharing tracks, jamming with your friends, socializing in real time across continents and languages, and experiencing concerts. This makes for an unforgettable experience and puts collaboration front and center. We want you to explore, learn, expand your horizons, and make something you're proud of.

Not only do we want you to make something from your vision, but we also want to encourage exploring the different world offerings. Each world has a different purpose and function, so whether you're into museum experiences, festivals, or just hanging out, ArtWorld will have something for you.

This is designed for everyone to experience. Creators and lovers, amateurs and professionals can all find something here to enjoy and share with the world.

03 Installations Where digital meets physical.

Much of our work involves physical installation and system design. This may include anything from designing a network to choosing machine specs to contemplating device control like curtains or doors. This is where our VFS really shines.

Central Park Tower

Central Park Tower is one of the newest additions to Billionaires' Row on 57th Street in Manhattan, NY.

Eyelash created a control system for Extell's sales gallery, which brokers use to present information about the Tower to potential buyers. The sales gallery has a variety of spaces, all controllable from the iPad interface, including an immersive Cave Room from Igloo. It features various displays and projection spaces, a scale model of the building in which the interface can illuminate individual units, and various showrooms with controllable lighting.

Utilizing our VFS server, we are able to integrate all the different aspects of the space and control them through a single interface. As brokers move through the space, they can simultaneously trigger a door to open, change the audio source or volume playing through the house speakers, and start a video in the next room.

The impressive and expansive building has much to show, and the presentation system makes over 1,000 pieces of media available for quick access across multiple devices simultaneously. The system supports remote administration and debugging, and includes a "party mode" and a "special group mode" for private investors.

Eyelash created a 3D model of the building that can be shown on any of the displays. The broker can select any residence, and the building model indicates where the unit is within the building, then splits the building to reveal the floor plan.

Sales gallery designed by Extell
Creative design by Wordsearch
Immersive space created by Igloo
Network and displays by JDAV

50 West 66th Street

On the Upper West Side of Manhattan, Extell's 50W66 is a beautiful piece of architecture.

For this project, Eyelash worked with Extell and other creatives to create a sequenced flow that steps guests through the show from the moment of arrival to the moment of departure from the sales experience. The iPad interface controls all aspects of the various rooms in concert: manual and automatic controls set the lighting, and the system switches inputs from house music to film sound. Additionally, there is unit-selectable and pre-programmed lighting on the physical model, specific content presented via the projector (which requires additional automatic lighting control), a multi-display grid, and various other screens throughout the space.

While the system has various steps of automated behavior, it is also flexible enough for the broker to deviate from the standard show; they are able to present any content on any of the displays within the gallery at any time. Ease of use and flexibility are key: they let a broker concentrate on the conversation with the guest and adjust the presentation to create a bespoke experience for anyone's needs.

We write custom drivers for our VFS, which allows the flexibility to produce bespoke presentation systems of any type. The expandability of the VFS enables communication with new tools as they become available to us.

Sales gallery design by Extell
Creative design by Pandiscio Green

Sales Tool Installations

Objects in Mirror AR Closer Than They Appear

Objects fuses augmented reality with an immersive theater installation, inviting audiences to reflect on the relationship between new media and archaic objects, 21st-century technology and 19th-century magic, memory and optical illusion.

Using Vuforia for object tracking, Objects has a variety of augmentations that people can discover as they explore the space. The most noticeable augmentation is a series of 'portals' into VR space positioned throughout the physical space, in places like the inside of a box, a picture frame, or a car window. As the viewer approaches a portal, the perspective into the digital space updates appropriately. Individual items scattered throughout the space trigger content ranging from home videos to full 360º VR scenes.

The piece is based on the critically acclaimed theatrical performance The Object Lesson by Geoff Sobelle and creates a philosophical playground to explore the shifting relationship between images, memories, and things.

The AR application is designed to support and enhance this experience of exploration and discovery.

Audience members find headsets within the space and are encouraged to explore the space both with and without the headset, allowing them to interact with the physical objects and then to examine the object with one of the headsets to see if there is more to discover. Various objects within the theater space act as AR triggers that are recognized by the application.

The phones are seated within vintage stereoscopic viewers and other unique headsets that allow for camera pass-through, and audience members see associated content overlaid on top of the real world triggers.

This experience drew many viewers through the otherwise crowded Tribeca Film Festival, and as an asynchronous exploration it made a huge splash: Artsy hailed Objects "a seductive meditation" and "an impressive technical and social experiment." People exploring the space found things that were not fully understandable or attainable, but were enjoyable discoveries all the same. Little bits of joy in a card catalogue.

Credits: Graham Sack, Geoff Sobelle, John Fitzgerald, Matthew Niederhauser

Modern Passage

Modern Passage was designed in collaboration with Kyle Farmer at the Fashion Institute of Technology Fashion Design MFA Program.

An art installation in their public space, the array of 5 displays made use of our camera tracking software to show how cultural diversity stimulates and fuels creativity while exploring the hidden artistic underpinnings of FIT's diverse origins.

The piece transitions between traditional clothing and modern fashion based on movement: our filmed subjects start in profile, each in one of two outfits, either a traditional cultural outfit or an original contemporary outfit designed by the student.

Visitors passing the installation are tracked with our tracking software, Radon, and our filmed subjects turn toward the viewer while morphing between the two sets of clothing. A subject follows the visitor with their head as they walk past. When the passageway is busy, the subjects constantly change between the two modes of dress, creating a cascading ripple effect. As the space empties, the subjects return to profile, following the flow of the leaving crowd.
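The tracking behavior above can be sketched minimally as follows. The function names, coordinate conventions, and thresholds are hypothetical illustrations, not Radon's actual API.

```python
import math

def head_yaw(subject_x, viewer_x, viewer_depth):
    """Degrees a filmed subject turns toward a passing viewer.

    Positions are meters along the display wall; viewer_depth is the
    viewer's distance from the wall. 0 means facing straight out.
    """
    return math.degrees(math.atan2(viewer_x - subject_x, viewer_depth))

def display_mode(num_visitors, busy_threshold=4):
    """A busy passageway makes subjects cycle between the two modes of
    dress; an empty space returns them to profile."""
    if num_visitors == 0:
        return "profile"
    return "cycling" if num_visitors >= busy_threshold else "tracking"
```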

Department Head Jonathan KYLE Farmer
Communications Design Professor C.J. Yeh

04 Lincoln in the Bardo A VR Experience for NYTVR

On February 22, 1862, Willie Lincoln, Abraham Lincoln’s youngest son, is laid to rest in a marble crypt in a Georgetown cemetery. That very night Abraham Lincoln arrives at the cemetery shattered by grief and, under cover of darkness, visits the crypt to spend time with his son’s body.

Set over the course of one night and narrated by a chorus of ghosts of the recently passed and the long dead, Lincoln in the Bardo is a thrilling exploration of death, grief, and the powers of good and evil. Based on the novel by MacArthur “Genius” Grant-winning writer George Saunders.

Heartfelt and elegantly rendered through acting, casting, direction, and set design, the piece shows a world of people who eagerly watch as Lincoln works through his grief.

Watching in VR, we can feel that this is his experience, but it is also the experience of all the people around him: the community.

This can be viewed on the New York Times VR app.

Written and Directed by Graham Sack
Sound by SilVR
A Plympton, Sensorium, and Graham Sack Production Based on "Lincoln in the Bardo," the novel by George Saunders.

Cast: Pete Simpson, Keaton Nigel Cooke, Jake Hart, Sam Lilja, Mark Linn-Baker, April Matthis, Robbie Tann

05 Experiments Creative Explorations

Volumetric Display

This volumetric display system is inspired by CAVE rendering. A series of devices are arrayed in physical space to produce a viewport into digital space that is responsive to the viewer’s changing perspective. This creates an effect like looking into a fishbowl or out a window, instead of at a video playing on a standard unresponsive display. Any number or shape of devices can be accommodated, and the system works whether the viewer is outside looking in or inside looking out.

Once the layout is determined, the devices are calibrated so that they and the VFS server are aware of their relationship to each other and to the 3D space they will be displaying.

This prototype was built with Legos to hold smartphones, although any type or size of display would work. Using smartphones also allowed us to take advantage of the front-facing camera on each device. The camera output is fed through an OpenCV face-detection algorithm, and that detection information is used to calculate the location of the viewer.

Each phone’s detection is transmitted to the VFS, which uses the various values and a historical record of where the viewer was previously to arrive at a viewing position relative to the 3D space. It then uses UDP to broadcast that position back to all the devices for rendering and to ensure synchronization.
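The fusion step can be sketched as below. The confidence weighting and smoothing scheme is an illustrative guess at the approach described above, not the production VFS code.

```python
def fuse_detections(detections, prev, smoothing=0.8):
    """Fuse per-phone face detections into one viewer position.

    `detections` is a list of (x, y, z, confidence) estimates from the
    devices that currently see a face; `prev` is the previously fused
    position, blended in to damp jitter (the "historical record").
    """
    if not detections:
        return prev  # no face seen this frame: hold the last position
    total = sum(d[3] for d in detections)
    # Confidence-weighted average of the per-device estimates.
    avg = tuple(sum(d[i] * d[3] for d in detections) / total
                for i in range(3))
    if prev is None:
        return avg
    # Exponential smoothing against the previous fused position.
    return tuple(smoothing * p + (1 - smoothing) * a
                 for p, a in zip(prev, avg))
```

The fused position would then be serialized and broadcast over UDP to every device, as described above.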


Sluice

Sluice is the perfect waiting-room game. When not in use, it is a soothing screensaver. As soon as players join, Sluice transforms into a quick, competitive, and intuitive multiplayer game.

The game is zero-installation: anyone can walk up and start redirecting the rain just by scanning a QR code. The code opens a webpage that connects the phone to the persistently running game's VFS server.

Once connected, the phone acts as an intuitive controller that moves and orients the player's paddle, with which they direct rain into buckets. The more buckets you fill, the more you influence the game to change the rain to your color.
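One plausible mapping from phone tilt to paddle position is sketched below, with the browser's deviceorientation `gamma` angle in mind. The constants and function name are assumptions, not Sluice's actual controller code.

```python
def tilt_to_paddle(gamma, width=100.0, max_tilt=45.0):
    """Map left/right phone tilt (degrees) to a paddle x-position in
    [0, width]. Tilts beyond +/-max_tilt are clamped so the paddle
    stops at the screen edge."""
    g = max(-max_tilt, min(max_tilt, gamma))
    return (g + max_tilt) / (2.0 * max_tilt) * width
```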

Supporting an unlimited number of players, Sluice was developed not only as fun entertainment, but as a demonstration of the VFS backend Eyelash has been developing for years, which allows for instantaneous communication between software-agnostic devices.


Helium

Helium was developed to transform the typically solitary act of putting on a VR headset in one's living room into a party game, by allowing and encouraging others in the room who aren’t in the headset to join in on the fun from their mobile devices.

For the player in the headset, Helium is a room-scale balloon-popping game where the goal is to pop as many balloons as you can in short rounds. Like a classic arcade game, the rounds get more difficult as you go, and topping the high-score board is the objective.

The players joining on their mobile devices play a series of quick movement-based mini-games that, when won, let them add entertaining assistance or challenges for the main VR player. They can do anything from turning the sword into a wiggly snake to making the balloons so big it’s impossible to miss them.


Pocket Museum

Pocket Museum is a retail display and product-interaction tracking system that uses intelligent camera tracking, with low-cost off-the-shelf cameras, to detect when a user selects a particular item from a display.

When the system detects that a user is holding an item, it displays corresponding information about that object on a monitor nearby and collects statistics.

Information about what item was selected and for how long is sent to a central database.

This information creates many possibilities for heuristic analysis of visitor patterns. Are we simply noting whether an object was picked up? What if time of day is important, or frequency, or average away time? These are just a few of the possible use cases for this technology.

A product display can be set up to track a single object, or it could track many objects together in one location. Multiple product displays can be installed with different groupings of objects at each, all feeding into the same analytics system, which allows for more complex comparisons.

Some examples: Object A does better when grouped with Object B than when it’s near Object C. Stations in one part of the store see more overall interactions than others. Object B is picked up a lot in New York but rarely in California.
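As a sketch of the kind of aggregation the central database enables, the helper below rolls raw pickup events into per-object statistics. The event field names are illustrative, not the system's actual schema.

```python
from collections import defaultdict

def interaction_stats(events):
    """Aggregate pickup events into per-object statistics.

    Each event is a dict like {"object": "A", "station": "NY-1",
    "duration": seconds_held}.
    """
    stats = defaultdict(lambda: {"pickups": 0, "total_time": 0.0})
    for e in events:
        s = stats[e["object"]]
        s["pickups"] += 1
        s["total_time"] += e["duration"]
    for s in stats.values():
        s["avg_time"] = s["total_time"] / s["pickups"]
    return dict(stats)
```

Grouping the same events by station or by time of day instead would support the comparisons described above.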

06 VFS The Virtual File System

Many of our projects employ our VFS, which is a multi-user, multi-machine, multithreaded Virtual File System used for enforcing synchronization between devices.

The job of VFS is to service requests from endpoints and maintain synchronization and consistency with its subscribers. These devices and subscribers are agnostic, disparate and strange, from lighting control systems to web browsers; lidar or audio devices; local applications or WAN services.

The beauty of the system is that it is designed from the ground up for scalability and flexibility: the VFS can act as a simple linkage between parts, or as a more comprehensive engine with smarts inside.

Natively written in Qt, it is highly expandable and designed to allow plugins and new nodes to be added easily in either C++ or Python. Plugins are written as needed and can expand the system to talk to a host of other systems and APIs.

The VFS has been compiled on many different operating systems, including Docker deployments, Raspberry Pi, CI, and Kubernetes environments, and through battle testing has proven to be very stable and extensible. Many of our installations run entirely on AWS, GDC, or Digital Ocean.

The flexibility provided by VFS allows us to create applications and ecosystems that interweave components built in drastically different environments and for drastically different end devices.

Distinctions between users are an important aspect of how the VFS functions. Two users connected to the same VFS server will not necessarily be provided with the same tools and functionality. A monitor is a user, as is a calendar, a mobile device, or a Mongo database. It is a system of connected endpoints, and they all talk to each other. All connections in the VFS observe very strict SSL requirements.

The developer has flexibility in how a user can be securely logged into the VFS, ranging from providing an approved username with an associated set of options and virtual files, to logging in via OAuth2 or LDAP credentials. Administrative users can grant general users more or less access via an Access Control List covering all the Virtual File and Directory points synchronized by the VFS.
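An ACL lookup of this shape could be sketched as below, where a permission set on a virtual directory applies to everything beneath it. This is an illustration of the idea, not the VFS's real data model.

```python
def can_access(user, path, acl):
    """Walk up the virtual directory tree until an ACL entry for the
    user is found; deny if none exists anywhere along the path."""
    while True:
        perms = acl.get(path, {})
        if user in perms:
            return perms[user]
        if path in ("/", ""):
            return False  # no entry anywhere: default deny
        path = path.rsplit("/", 1)[0] or "/"
```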

The data that controls how a user can connect, and what a user sees when logged in, is itself stored in Virtual Files within the VFS. These files can be subscribed to by multiple users and multiple tools, ensuring consistency throughout the system and allowing multiple instances of the same user on multiple devices.

Let's innovate!