It begins with a two-day gathering in Los Angeles. Participants will work with advisors who have backgrounds in architecture, immersive theater, storytelling and film production. Projects will be developed and prototyped through hands-on workshops, live performances and round-table discussions. Creators will then go off on their own to develop a working version of their project. After a month of continued development, creators present their projects to potential investors at a showcase event at Facebook headquarters.
The initial creators were selected for the inaugural DevLab class because they are either innovative artists who have already produced important VR experiences or first-time VR creators with established careers in cinema.
The team plans to expand DevLab next year and will open up submissions for the next class of creators.
Winslow & Malica
Set in the basement of a home in an unnamed war zone, Giant gives viewers a glimpse into a world that is a harsh reality for some and a virtual awakening for others. In it, you witness a couple consoling their daughter while bombs blast nearby buildings and threaten their existence. As a film it would convey sympathy for the family that serves as its subject, but as a UE4-powered VR experience, it delivers a connection that makes the experience deeply personal and emotionally exhausting.
Mixing live-action video of real human performances seamlessly into a real-time 3D environment was a big challenge, because no ready-made solutions were available when they first started production. They ended up choosing Depthkit by Simile, which uses the Kinect 2, because it was a highly effective and affordable capture option. Depthkit merged with RED Dragon 5K footage shot by Alex Corn is particularly powerful from a fixed perspective, because it only captures volume data for what is directly visible to the camera. Since we wanted our viewers to be seated in a chair that emulates the approaching bomb blasts, this ended up working out quite well for Giant.
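The "volume only where the camera can see" behavior comes from how a single depth sensor works: each pixel of the depth image is back-projected into one 3D point along the camera ray, so occluded surfaces simply get no points. A minimal sketch of that back-projection, using the pinhole camera model (the intrinsics here are illustrative stand-ins, not Depthkit's actual values):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    # Back-project a depth image into a 3D point cloud with the
    # pinhole model: X = (u - cx) * z / fx, Y = (v - cy) * z / fy.
    # Only surfaces facing the sensor produce points, which is why
    # single-camera volumetric capture has volume only where the
    # camera can see.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no depth reading

# Toy 4x4 depth map: everything 2 m away except one missing pixel
d = np.full((4, 4), 2.0)
d[0, 0] = 0.0
pts = depth_to_points(d, fx=365.0, fy=365.0, cx=2.0, cy=2.0)
print(pts.shape)  # (15, 3): the one invalid pixel is dropped
```

In a full pipeline, colors from the synchronized video footage would then be sampled onto these points to produce the textured, view-dependent geometry the piece composites into its UE4 environment.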
Marshmallow Laser Feast
The project was conceived as a deeper study of filmmaking for the virtual reality environment – a three-dimensional study of mortality exploring photogrammetry processes.
The female model was created using FBFX’s scanning rig, comprising 94 DSLR cameras and a number of very fast flash heads. The system allows them to capture 360º 3D scans in 1/13,000th of a second, with 16K photorealistic textures. After a set of 94 images is captured simultaneously, the images are loaded into a photogrammetry software package from Agisoft. The software analyses the pictures looking for distinctive pixels, matches corresponding pixels across different images and, through triangulation, creates a point cloud.
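The triangulation step can be sketched with the classic linear (DLT) method: given the same feature seen in two calibrated cameras, each observation contributes two linear constraints on the 3D point, and the solution is the null vector of the stacked system. This is a toy illustration with two hypothetical camera matrices, not the rig's actual calibration:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    # Linear (DLT) triangulation: each camera contributes two rows
    # of the form x * P_row3 - P_row1 = 0 applied to the homogeneous
    # 3D point; the point is the null vector of the stacked system.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean

# Two hypothetical cameras: one at the origin, one shifted 1 m along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known point into both cameras, then recover it
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

print(triangulate(P1, P2, x1, x2))  # recovers X_true (0.5, 0.2, 4.0)
```

A production photogrammetry package does this across all 94 views per matched feature and bundles millions of such points into the dense cloud.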
In the Eye of the Animal:
The team scanned Grizedale Forest with a Faro X330 lidar scanner, which gave them a GPU-expensive 800 million points that they then decimated to a level friendly to real-time rendering. In the project, lidar scanning, CT scanning and photogrammetry coexist harmoniously even though they are completely different procedures. Each scene of the narrative contains multiple sets of environmental particles derived from the lidar data and dynamic particles derived from highly detailed CT scans of insects and animals. AND Festival visitors are invited to see the forest through the unique eyes of its creatures.
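The decimation step is commonly done with voxel-grid downsampling: snap each point to a coarse grid and keep one representative point per occupied cell. The article does not say which method the team used, so this is only a generic sketch of the technique:

```python
import numpy as np

def voxel_decimate(points, voxel_size):
    # Assign each point a voxel index, then keep one representative
    # (the centroid) per occupied voxel. Point count drops from the
    # raw scan size to at most the number of occupied voxels.
    idx = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(
        idx, axis=0, return_inverse=True, return_counts=True
    )
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)   # sum points per voxel
    return sums / counts[:, None]      # centroid per voxel

rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 10.0, size=(100_000, 3))  # stand-in for a scan
small = voxel_decimate(cloud, voxel_size=1.0)
print(len(cloud), "->", len(small))
```

With a 10 m cube and 1 m voxels, 100,000 points collapse to at most 1,000 representatives; on an 800-million-point forest scan the same idea brings the data down to something a GPU can draw every frame.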
The visual engine generates and renders the whole environment in real time, with certain generative elements that make each experience unique. The visual engine communicates with the 3D audio engine via OSC, providing positional data as well as head-tracking data from the inertial sensors of the VR headset. The sound uses binaural audio, a technique that mimics the natural functioning of the ear, creating an illusion of 3D space and movement around the listener’s head that is as immersive as reality. Sounds were recorded in Grizedale Forest and spatialised using custom-built 3D sound tools by the company Two Big Ears. Sounds move 360° around the listener, as well as up and down, adapting to the trajectory, head movements and visual experience of the listener.
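OSC (Open Sound Control) messages like the pose updates described above are small binary packets, usually sent over UDP: a null-padded address string, a type-tag string, then big-endian arguments. A minimal hand-rolled encoder shows the wire format; the address `/listener/pose` and the six-float position-plus-orientation payload are hypothetical, not the project's actual schema:

```python
import struct

def osc_message(address, *floats):
    # Minimal OSC 1.0 encoder: strings are null-terminated and padded
    # to a 4-byte boundary; the type-tag string is "," plus one "f"
    # per float32 argument; arguments are big-endian.
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# Hypothetical pose update: head position (x, y, z) in metres plus
# yaw, pitch, roll in degrees from the headset's inertial sensors.
packet = osc_message("/listener/pose", 1.0, 1.6, -2.0, 0.0, 15.0, 0.0)
print(len(packet), "bytes")  # 48 bytes
```

In practice the packet would be sent with a plain UDP socket (`socket.sendto`) at render rate, and the audio engine would use the decoded pose to steer its binaural spatialisation.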
Through a huge replica model of a giant sequoia tree and a Vive VR headset, Treehugger lets the viewer experience the scale and wonder of what is arguably one of the largest living individual organisms on earth. The longer you hug, the deeper you drift into a hidden world just beyond the limits of your senses. You discover the tree’s hidden inner structure: grooves in the bark become giant cliffs; pinecones feel like cathedrals. Exploring further leads viewers upwards to the glorious canopy where they can experience the wonder of complex energy flows reaching high into the uppermost branches.
This project uses the latest 3D imaging techniques. A combination of LIDAR, white light and CT scanning creates a VR experience that distorts space and time and makes the invisible visible.
LA PÉRI is a short VR dance experience based on a classical ballet performance, captured with motion capture and made exclusively for room-scale VR. It is a visually stunning, breathtaking experience that fuses gaming, film and theatre for virtual reality – a nice example of what can happen when great storytelling and the performing arts combine. FIREBIRD LA PÉRI is an interactive virtual reality experience that draws inspiration from Walt Disney’s FANTASIA.
All locomotion and interaction, such as walking, jumping, climbing, swimming and flying, is possible through our hand-walking character controller. For special occasions, Lucid Trips comes with a wind-feedback device to simulate drag and thrust, which ultimately heightens the awareness of an altered reality.
NOTES ON BLINDNESS: INTO DARKNESS
Each scene addresses a memory, a moment and a specific location from John’s audio diary, using binaural audio and real-time 3D animations to create a fully immersive experience in a ‘world beyond sight’.
Sundance Film Festival New Frontier