Is it possible to experience being another life form?
TreeSense is a sensory VR system that transforms a person into a tree, from a seedling to its fully grown form to its final destiny. The person experiences what it feels like to be a tree by seeing and feeling her arms turn into branches and her body into a trunk. To evoke these sensations, we apply Electrical Muscle Stimulation (EMS) to the user’s forearms to stimulate the muscles and skin, so that she can feel branches growing, a worm crawling, or a bird landing on her arm. This intimate, visceral experience creates a vivid illusion of being a different life form, and thus develops a personal, immediate identification with the need for environmental protection. This project was created in collaboration with Xin LIU in the Fluid Interfaces Group at the MIT Media Lab.
TIME: 10 Weeks (Sept 2016 – Present)
SCHOOL: MIT Media Lab, Fluid Interfaces Group
TEAM: Xin LIU
ADVISOR: Pattie Maes
FOCUS: HCI Research, VR, Tactile Interaction
SKILLS: VR Development (Unity3D, Cinema4D, Leap Motion), Hardware Prototyping (Electrical Muscle Stimulation, Arduino)
Awards and Exhibitions 2017
Be the Center of the Story
In its early days, film was a cheaper way to bring theater performances to the masses. It took filmmakers several decades to discover the medium’s capacity for manipulating and distorting time and space. New techniques, such as the close-up shot, perspective changes, and dream-time, were developed to create a new sense of continuity and simultaneity. Film has since become a way to tell stories that are not achievable in traditional theater.
Now, with Virtual Reality, we can create experiences that are far more participatory and immersive. What if we could let audiences directly experience empathy for problems outside their daily lives? In other words, what if we could shift the audience’s perspective and place them at the center, living the story and feeling the experience firsthand? Would that embodied experience enable audiences to relate to remote problems emotionally and immediately?
Experience with Your Own Body
The beauty of allowing people to be part of a story is that it makes the experience unique for each person, each time. In experimental theater, for example, artists blur the boundary between reality and the play, morphing viewers into actors to create the unsettling illusion of “what if I were ... ”. But the embodiment we want to achieve with TreeSense is far more visceral and immersive.
Unlike conventional storytelling media such as film, which are limited to sight and hearing, we tap into more senses to set the stage for an immediate shift in mindset. The body image and body schema inside our brains are plastic and variable. Through a systematic alteration of sensory stimuli (vision, touch, motor control, and proprioception), the brain can inhabit a body dramatically different from our own, such as a tree. This body illusion leads to a more direct, personal, and emotional connection with the new identity we embody.
Dynamic Body Experience
HCI Research and Demos
Homunculus Flexibility and Body Ownership Illusion
To what extent can humans accept body distortion and take ownership of a non-humanoid avatar body? People usually assume the body image inside the brain is static, but phenomena such as phantom limbs and the rubber hand illusion demonstrate this innate neural plasticity. Lanier (2006) proposed the concept of Homuncular Flexibility: the homunculus—an approximate mapping of the human body in the cortex—is capable of inhabiting novel bodies and controlling them. This flexibility in adopting a new body image and schema opens endless possibilities for experiencing being another life form and, ultimately, for building a strong personal connection with the new identity.
Synchronized Sensory Alteration System
In TreeSense, a strong body ownership illusion is induced through synchronized visual, proprioceptive, and tactile inputs. We use the Unity3D game engine as the central hub for all inputs and outputs. The motions and gestures of the head, forearms, and fingers are tracked via infrared by a Leap Motion sensor. Unity then sends signals to Max/MSP, which controls all the physical elements, such as haptics, heat, wind, and scents. The system also embeds AI in the virtual animals that interact with the tree: they behave differently based on the user’s movements, which makes the experience unique for each person every time.
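The routing described above (game engine as the hub, physical actuators driven by Max/MSP) can be sketched as a small message sender. The following is an illustrative Python sketch, not the production Unity3D code: it encodes minimal OSC messages by hand (OSC over UDP is a common way to talk to Max/MSP, though the source does not specify the transport), and the address names, port, and parameter values are hypothetical.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode())                    # address pattern
    msg += osc_pad(("," + "f" * len(args)).encode())   # type tag string
    for a in args:
        msg += struct.pack(">f", a)                    # big-endian float32, per OSC
    return msg

class ActuatorBridge:
    """Forwards game events to a Max/MSP patch listening on UDP (port is hypothetical)."""
    def __init__(self, host="127.0.0.1", port=7400):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.dest = (host, port)

    def send(self, address: str, *args: float) -> None:
        self.sock.sendto(osc_message(address, *args), self.dest)

# Example (hypothetical event): a virtual bird lands on the left forearm,
# so the EMS channel for that arm fires at some intensity and pulse rate.
# bridge = ActuatorBridge()
# bridge.send("/ems/left_forearm", 0.6, 120.0)
```

In the real system, the timing of these messages is driven by the Leap Motion tracking data and timeline events inside Unity, so the tactile output stays synchronized with what the user sees.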
Electronic Muscle Stimulation for Native Tactile Feedback
To generate novel tactile sensations that do not exist in real life, we designed a series of EMS signals by varying combinations of pulse amplitude, pulse width, current frequency, and electrode placement. The intensity is calibrated per user with a potentiometer and remains fixed once calibrated. Throughout the experience, the various EMS sensations are triggered by pre-defined hand gestures, timeline events, and interactions with virtual objects. The audience can feel branches sprouting, touch leaves, and interact with birds, caterpillars, and more.
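One way to picture this signal design is as a small vocabulary of sensations, each mapped to a set of stimulation parameters. The sketch below is an illustrative Python model only: the parameter values, electrode names, and sensation labels are made up for the example, and the real patterns were tuned by hand on the actual stimulator hardware.

```python
from dataclasses import dataclass

@dataclass
class EMSPattern:
    """One tactile 'verb' expressed as EMS parameters (all values illustrative)."""
    amplitude: float        # fraction of the user's calibrated maximum, 0..1
    pulse_width_us: int     # pulse width in microseconds
    frequency_hz: float     # pulses per second
    electrodes: tuple       # which forearm electrode pads are active (names hypothetical)
    duration_s: float       # how long the sensation lasts

    def pulse_times(self):
        """Timestamps (s) at which pulses fire, for scheduling on the stimulator."""
        n = int(self.duration_s * self.frequency_hz)
        return [i / self.frequency_hz for i in range(n)]

# Hypothetical mapping from in-game events to parameter sets
SENSATIONS = {
    "branch_growing": EMSPattern(0.5, 250, 35.0, ("inner_1", "inner_2"), 2.0),
    "worm_crawling":  EMSPattern(0.3, 150, 8.0,  ("outer_1",),           1.5),
    "bird_landing":   EMSPattern(0.7, 300, 60.0, ("wrist",),             0.4),
}

def scaled_amplitude(pattern: EMSPattern, user_max: float) -> float:
    """Apply the per-user ceiling set once during potentiometer calibration."""
    return pattern.amplitude * user_max
```

Keeping the per-user calibration as a single multiplier means the same pattern vocabulary works across users with different skin impedance and comfort thresholds, without re-authoring every sensation.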
Extended Project: VR Film <TREE>
In collaboration with artists Milica Zec and Winslow Porter, we presented the short VR film <Tree> at the 2017 Sundance Film Festival and Tribeca Film Festival.
My main contributions: 1) created the first Unity3D demo with motion tracking and EMS haptics; 2) designed, prototyped, and produced the vibration haptic sleeves for the forearms for the final Sundance showcase; 3) supported Xin on the haptic feedback system.
For more information, see my TREE Project page.