**Performative Mechanics of Experience**
Like Blast Theory’s *A Machine to See With*, the performance is designed for two participants at a
time, while also welcoming viewing by a general audience. Each scenario will be located in a mix of
indoor and outdoor, real and constructed spaces/settings.
In one version, one participant puts on a white robe, sits down, and dons a Vive headset
and motion-tracking gear: she takes the role of the receptive participant. Inside the headset view she
explores a virtual world. Using the Kinect or other motion sensing to track the movement of her
body, she experiences the avatar’s body becoming as tall as the heavens and
transforming into the earth and cosmos. Each visual is accompanied by touch. The experience
becomes truly tangible through haptic illusion techniques that play with the plasticity of subjective
multi-sensory awareness. This design will likely evolve once the other collaborators and consultants
have the time and opportunity to contribute.
The haptic sensation is generated by the other participant. The standing participant puts on a black
robe, and her actions are guided by audio and by projections on the reclining participant’s costume: these
visualisations indicate where to touch with paintbrushes and wooden drumsticks. The standing
participant becomes an active haptic performer, yet always remains receptive to the guidance of the
projections, which she follows like a visual score.
The virtual experience will be built using the Unity3D game engine. This development environment
provides a rich toolset for creating immersive installations, connects easily to the latest cutting-edge
peripherals, and offers powerful graphics capabilities. The Vive virtual-reality headset will be the
means of viewing: as well as a wide field of view, the Vive provides rotational and positional
head tracking, which is used as a control input and gives users presence in their virtual environment,
with among the highest resolution and visual fidelity currently on the VR market. For
tracking the participants’ bodies we will use a Leap Motion hand-tracking device together with either a
Neuron motion-tracking kit or a haptic suit: a bespoke costume combining hard and soft biosensors,
actuators, and electronics for sensing heart rate, breath, GSR, EMG, (EEG?), acceleration, and
orientation, and for actuating heat/cold, pressure (hugging or constricting), sound, and tickle.
The performance journey is embarked on in pairs (participants a and b) who are familiar with each
other. Both participants put on wireless headphones, which initially guide them through the set-up.
Both wear a costume: participant (a) has a loose-fitting smock with a white front, which functions
both as a projection screen and as a way of making the topology of their body more terrain-like.
Participant (b) has a black smock so that they can be distinguished in the Kinect data and masked
out. Participant (a) wears the Vive and sits in the centre of the installation floor; participant (b) sits to participant (a)’s right, facing them.
The installation begins by playing a runtime-rendered 3D scene in the Vive: this is the body of the VR
experience. The participant is challenged with various embodied scenarios, detected by the Kinect or Leap Motion. While this is happening, participant (b) witnesses pre-rendered animation projected onto
participant (a).
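Detecting an embodied scenario from skeleton data can be as simple as comparing joint positions. The sketch below is an illustrative assumption rather than the project’s actual design: it tests whether both hands are raised above the head in a Kinect-style joint dictionary (joint names and the margin value are invented for the example):

```python
def arms_raised(joints, margin=0.10):
    """Return True when both hands are held above the head.

    joints: dict of joint name -> (x, y, z) in metres, y pointing up,
    in the style of a Kinect skeleton stream.
    margin: how far (in metres) above the head a hand must be to count.
    """
    head_y = joints["head"][1]
    return (joints["hand_left"][1] > head_y + margin and
            joints["hand_right"][1] > head_y + margin)
```

A real implementation would debounce the check over several frames so that momentary tracking jitter does not trigger a scenario transition.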
The performance is designed around a sequence of scenes and will
integrate four action scenarios, ranging from the bodily transformative to the viscerally affecting.
Visual and touch effects include using the depth data from the Kinect to generate a mesh topology and,
from that, a dynamic terrain of the island, including geological texturing, flora, and fauna. This is kept
up to date with the participant’s current body position, easing the heightmap of the terrain, re-texturing
it, and adjusting the height of the flora to match. While this is happening, participant (b) is guided via audio
and the projection to touch participant (a) in various places to simulate meteorological and
cosmological events, as well as the movement and changes of life on the island.
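The easing of the terrain heightmap toward the participant’s current body shape can be sketched as an exponential moving average over successive depth-derived heightmaps. The NumPy functions below are a minimal illustration under assumed names and constants; the real implementation would live inside Unity’s terrain system:

```python
import numpy as np

def depth_to_heightmap(depth, near=500.0, far=4000.0, peak=1.0):
    """Convert Kinect depth (mm) into terrain height: the closer the
    body surface is to the camera, the higher the terrain rises."""
    d = np.clip(depth.astype(float), near, far)
    return peak * (far - d) / (far - near)

def ease_heightmap(current, target, alpha=0.1):
    """Move the terrain heightmap a fraction of the way toward the
    heightmap derived from the latest depth frame.

    current, target: (H, W) float arrays of terrain heights.
    alpha: easing factor per frame; 1.0 snaps instantly, 0.0 freezes.
    """
    return current + alpha * (target - current)
```

Calling `ease_heightmap` once per frame gives the terrain a lagged, organic response to movement, with `alpha` controlling how quickly the island "breathes" with the body.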
At the other extreme are the action challenges, which involve taking action for change through teamwork
and individual effort. These scenarios need to:
➔ be realistic enough to feel threatening, even terrifying, to participants, engaging the full range of
emotional, physical, and proprioceptive senses and responses (lump in the throat, butterflies in
the stomach, crying, goose bumps, etc.)
➔ feel urgent to participants, with an outcome (such as their own survival or that of others) that
depends on them: not only to survive, but to be stimulated to make substantial choices
that lead to real, concrete change in their lives and for the planet.
The experience will also be produced for multiple viewers in a 360° dome, for viewing on YouTube, or outdoors
in a landscape with AR markers.
**Timeline and deliverables**
Research and narrative development will be carried out by the PI and co-PI. Deliverables include: partner-development consultations with performance professionals, VR designers, companies, and experts; storyboarding for the showcase presentation; capture of prototype film footage; performer salaries and rehearsal space for the prototype performance at the showcase presentation; and prototype development of the 360° film and Mixed Reality (VR/AR/theatrical) design.
Phase 1
a) Expert consultations with psychologists, climate change scientists (data source?), and immersive theatre directors – Months 1–4
b) Write 4–5 narrative scenarios – Months 4–6
c) Workshops begin – Months 6–10
d) Collate workshop data and analyse for design phase – Months 10–13
Phase 2
a) Interface design prototyping (360°, 3D/Unity, etc.) – Months 13–17
b) Haptic wearable-tech costume design and prototyping – Months 13–19
c) Real-time data / AI modelling – Months 13–18
d) Iterative performance design and audience testing – Months 13–17
Phase 3
a) Iterative redesign of the VR/AR interface and haptic costume, and final experience testing – Months 18–21
b) Participatory performance testing – final live experiential performances – Months 18–21
Phase 4
a) Completion: analysis of design, audience testing, and experience testing – Months 22–24
b) Write-up of final methods and project reporting – Months 22–24
**Team Capability and Roles (Researchers/collaborators)**