The Beacons was created by TEM in collaboration with Vincent de Belleval. The team were asked to create something unique and ambitious for Marcus Wainwright’s first show since the departure of David Neville. They developed a method for creating a virtual environment around a catwalk, telling a story based around Thom Yorke’s ‘Coloured Candy’ using a dynamic content system piped through a room-sized inverted zoetrope, built from three 2.5m tall, 0.5 tonne, 60rpm motorised beacons.
As the track begins, the beacons awaken with a familiar voice, reminiscent of OK Computer-era ‘Fitter Happier’ text-to-speech, reading from The Universal Sigh, the newspaper that accompanied Radiohead’s 2011 album The King Of Limbs.
The phasing kinetic motion of the beacons evolves with the audio to paint the space with light and movement: from an abstract horizon and geometric landscapes through to glitched chocolate-box beach scenes, waveform analysis and light. The beacons become lanterns illuminating a virtual environment that evolves; as the track and content system reach apogee, the beacons create glimpses of another place, for just a few seconds, before track and sculptures lose phase.
Vincent de Belleval developed a process for feeding power and data to a series of revolving motorised projectors. TEM then formed these projectors into three large beacons and placed them in an array at the centre of the Rag and Bone SS17 show. Using feedback data from the motors, the team align the rotation of virtual cameras in the real-time content engine with the physical rotation, meaning they essentially paint static imagery onto the walls and gauzes of the venue in three sweeping 360° arcs.
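The core of that alignment step can be sketched as a simple mapping from motor feedback to virtual camera rotation. This is a minimal illustration only: the encoder resolution, function names, and zero-offset handling are assumptions for the sake of the sketch, not details of TEM's actual system.

```python
# Hypothetical sketch: lock a virtual camera's yaw to a motorised
# projector's physical rotation, using motor encoder feedback.
# COUNTS_PER_REV is an assumed encoder resolution.

COUNTS_PER_REV = 4096

def camera_yaw_degrees(encoder_count: int, zero_offset: int = 0) -> float:
    """Map a raw motor encoder reading to the virtual camera's yaw.

    Because the virtual camera rotates in lockstep with the physical
    projector, each beacon always renders exactly the slice of the
    virtual environment it is currently pointing at, so the imagery
    it sweeps onto the walls appears static.
    """
    counts = (encoder_count - zero_offset) % COUNTS_PER_REV
    return counts * 360.0 / COUNTS_PER_REV

# A quarter revolution of the motor turns the camera 90 degrees:
print(camera_yaw_degrees(1024))  # -> 90.0
```

In a real engine this yaw value would be fed into the render camera every frame, with the zero offset calibrated so that a known encoder position corresponds to a known direction in the venue.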
The faster the projectors spin, the more complete the illusion of an immersive environment becomes, through the phenomenon of persistence of vision. This kinetic content system allows the team to build environments around the models and guests, which the motorised projectors sample as they spin, like lighthouses illuminating virtual worlds. It also allows the content infrastructure to act as a conceptual and sculptural element of the show, instead of being hidden in the shadows.
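A back-of-envelope calculation shows the timing behind the effect. Only the 60rpm figure and the count of three beacons come from the text; the beam width and the even phase offset between beacons are assumptions for illustration.

```python
# Rough timing sketch for the persistence-of-vision effect.
# Only RPM and NUM_BEACONS come from the article; BEAM_WIDTH_DEG
# is an assumed horizontal throw for one projector.

RPM = 60
NUM_BEACONS = 3
BEAM_WIDTH_DEG = 30.0

rev_per_sec = RPM / 60.0                # 1 revolution per second
deg_per_sec = rev_per_sec * 360.0       # angular speed: 360 deg/s
dwell_s = BEAM_WIDTH_DEG / deg_per_sec  # time one wall point stays lit per sweep
refresh_hz = rev_per_sec * NUM_BEACONS  # sweeps per second with evenly phased beacons

print(f"dwell per sweep: {dwell_s * 1000:.0f} ms")  # -> 83 ms
print(f"effective refresh: {refresh_hz:.0f} Hz")    # -> 3 Hz
```

Even at a modest 3 Hz effective refresh, each sweep lingers long enough, and repeats regularly enough, that the eye blends the three arcs into one continuous environment.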
The project was documented by Google as part of their Daydream VR platform.
Credits: TEM in collaboration with Vincent de Belleval (Concept and Creative Direction), Action Time Vision (Production), The Hive (System engineering), Original Soundtrack by Thom Yorke and executive production by Prodject.