Created at the Bartlett School of Architecture / Interactive Architecture, Palimpsest uses 3D scanning and virtual reality to record urban spaces and the communities that live in them. The project asks what it would mean if the past, present, and future city could exist in the same place, layering personal stories and local histories of the city at 1:1 scale.

In 1998, researchers discovered that mathematical proofs by Archimedes had been overwritten with biblical texts by monks in the 13th century. Documents such as this, with previous erasures still visible beneath the primary text, are known as palimpsests. Architecture can also be a palimpsest: as cities and buildings are modified and re-purposed, traces of their previous lives remain visible.

Their first initiative, The Camden Palimpsest, uses the UK's High Speed 2 (HS2) rail project as a case study. It highlights stories of Camden residents – some of whom will lose their homes and workplaces – and explores how their lives will be transformed. These virtual palimpsests aim to create more inclusive planning practices, using emerging technology to bring communities, governments, and developers into direct conversation. They also become historical documents, digitally recording spaces and stories that might otherwise be lost.

The Palimpsest scanning diagram

To create the palimpsest, the team relied on four types of 3D data capture. Above all, they wanted a unified aesthetic across very different scales: from the urban scale of a park, to the room scale of an apartment or pub, to the personal scale of a 3D video recording. The interactive virtual reality experience was built in Unity3D and viewed on an Oculus Rift. The team also exported stereoscopic 360 videos of the experience that can be viewed on a Google Cardboard or Samsung Gear VR headset. For the team, Unity proved an excellent platform for merging a wide variety of 3D content while providing dynamic and interactive user interfaces.
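One practical way to reach that unified aesthetic is to convert every capture, whatever its source, into a point cloud of comparable density before it enters Unity. The article doesn't describe the team's actual tooling for this, so the following is only a minimal Python sketch using the Open3D library; the file names, point budget, and voxel size are illustrative assumptions.

import open3d as o3d

# Load a photogrammetry mesh (hypothetical file name).
mesh = o3d.io.read_triangle_mesh("camden_pub.obj")
mesh.compute_vertex_normals()

# Resample the surface into an evenly spaced point cloud so it
# visually matches the LiDAR captures; 500k points is a guess.
pcd = mesh.sample_points_poisson_disk(number_of_points=500_000)

# Equalise density with the other sources via a common voxel size.
pcd = pcd.voxel_down_sample(voxel_size=0.02)  # metres, assumed

o3d.io.write_point_cloud("camden_pub_points.ply", pcd)

Once every source is reduced to points at a shared density, a park, a pub, and a person can sit in the same scene without one medium visually dominating the others.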


To capture at the urban scale, they primarily used LiDAR (Light Detection and Ranging) scanners in collaboration with Fiona Zisch from ScanLab. After merging seven scans of St James’ Gardens next to Euston Station, they uploaded a decimated point cloud of roughly 25 million points to use as the park base for the palimpsest.

At the architectural scale, the team used two methods to capture building and room content: Google’s Project Tango and photogrammetry via Autodesk ReCap. Here they deliberately chose technologies that are free and accessible (Autodesk ReCap) or soon to be widespread (Google Tango) in order to make the process as inclusive as possible.

Finally, at the personal scale, the team used a Microsoft Kinect with Jasper Brekelmans’ Brekel Pro Point Cloud V2 software, conducting interviews both onsite and in a studio setting. The resulting captures were 3-5 minute recordings, with each frame stored as a 3D model. In Unity, they converted these per-frame models into GPU-based TC Particles to keep them interactive at high frame rates. Audio was recorded with each interview and synced to the point-cloud playback; the recordings were placed where the interviewee’s voice would naturally project and spatialized using a 3D sound engine.
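The article doesn't say how the park cloud was decimated, but the step is easy to picture. Below is a minimal Python sketch, again using Open3D rather than the team's unnamed pipeline, that merges already-registered scans and grows a voxel filter until the cloud fits the roughly 25-million-point budget reported above; the file names, starting voxel size, and loop are assumptions for illustration.

import open3d as o3d

# Merge the seven park scans; assumes they are already registered
# into a common coordinate frame (file names are hypothetical).
merged = o3d.geometry.PointCloud()
for i in range(1, 8):
    merged += o3d.io.read_point_cloud(f"st_james_scan_{i}.ply")

# Decimate by voxel averaging, enlarging the voxel until the
# cloud fits the ~25 million point budget for the park base.
voxel = 0.005  # metres; starting guess
decimated = merged.voxel_down_sample(voxel_size=voxel)
while len(decimated.points) > 25_000_000:
    voxel *= 1.25
    decimated = merged.voxel_down_sample(voxel_size=voxel)

o3d.io.write_point_cloud("st_james_gardens_base.ply", decimated)

Voxel downsampling is a natural fit here because it thins dense areas near the scanner far more aggressively than sparse distant ones, which keeps the decimated park looking uniform from any viewpoint inside the VR experience.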

This project examines how emerging technologies in virtual reality and 3D scanning can be used to facilitate participatory design. It analyses the need for more inclusive community engagement in urban development projects and examines virtual reality as a more inclusive medium for architectural representation.

Team members: Takashi Torisu, Haavard Tveito, and John Russell Beaumont, with support and direction from Ruairi Glynn.

Project Page | Interactive Architecture


