Last month while browsing around the blogosphere (is that still a word?) we ran into this article about Chris Speed Visuals on ShareSynth and of course had to follow up with our own interview to find out more about these glitch-tastic creations. People who remember the days of dragging lots of old analog gear to gigs, as well as those who missed the chance, will find a lot to enjoy in this one.


Who are you and what do you do?

My name is Chris Speed but I go by Chris Speed Visuals and I am an audio-visual artist.

 

What hardware and software tools do you use?

I predominantly use a MacBook Pro, but I pair that with many peripherals and other tools for real-time video. This can include my Livid OHM RGB, an Akai LPD8 MIDI controller (for smaller setups), a Wiimote, a Microsoft Kinect, live camcorders, Blackmagic capture cards and analogue equipment such as video synths by Critter & Guitari / LZX, BPMC's Basic Cable, a Korg Kaoss Pad Entrancer and my trusted Edirol V4 for mixing between multiple sources. I also work on a custom Windows PC for the majority of my 3D motion graphics work.

I use VDMX as a compositional tool for the majority of my needs, such as FX processing, VJ sets and audiovisual performances. When creating pre-recorded clips I use Cinema 4D, Adobe After Effects, Premiere Pro and Final Cut Pro X. I have also become obsessed with Paracosm's Lumen software, since it is a great emulation of analogue video synthesis and looks awesome when combined with other video gear. I have also dabbled in programming with software such as Quartz Composer, Processing, Pure Data and Max. Finally, I use Syphon a whole lot since it is essential for communicating between each program!
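For the curious, here is a rough sketch of what the Processing end of that Syphon hookup can look like. It's not Chris's patch, just a minimal example assuming the third-party Syphon library for Processing is installed (the import path, the SyphonServer class and the sendScreen() call come from that library and may differ between versions); it publishes the sketch's canvas so an app like VDMX can grab it as a live source.

```java
// Minimal sketch: publish this Processing canvas as a Syphon source.
// Assumes the Syphon library for Processing is installed; names may vary by version.
import codeanticode.syphon.*;

SyphonServer server;

void setup() {
  size(640, 480, P3D);  // Syphon needs an OpenGL renderer (P2D or P3D)
  server = new SyphonServer(this, "Processing Output");
}

void draw() {
  background(0);
  stroke(255);
  line(0, frameCount % height, width, height - frameCount % height);
  server.sendScreen();  // push the current frame to any Syphon client, e.g. VDMX
}
```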

How does it all come together?

In a live context, it all comes together in VDMX, since it allows me to combine the chaotic texture of hardware feedback with the more controlled software approach. I like to organise everything into layers and use blend modes to get the full aesthetic potential of the live inputs. I am fascinated by hybridising the randomness of vintage video gear with new developments in software; old meets new, I guess, is my methodology.
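To get a feel for the kind of feedback he's talking about, here's a tiny, purely illustrative Processing sketch (not part of Chris's setup) that feeds each frame back into the next with a slight zoom and rotation, which is the classic recipe for trailing, self-devouring video feedback in software.

```java
// Toy video-feedback loop: redraw the previous frame slightly zoomed and rotated,
// then add a fresh element on top, so each frame "eats" the last one.
void setup() {
  size(640, 480);
  background(0);
}

void draw() {
  PImage prev = get();            // grab the previous frame
  translate(width / 2, height / 2);
  rotate(0.01);                   // small rotation per frame
  scale(1.02);                    // slight zoom creates the feedback trails
  imageMode(CENTER);
  tint(255, 230);                 // fade older frames so they don't blow out
  image(prev, 0, 0);

  // draw a new element that gets swallowed by the loop
  noTint();
  noStroke();
  fill(random(255), random(255), random(255));
  ellipse(random(-50, 50), random(-50, 50), 20, 20);
}
```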

Tell us about your latest audio-visual project!

My audiovisual project (ERROR 404) was based around a brief for an art exhibition in which glitch was the thematic structuring point.

So basically, after mentally storyboarding the project, I started by creating monochrome scenes, characters and animations within Cinema 4D, then rendered them as PNG sequences with alpha channels. I then imported everything into After Effects and composited/rendered them as video files, along with the opening scrolling code, which I learnt how to do using a basic online tutorial.

I then re-compressed those .mov files into the Hap Alpha codec for playback within VDMX, where the real fun begins! Basically, I set up a patch in VDMX with four layers: one for the characters, one for backgrounds, one for a Syphon input and one as a main pass to control the video feedback. With the pre-rendered videos set to their respective layers, I then created a custom patch in the video synthesis software Lumen and routed it to VDMX and vice versa via the Syphon protocol. I experimented with the blend modes for each layer until I got a colourful result I was happy with. Then, with VDMX and Lumen already creating a feedback loop with each other, I wanted to add some more analogue texture and chaos to the distorted images.
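The layers-plus-blend-modes idea maps onto code pretty directly too. Here's a small, hypothetical Processing sketch (again, not the actual VDMX project) that composites two generated "layers" with an additive blend, loosely in the spirit of stacking layers in a VJ app.

```java
// Two offscreen "layers" composited with an additive blend mode,
// loosely mimicking how layers + blend modes stack in a VJ app.
PGraphics layerA, layerB;

void setup() {
  size(640, 480, P2D);
  layerA = createGraphics(width, height, P2D);
  layerB = createGraphics(width, height, P2D);
}

void draw() {
  // layer A: a slowly cycling colour wash
  layerA.beginDraw();
  layerA.background(frameCount % 255, 0, 128);
  layerA.endDraw();

  // layer B: a moving shape on black
  layerB.beginDraw();
  layerB.background(0);
  layerB.noStroke();
  layerB.fill(0, 255, 0);
  layerB.ellipse((frameCount * 3) % width, height / 2, 120, 120);
  layerB.endDraw();

  blendMode(BLEND);
  image(layerA, 0, 0);
  blendMode(ADD);   // additive blend, comparable to an "Add" layer blend mode
  image(layerB, 0, 0);
}
```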

I then sent the VDMX output to my Edirol V4 video mixer using the Intensity Shuttle capture card by Blackmagic Design. I also placed a BPMC Basic Cable within the signal chain for extra video processing. From here I spent hours using the effects on the mixer with the Basic Cable until I got a collection of results I was happy with. I edited together all the captured footage using Premiere Pro.

Finally, I wanted to make the experience audiovisual, so I spent some time developing complementary sound for the project. For the majority of the opening segment I used an online Microsoft Sam speech emulator, which I then warped and stretched with Ableton Live. To match the analogue feedback of the images I wanted to do something similar with the audio, so I used a contact microphone to pick up the electromagnetic waves from my speakers, then fed that into a Korg Monotron Delay. I recorded the jam into Ableton and, to conclude, I synced the sound to the video back in Premiere Pro. The final result is ERROR 404! 🙂

I also have a new project I want to include! It is relevant since I used Lumen's framebuffer effect to create an effect similar to Norman McLaren's 'Pas de Deux.'


And of course if you loved this mix of the worlds of analog and digital video, be sure to check out Chris Speed Visuals on Vimeo for more awesome examples.

