Directed and developed by Moniker and Studio Puckey, Out of Line invites visitors to draw hundreds of lines on top of a music video. All lines are saved on the project's servers and are then rendered into the video to be seen by future visitors.

After signing up with Hey Lines, a fictional online agency for remote line drawers, you are disappointed to find that your job will mainly involve the mindless filling in of graphs. Everything changes when Laura, your employer, goes home early for the day and takes you all with her.

“The video is set in an environment that reflects the new kind of flexible contractor relationships introduced by platform businesses such as Uber and Fiverr, where employees are considered just another commodity to be bought and sold. The video was also inspired by the defaced bathroom stalls that are anonymous internet comment threads as well as the complexities that arise when interacting with online audiences.” – Moniker and Studio Puckey

To sync the video with their interface, the team wrote a library that converts Adobe After Effects keyframes into a JSON array of timeline events, so that any adjustment to the video also updates their timeline. To play back what users draw on the server side, they needed a system that could feed the recorded user input, frame by frame, through the same state logic as on the client side. The front end is written in JavaScript and consists mainly of React and Redux.
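The exported format isn't published, but a JSON array of timeline events might look something like the sketch below; the field names and the lookup helper are illustrative assumptions, not the team's actual schema.

```js
// Hypothetical sketch of a timeline exported from After Effects keyframes.
// The interface can advance its state by looking events up against the video clock.
const timeline = [
  { time: 0.0,  type: 'scene',  name: 'intro' },
  { time: 12.5, type: 'enable', name: 'drawing' },
  { time: 47.0, type: 'scene',  name: 'office' },
];

// Return every timeline event that falls between the previous and current video time,
// so the interface state stays in sync with playback.
function eventsBetween(previousTime, currentTime) {
  return timeline.filter((e) => e.time > previousTime && e.time <= currentTime);
}
```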

Initially, Moniker and Studio Puckey developed an internal state-management library called State Keeper. But soon after they built their first prototype, Dan Abramov released his Redux state management library, which prompted them to ditch their own code and do a rewrite on top of Redux.

Redux is a Flux-inspired, predictable state container. The idea is that instead of changing values in your state directly, users fire off actions by interacting with the interface. Reducers turn those actions into a new state, and the changed state is rendered back into the interface. This circular render logic allows for easy debugging and, most importantly for this project, server-side playback.
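As a rough illustration of why this matters for playback, here is a minimal Redux sketch; the reducer, action names, and state shape are invented for the example. Because the reducer is a pure function, the same code can run in the browser and in Node, so a recorded session can be replayed frame by frame on the server.

```js
// Minimal sketch (not the project's actual reducer): the same reducer runs in the
// browser and in Node, so recorded actions can be replayed on the server.
const { createStore } = require('redux');

// Hypothetical drawing state: an array of lines, each a list of points.
function linesReducer(state = { lines: [] }, action) {
  switch (action.type) {
    case 'START_LINE':
      return { lines: [...state.lines, [action.point]] };
    case 'ADD_POINT': {
      const lines = state.lines.slice();
      lines[lines.length - 1] = [...lines[lines.length - 1], action.point];
      return { lines };
    }
    default:
      return state;
  }
}

// Replay a recorded session: dispatch the actions belonging to each video frame
// and capture the resulting state snapshot, frame by frame.
function replaySession(actionsByFrame) {
  const store = createStore(linesReducer);
  return actionsByFrame.map((frameActions) => {
    frameActions.forEach((action) => store.dispatch(action));
    return store.getState();
  });
}
```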

The website consists of two layers: an embedded Vimeo movie and the drawing interface on top of it. React allowed the team to prototype the interface quickly and swap out components as they finalised the project, without getting stuck in an intertwined mess of spaghetti code. The production version of the code uses Preact, which implements the React API in just 3 kB.
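A minimal sketch of that layering, assuming Preact and using placeholder dimensions and video id (these are not the project's actual components):

```js
// Illustrative sketch only: the Vimeo player sits underneath and the drawing
// canvas is absolutely positioned on top of it.
import { h, render } from 'preact';

function DrawingOverlay() {
  return h('div', { style: 'position: relative; width: 1280px; height: 720px' }, [
    h('iframe', {
      src: 'https://player.vimeo.com/video/VIDEO_ID', // placeholder video id
      width: 1280,
      height: 720,
      frameBorder: 0,
    }),
    h('canvas', {
      width: 1280,
      height: 720,
      style: 'position: absolute; top: 0; left: 0',
    }),
  ]);
}

render(h(DrawingOverlay), document.body);
```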

As the user plays with the website, the site records all input events and uploads them to S3 in JSON format using the team's Unique S3 Uploader utility library.
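The Unique S3 Uploader library's API isn't documented here, so the sketch below only shows the general idea: buffer input events together with the video frame they occurred on, then upload the whole session as JSON. The presigned URL and the currentFrame() helper are assumptions, not part of their code.

```js
// Generic sketch of session recording (not the Unique S3 Uploader library).
const canvas = document.querySelector('canvas');
const session = [];

canvas.addEventListener('pointermove', (event) => {
  session.push({
    frame: currentFrame(), // assumed helper returning the current video frame
    type: 'ADD_POINT',
    point: { x: event.offsetX, y: event.offsetY },
  });
});

// Upload the recorded session as a single JSON document, e.g. via a presigned S3 URL.
async function uploadSession(presignedUrl) {
  await fetch(presignedUrl, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(session),
  });
}
```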

When they re-render the video each hour, they first validate the received JSON data to make sure nobody has uploaded anything that could crash the render. Then they fire up a Node.js worker process for each CPU core, which converts each user session into state objects, frame by frame. These individual state frames are stored in a Redis in-memory store.
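A generic sketch of that worker pattern, assuming Node's cluster module and the node-redis client; loadSessions and replaySession are hypothetical helpers (the latter being the frame-by-frame replay sketched above):

```js
// Sketch of the general pattern, not the project's actual worker code:
// fork one worker per CPU core, replay that worker's share of sessions into
// per-frame states, and write the states into Redis.
const cluster = require('cluster');
const os = require('os');
const redis = require('redis');

if (cluster.isMaster) {
  // One worker per core; each worker picks up a slice of the session files.
  os.cpus().forEach(() => cluster.fork());
} else {
  (async () => {
    const client = redis.createClient();
    await client.connect(); // node-redis v4 API

    // loadSessions / replaySession are assumed helpers for this sketch.
    for (const session of loadSessions(cluster.worker.id)) {
      const states = replaySession(session.actionsByFrame);
      for (let frame = 0; frame < states.length; frame++) {
        await client.set(
          `session:${session.id}:frame:${frame}`,
          JSON.stringify(states[frame])
        );
      }
    }
    await client.quit();
  })();
}
```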

To render the video, each worker queries the state for each frame and, with the help of node-canvas, uses the same drawing logic as the front end to render the individual video frames out to PNG files. These are then compiled into a video using FFmpeg and uploaded to Vimeo for compression and hosting.
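A hedged sketch of that last step with node-canvas, reusing the invented state shape from the Redux example; the frame size, output path, and drawing style are placeholders:

```js
// Illustrative sketch using node-canvas: draw one frame's state with ordinary 2D
// canvas calls and write it out as a PNG (assumes a frames/ directory exists).
const fs = require('fs');
const { createCanvas } = require('canvas');

function renderFrame(state, frameNumber) {
  const canvas = createCanvas(1280, 720);
  const ctx = canvas.getContext('2d');

  // Stroke every recorded line from the replayed state.
  state.lines.forEach((line) => {
    ctx.beginPath();
    line.forEach((p, i) => (i === 0 ? ctx.moveTo(p.x, p.y) : ctx.lineTo(p.x, p.y)));
    ctx.stroke();
  });

  fs.writeFileSync(
    `frames/frame-${String(frameNumber).padStart(6, '0')}.png`,
    canvas.toBuffer('image/png')
  );
}
```

The resulting PNG sequence can then be assembled with a command along the lines of `ffmpeg -framerate 25 -i frames/frame-%06d.png -c:v libx264 -pix_fmt yuv420p out.mp4` before the file is sent to Vimeo.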

Project Page | Moniker | Studio Puckey
