In October 2012, Large Blue, an agency in Covent Garden, London, commissioned me to collaborate on a Kinect-driven interactive for their client, Alstom. The finished piece was exhibited at Alstom’s annual company dinner at the Science Museum. It was a fun but challenging project, and we hope to get further funding from Alstom to develop it into a fully fledged piece of educational software.
In February I was approached by the nice chaps at BBC R&D in Manchester
to work on their custom light painting system.
The system takes an SDI feed from a live camera and applies GLSL shaders to
the video in real time. I produced an application to run on the BBC’s bespoke
broadcast systems, and developed a long exposure GLSL fragment shader for the light painting effect.
The system was first used to produce moving long exposure shots for the
fantastic NVA: Speed of Light Salford event. I think it’s fair to say that
the event was a big success and everyone was generally quite happy.
The whole process was documented on Blue Peter – check it out!
In September 2012, I developed a Kinect-based touchless interactive display for EE, the UK’s first 4G/LTE network. The display features user outline tracking and various particle systems which display fonts, icons and user interaction feedback. I was a lead developer on the build, along with the awesome Michael Lawrence and Jonathon Curtis. Publicis Chemistry’s agency creatives were David Prideaux (ECD), Paul Westmooreland (AD) and Neame Ingram (CW).
The passing user’s skeleton and outline are tracked by a Kinect sensor, mapped to particle repulsors, and then used to influence the movement of a sprung particle grid. There is also some gesture recognition going on, which we developed from scratch. We used the Kinect for Windows SDK from Cinder in conjunction with OpenCV, made heavy use of Cinder’s Timeline classes, and built a few custom animation classes to make our particles easier to manage and control.
Our main challenges were:
– Aggressive timescales (approximately 10 weeks from concept to deployment)
– Complex branding guidelines, set out by Wolff Olins, determining the behaviour of the particles
– Simultaneous deployment in 20 different locations, all of which needed custom installations
– Gesture recognition as a central part of the brief
I’d like to express my gratitude to Stephen Schieberl for publishing his work on the Kinect SDK CinderBlock, which this project relied upon heavily. Thanks dude! You rule!
Also thanks to Andrew Bell for his tireless work on Cinder and his very helpful responses to my various questions.
The project was commissioned by EE, and produced by London based agency Publicis Chemistry.
In June 2012, I was asked to join a large group of installation artists working on a top secret project in a massive warehouse near Euston, in London.
Secret Cinema’s team of artists, performers, choreographers and production designers created a fully immersive live cinema experience, which I was happy to be a part of.
In response to the brief, I created an emotional programming lab for the David 8 android, which featured various sculptures, live interactions, and a sophisticated Kinect/OpenNI-driven threat recognition system.
Didn’t get to meet Ridley Scott though.
In November 2008, I worked with Moving Brands to create an animated brand mark for global media and marketing services company MindShare Worldwide.
The work involved creating several animated versions of the brand mark for deployment across various media. I also produced a screensaver showcasing the new branding.
In September 2009, Albion London approached me to create a series of generative artworks for their client, Atomico Ventures.
The brief was to create a generative system which would take data from a client project (such as company share price, number of subscribers, or the temperature in their office – anything they could think to provide, really) and create an artwork containing a visual representation of that data.
Working with their design team, I created four different pieces, each of which takes arbitrary feed data and incorporates it into a moving artwork. Each user can also select which data/artwork combination they see using the control panel in the top right.
This is a video from the Flash/Arduino workshop I led for tinker.it in April 2008. In the video, I’m demoing my Flash-based RFID reader – the hardware drivers are all written in Flash, and should be totally platform independent. I really liked the idea of writing hardware drivers in Flash.
Basically, Flash is controlling the hardware, then reading back the 4-byte public unique ID of a standard Oyster card and rendering it as a colour and a rotation value.