Really interesting use of motion control. Rather than the Leap Motion, she's using MiMu gloves.

She’s playing soon I think…


Leap of Faith

This is me demonstrating the setup for the installation, using Ableton Live to control the music and trigger the video at the right times. The Leap simultaneously controls effects in Resolume so they reflect the changes in the video.

The player can then control what happens with the remix, essentially creating music and visuals without having any real knowledge of how they did it!

Resolving Resolume’s reactiveness

My initial plan had been to use Ableton Live and Resolume Arena together, as I wanted the viewer to actually remix on the fly. Initially I thought it would be great to trigger video and sections of songs. Attempting to put this together, I found that the Leap Motion only sends CC messages (a continuous stream of data with values between 0 and 127) rather than note data (a single button press, if you will, usually used to trigger samples or video clips). There may be ways around this, but for time and ease I've decided to stick to a pre-arranged sequence and manipulate the video and audio with effects and plugins.
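For anyone curious about the distinction, here's a rough sketch of the two message types as raw MIDI bytes (a minimal illustration of the MIDI spec, not Geco's actual output):

```python
def control_change(channel, controller, value):
    """A CC message: status byte 0xB0 + channel, then a controller
    number and a continuous value in the range 0-127."""
    return bytes([0xB0 | channel, controller, value])

def note_on(channel, note, velocity):
    """A note message: status byte 0x90 + channel, then note number and
    velocity -- the discrete 'button press' that clip launchers expect."""
    return bytes([0x90 | channel, note, velocity])

# A hand gesture streams many CC messages; a clip trigger needs one note-on.
print(control_change(0, 1, 64).hex())  # b00140
print(note_on(0, 60, 100).hex())       # 903c64
```

Because the Leap streams the first kind and clip launching wants the second, triggering clips directly from hand motion isn't a straightforward mapping.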

Attempting to put this all together, I've encountered some issues. I originally exported my video at 1080p, as I wanted it to be as sharp as possible. I had to convert my video into DXV format, which is Resolume's proprietary codec and apparently allows it to play back better. Previously I've found that a JPG sequence works well playing back in Ableton Live's video window. I'm avoiding H.264 as, although it's great for streaming, it's terrible for programs like Resolume or Ableton. I've found, though, that the 1080p video is a little jerky, so I may try it at 720p.

Setting up the controls for both Resolume and Ableton has been tricky, as I've had to go via Geco, a MIDI controller app for the Leap Motion. It does work well, but there are so many options open to you that you don't know how people are going to use it, especially as they have no proper training and it's just "wave your hands over this thing". I'm considering using a small picture which shows what to do, without stopping people from experimenting. Presently I have various controls mapped to movements, with a distortion on the video matching a distortion on the audio. This is proving very tricky, though, as I'm manipulating several streams of video and audio.

All this said, I think I've figured out the problem: the controller would jump channel numbers if I mapped it to another effect. I can solve this by mapping to Resolume's control panel and then matching that to a global effect rather than a specific one.

Videos for People

Here are some of the video loops and sections I've created for the backdrop video for 'Trigger the Landslide'. These will also be used within the Leap Motion controlled installation. Most of them have been made using Sound Keys within After Effects. Some may not have sound, as they may have only been tracked to a click.

Render farm

A possible option for getting all this stuff rendered would be a render farm. It would cost but it would also get it done.

Look at these guys:

At least I can get an estimate for a job once I know what I'm doing.

A method of control

I've been doing a lot of thinking about the control method for the audio/visuals. I now have the Orbit controller and have tried some experiments with the audio side of things. It works great, but I can see it potentially being an issue for people who are not musicians or aren't happy to experiment with button presses. There is also the possibility of people breaking the equipment, which would ruin it for others.

An alternative that I've been researching is a device called the 'Leap Motion'. This is a unit that uses motion control similar to things like the Kinect camera, but an awful lot more accurate, mainly used for hands as opposed to full body tracking. A great thing about it is that I can install it into a pedestal and it has no moving parts. People should be able to experiment and play around with their hand motions to see what effect different actions have on both audio and video. It'll allow for a lot of experimentation from the viewer and even push them to become more immersed.

Hand position, orientation, movement, and speed are all measured for both hands, so the amount of control offered is really quite vast. Using a program called Geco for Leap Motion, it's possible to turn your hand movements into MIDI. This MIDI can then be sent to both Ableton Live and Resolume to control the audio and video, respectively.
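As a rough sketch of the kind of mapping Geco performs, here's a hand height scaled to a CC value (the millimetre range here is an illustrative assumption, not a Leap Motion specification):

```python
def height_to_cc(height_mm, lo=100.0, hi=500.0):
    """Scale a hand height in millimetres (within an assumed tracking
    range lo..hi) to a MIDI CC value between 0 and 127."""
    clamped = max(lo, min(hi, height_mm))
    return round((clamped - lo) / (hi - lo) * 127)

print(height_to_cc(100))  # 0   (hand at the bottom of the range)
print(height_to_cc(500))  # 127 (hand at the top of the range)
```

The same idea extends to palm tilt, left/right position, and speed, each feeding its own CC number.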

Craig Winslow created an interactive installation, 'Growth', which incorporates the Leap Motion:

A potential issue with this is tracking from the unit itself; until I try some experiments, I have to assume there may be problems with lighting. One thing I noticed about the 'Growth' exhibit is that the pedestals have a small light projecting upward onto the hand, which should help the camera pick up the hand movements.



One issue with a live remix by people who aren't musicians is timing: chaos, dissonance, and really not knowing what to do.

A couple of things that might be worth taking into account:

– If using something like the Launchpad, a live metronome playing back should give people a visual indication of timing.

– Hierarchies of samples: certain samples, when played back, will cut off other samples.
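That sample hierarchy could be sketched as a simple "choke group", where triggering a sample silences anything lower in the pecking order (the sample names and priorities below are hypothetical, just to show the idea):

```python
class ChokeGroup:
    """Sketch of a sample hierarchy: triggering a sample cuts off any
    currently-playing samples with a lower priority."""

    def __init__(self, priorities):
        self.priorities = priorities  # sample name -> priority (higher wins)
        self.playing = set()

    def trigger(self, name):
        # Find playing samples ranked below the new one and stop them.
        cut = {s for s in self.playing
               if self.priorities[s] < self.priorities[name]}
        self.playing -= cut
        self.playing.add(name)
        return sorted(cut)  # the samples that should stop now

group = ChokeGroup({'kick': 2, 'hat': 1, 'pad': 0})
group.trigger('hat')            # nothing playing yet, nothing cut
print(group.trigger('kick'))    # ['hat'] -- the kick chokes the hat
```

Something like this keeps a non-musician's mashing from piling every sample on top of every other.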
