Not so many updates recently as I’ve been working my arse off on quite a large project – not to mention moving and getting married!
The past few months have seen me working in collaboration with Galina Rin of the alternative rock/metal act Death Ingloria. A few months ago I made a lyric video for her after we discussed our joint desire to create videos for each song on an album. I was creating videos for Khaidian, and she was in the process of getting artists and animators to work on the ‘art’ portions of her project. Once I completed the video for Silent Running Engaged, a collaboration using the artwork of Anne Bengard, I got to work on the main project for Rin.
The main concept for the videos is a comic that Rin produced in collaboration with 2000AD writer Hilary Robinson and artist Nigel Dobbyn. A tale of the fall of mankind brought about by genetic engineering and a malevolent A.I., the story follows the album that Death Ingloria has produced, “The Wolf Onboard”. Despite the band being a one-person outfit, the final result is a surprisingly cohesive beast, and it was my job to bring Dobbyn’s illustrations to life.
This has meant taking each of the seven pages of Dobbyn’s work, splitting them into layers and recreating the backgrounds, elements and figures that are key to the story. Using both the puppet tool and DuIK, I have animated the figures so that they have some dynamism as they appear within the action. Some pages have been a challenge as they are relatively stark, so some inventive use of the camera has been a necessity, along with inventive placement of the lyrics.
On top of all the animation involved for the actual releases, I am also producing versions for her live performance, using a round projection screen back-projected by a short-throw projector. The videos are masked into a circle so as to fit the screen, but this has meant some editing of the original camera angles and positions. I have a tendency to place key items to the side of the frame, so adjusting this was vital. We also decided to have some lyrics placed within the video, which has meant some additional thinking about timing, size and placement. Live, I have been running Resolume Arena, with a main Mac controlling the timing via MIDI. Initially I intended to use Resolume for projection mapping and masking onto the screen, but given that Rin will eventually be running without this setup from one Mac, I decided to simply mask off the projection at the rendering stage.
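As an aside, the circular mask itself is simple geometry. A minimal sketch (my own illustration, not the actual render pipeline — frame size and centring are assumptions): a pixel survives the mask only if it sits inside the largest circle that fits the frame.

```python
# Sketch of the circular-mask test for rendering onto a round screen.
# Assumes a 1920x1080 frame with the circle centred in the frame and
# a diameter equal to the shorter frame dimension.

def inside_circular_mask(x, y, width=1920, height=1080):
    """Return True if pixel (x, y) survives the circular mask."""
    cx, cy = width / 2, height / 2      # circle centre = frame centre
    radius = min(width, height) / 2     # largest circle that fits
    return (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2

# The frame centre is always visible; the corners are always cut off,
# which is why anything framed to the side risks being lost.
print(inside_circular_mask(960, 540))   # centre -> True
print(inside_circular_mask(0, 0))       # top-left corner -> False
```

This also makes clear why my habit of framing key items off to the side was a problem: everything outside that central circle simply disappears.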
You can find more information on Death Ingloria here:
Death Ingloria’s Album launch for “The Wolf Onboard” is on the 16th November 2017 at The New Cross Inn, London. EVENT LINK
Ok, another idea: create a video game that doubles as a music video. It doesn’t need to be super flashy; in fact, a retro feel may be even better. Open it up to people on mobile platforms and there’s a chance it might do something a little more viral than the usual kind of video.
Seems like a couple of people have already explored this hybrid genre:
Looks like it might be possible to create something in a program called GameMaker. It’s a pretty interesting way of creating stuff and may be a good way of developing something retro and simple at some speed.
Some more thoughts
Here’s Linkin Park’s video where they use similar technology. In the Khaidian video for Martyrdom, the footage looks slightly blocky because I was forced to shoot only full-body shots of the band (due to the 360-degree view) on a low-resolution camera. The Kinect v1 is low-res compared to the newer Kinect for Xbox One, which would have greatly improved the clarity of the image, but the limitations of RGBDToolkit mean I’m limited to the first iteration.
Interestingly, I may end up using the beta for ‘Depthkit’, which supports the Kinect v2; that would be really interesting to try.
A really interesting use of motion control: Mi-Mu gloves rather than the Leap Motion.
She’s playing soon I think…
This is me demonstrating the setup for the installation, using Ableton Live to control the music and trigger the video at the right times. The Leap is simultaneously controlling effects in Resolume to reflect the changes in the video.
The player can then control what happens with the remix, essentially creating music and visuals without having any real knowledge of how they did it!
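Under the hood, the Leap-to-effect part of a setup like this boils down to normalising a hand position into a 0–1 parameter value that an effect can consume. A rough sketch, with entirely assumed range values rather than my calibrated setup:

```python
def leap_to_param(palm_y, y_min=80.0, y_max=400.0):
    """Map a Leap Motion palm height (mm above the sensor) to a 0..1
    effect parameter, clamped to the sensor's useful range.

    The y_min/y_max values here are illustrative assumptions, not
    measured from the actual installation.
    """
    t = (palm_y - y_min) / (y_max - y_min)
    return max(0.0, min(1.0, t))

# Mid-range hand height gives a mid-range effect value; out-of-range
# positions clamp rather than producing wild jumps in the visuals.
print(leap_to_param(240.0))  # -> 0.5
print(leap_to_param(0.0))    # -> 0.0 (below the useful range)
print(leap_to_param(999.0))  # -> 1.0 (above the useful range)
```

Clamping matters live: without it, a hand drifting out of the sensor’s sweet spot would slam the effect to extreme values.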
My first render went a little wrong. I wanted to export as a PNG image sequence, due to its lossless nature, but the render started as a .MOV PNG sequence, which is the same codec but in a .MOV wrapper. Unfortunately, toward the end of the render (60 hours or so), I noticed a problem with some of the animation not triggering. I clicked further along in the timeline and the whole thing crashed. This meant I lost a good portion of work, as I’ve been unable to recover the 9.5 gigs of footage. At that point I remembered the other reason why it’s so useful to render as an image sequence: the frames rendered before a crash survive as individual files.
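That recovery property is easy to exploit: with an image sequence you can scan the output folder for gaps and re-render only the missing frames. A minimal sketch (the filename pattern and padding are assumptions for illustration, not my actual project settings):

```python
import re

def missing_frames(filenames, first, last):
    """Return frame numbers in [first, last] that have no rendered file.

    Assumes frames are named like 'comp_00042.png' (a zero-padded
    frame number just before the extension).
    """
    rendered = set()
    for name in filenames:
        m = re.search(r"(\d+)\.png$", name)
        if m:
            rendered.add(int(m.group(1)))
    return [n for n in range(first, last + 1) if n not in rendered]

# A crash mid-render leaves gaps you can re-render instead of redoing
# the whole 60-hour job:
files = ["comp_00001.png", "comp_00002.png", "comp_00004.png"]
print(missing_frames(files, 1, 5))  # -> [3, 5]
```

With a single .MOV wrapper, by contrast, one corrupt file can take the whole render with it.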
Exporting at 4K in H.264 is also a pain in Media Encoder. You have to set everything to High or the frame size is choked to around 2,200 pixels. Something to watch out for when using 4K in the future.
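That ~2,200-pixel ceiling is consistent with how H.264 levels work: each level caps the frame size in 16×16-pixel macroblocks, and UHD only fits under Level 5.1, which is why the export settings have to be pushed up. A rough sketch of the check (the level table is abbreviated to a few common levels, with max-frame-size values from the H.264 spec):

```python
# Max frame size (in 16x16 macroblocks) for a few common H.264 levels.
MAX_MACROBLOCKS = {"4.0": 8192, "4.2": 8704, "5.0": 22080, "5.1": 36864}

def fits_level(width, height, level):
    """Return True if a width x height frame fits under an H.264 level."""
    mbs = -(-width // 16) * -(-height // 16)  # ceil-divide into macroblocks
    return mbs <= MAX_MACROBLOCKS[level]

print(fits_level(3840, 2160, "5.0"))  # -> False: UHD overflows Level 5.0
print(fits_level(3840, 2160, "5.1"))  # -> True: UHD needs Level 5.1
print(fits_level(1920, 1080, "4.0"))  # -> True: 1080p fits lower levels fine
```

So a lower-level preset that happily swallows 1080p will silently refuse or downscale a UHD frame, which matches the behaviour I hit.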
OCTANE REFERENCE SETTINGS: http://helloluxx.com/news/octane-render-kernel-settings-reference-library/
When I had my presentation, the tutors told me to watch out for it feeling too floaty. I agree, and I think the plexus layers will liven things up, but there are fairly good reasons why you don’t do dramatic or jerky camera movements: the guidelines for good VR according to Oculus.