A Case of Fours

It’s taken several months to complete this, as there were a lot of storyboard elements and a very definite sequence of events in Andrei Chikatilo’s life to work from. I didn’t want to stray too far from his genuinely unbelievable story, so I used it as the basis for the flow of the video.

 

A Case of Fours is based on Andrei Chikatilo, a Soviet serial killer nicknamed the Butcher of Rostov, the Red Ripper, and the Rostov Ripper, who committed the sexual assault, murder, and mutilation of at least 52 women and children between 1978 and 1990 in the Russian SFSR, the Ukrainian SSR and the Uzbek SSR. Chikatilo confessed to a total of 56 murders and was tried for 53 of these killings in April 1992. He was convicted and sentenced to death for 52 of them in October 1992, and was executed in February 1994.

Chikatilo was known by such titles as the Rostov Ripper and the Butcher of Rostov because the majority of his murders were committed in the Rostov Oblast of the Russian SFSR.

Made with After Effects, Vue, Daz Studio and Red Giant plugins.

https://thehydeproject.bandcamp.com


Black Rose Entertainment

I just spent the last couple of months filming and putting together a series of videos for the showgirl troupe Black Rose Entertainment. These guys are lovely and exceptionally talented, so well worth your time.

I did want to go a little further with all this, but time was against us. I filmed some footage using RGBDToolkit but didn’t manage to get any of it into the final edits. Possibly something to use later?

The choreography was originally set to different music, such as Rihanna and songs from Moulin Rouge. For copyright reasons, and wanting a sense of uniqueness, I decided to write three new songs specifically for the videos. It was a slight challenge to make the choreography match the new music, particularly in the final video, which incorporates all four dances and some fire work from Alice.

Videogame music video

Ok, another idea: create a video game that doubles as a music video. It doesn’t need to be super flashy; in fact, a retro feel might be even better. Open it up to people on mobile platforms and there’s a chance it could do something a little more viral than the usual kind of video.

Seems like a couple of people have already explored this hybrid genre:

Qvalias game/video

 

Looks like it might be possible to create something in a program called GameMaker. It’s a pretty interesting way of creating things and may be a good way of developing something retro and simple fairly quickly.

Prisma

So over the last few months I’ve noticed people using a new app on their photos that turns them into the closest approximation of a painting I’ve yet seen from an app.

Cool, so how do I turn it into video?

Looks like someone already figured it out:

 

This is basically a video turned into frames, with each frame then processed by Prisma. Once completed, the frames are reconstructed as a sequence.
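If I wanted to try the same thing, the frames-out/frames-in part is simple enough to script. Here’s a rough Python sketch, assuming ffmpeg is installed and using made-up file names – the styling step in the middle still has to be done in Prisma itself, since as far as I know it has no official desktop API:

```python
import subprocess
from pathlib import Path

# Hypothetical file names, just for illustration; assumes ffmpeg is on the PATH.
SOURCE = "input.mp4"
FRAME_DIR = Path("frames")      # raw frames pulled from the source video
STYLED_DIR = Path("styled")     # the same frames after styling in Prisma
OUTPUT = "styled_video.mp4"
FPS = 25                        # match the frame rate of the source footage

FRAME_DIR.mkdir(exist_ok=True)

# 1. Explode the video into numbered stills.
subprocess.run(
    ["ffmpeg", "-i", SOURCE, str(FRAME_DIR / "frame_%05d.jpg")],
    check=True,
)

# 2. Run each frame through Prisma (or any other styling tool) and save the
#    results into STYLED_DIR with the same numbering. This step is manual here.

# 3. Stitch the styled frames back together at the original frame rate.
subprocess.run(
    ["ffmpeg", "-framerate", str(FPS),
     "-i", str(STYLED_DIR / "frame_%05d.jpg"),
     "-c:v", "libx264", "-pix_fmt", "yuv420p", OUTPUT],
    check=True,
)
```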

I kinda like the first-person point of view here. It wasn’t what I had in mind at first – I was thinking more of a camera following someone – but it has sparked some ideas: a protagonist travelling somewhere, possibly being followed, with the first person adding to the paranoia.

Brain squeezings: suddenly holding soil; face paint – looks freaky in Prisma.

v.1

Some more thoughts

Here’s Linkin Park’s video, where they use similar technology. In the Khaidian video for Martyrdom, it looks slightly blocky because I was forced to shoot only full-body shots of the band (due to the 360-degree view) on a low-resolution camera. The Kinect v.1 is low res compared to the newer Kinect for Xbox One, which would have greatly improved the clarity of the image, but the limitations of RGBDToolkit mean I’m restricted to the first iteration.

https://www.fxguide.com/featured/beautiful-glitches-the-making-of-linkin-parks-new-music-vid/

I may end up using the beta of ‘Depthkit’, which works with the Kinect v.2 – that would be really interesting to try.

Leap of Faith

This is me demonstrating the setup for the installation, using Ableton Live to control the music and trigger the video at the right times. The Leap simultaneously controls effects in Resolume so the visuals reflect the changes in the audio.

The player can then control what happens with the remix, essentially shaping the music and visuals without having any real knowledge of how they did it!

Resolving Resolume’s reactiveness

My initial plan was to use Ableton Live and Resolume Arena together, as I wanted the viewer to actually remix on the fly. At first I thought it would be great to trigger video clips and sections of songs. Attempting to put this together, I found that the Leap Motion only sends through CC messages (a continuous stream of data with values between 0 and 127) rather than note data (a single button press, if you will, usually used to trigger samples or video clips). There may be ways around this, but for the sake of time and ease I’ve decided to stick to a pre-arranged sequence and manipulate the video and audio with effects and plugins.
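To show what I mean about the two message types, here’s a quick Python sketch using the mido library (not part of my actual setup, just for illustration). A note message is a one-off event that Ableton or Resolume can use as a trigger, while a CC message is one sample of a continuous stream; the threshold workaround at the bottom is the kind of thing I decided I didn’t have time to build:

```python
import mido

# A note message: a single, discrete event - the kind of thing Ableton or
# Resolume can use to trigger a clip or sample.
note = mido.Message('note_on', channel=0, note=60, velocity=100)

# A CC message: one sample of a continuous stream, value 0-127 - what the
# Leap actually sends as a hand moves through the air.
cc = mido.Message('control_change', channel=0, control=1, value=64)

# A crude workaround (not what I ended up doing): watch the CC stream and
# fire a note whenever the value rises past a threshold.
THRESHOLD = 100
previous_value = 0

def cc_to_trigger(msg):
    """Return a note_on when the incoming CC value crosses THRESHOLD upwards."""
    global previous_value
    trigger = None
    if msg.type == 'control_change':
        if previous_value < THRESHOLD <= msg.value:
            trigger = mido.Message('note_on', channel=msg.channel,
                                   note=60, velocity=127)
        previous_value = msg.value
    return trigger
```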

Attempting to put this all together, I’ve encountered some issues. I originally exported my video at 1080p because I wanted it to be as sharp as possible. I had to convert the video into DXV, Resolume’s proprietary codec, which apparently allows it to play back more smoothly. Previously I’ve found that a JPG sequence works well playing back in Ableton Live’s video window. I’m avoiding H.264, which, although great for streaming, is terrible for programs like Resolume or Ableton. I’ve found, though, that the 1080p video is a little jerky, so I may try it at 720p.
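If the 720p route fixes the jerkiness, the downscale itself is straightforward to script. A rough Python sketch, assuming ffmpeg is available and using made-up file names – the DXV conversion would still happen afterwards in Resolume’s own converter:

```python
import subprocess

SOURCE = "master_1080p.mp4"     # hypothetical file names
OUTPUT = "master_720p.mov"

# Scale the 1080p master down to 720p as a high-quality intermediate before
# converting to DXV inside Resolume. 'scale=-2:720' keeps the aspect ratio
# and an even width; ProRes HQ avoids the H.264 playback problems.
subprocess.run(
    ["ffmpeg", "-i", SOURCE,
     "-vf", "scale=-2:720",
     "-c:v", "prores_ks", "-profile:v", "3",
     "-c:a", "copy",
     OUTPUT],
    check=True,
)
```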

Setting up the controls for both Resolume and Ableton has been tricky, as I’ve had to go via Geco, a MIDI controller app for the Leap Motion. It works well, but there are so many options open to you that you don’t know how people are going to use it, especially as they have no training and it’s just “wave your hands over this thing”. I’m considering using a small picture that shows what to do, without stopping people from experimenting. Presently I have various controls mapped to movements – a distortion on the video matching a distortion on the audio. This is proving very tricky, though, as I’m manipulating several streams of video and audio.
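The matching-distortion idea boils down to scaling one incoming CC value onto two different parameter ranges at once. A tiny Python sketch of the maths (the ranges here are made up; in practice Geco, Resolume and Ableton handle this through their own MIDI mapping):

```python
def scale_cc(value, out_min, out_max):
    """Map a 0-127 MIDI CC value onto an arbitrary effect parameter range."""
    return out_min + (value / 127.0) * (out_max - out_min)

# One hand movement (one CC stream) driving both effects in parallel:
cc_value = 96                                    # example value from the Leap
video_distortion = scale_cc(cc_value, 0.0, 1.0)  # e.g. a Resolume effect amount
audio_drive = scale_cc(cc_value, 0.0, 24.0)      # e.g. distortion drive in dB
```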

All this said, I think I’ve figured out the problem: the controller would jump channel numbers if I mapped it to another effect. I can solve this by mapping to Resolume’s control panel and then matching that to a global effect rather than a specific one.