The Wolf Onboard

Not so many updates recently as I’ve been working my arse off on quite a large project – not to mention moving and getting married!

The past few months have seen me working in collaboration with Galina Rin of the alternative rock/metal act Death Ingloria. A few months ago I made a lyric video for her after we discussed our joint desire to create a video for each song on an album. I was creating videos for Khaidian while she was in the process of recruiting artists and animators to work on the ‘art’ portions of her project. Once I completed the video for Silent Running Engaged, a collaboration using the artwork of Anne Bengard, I got to work on the main project for Rin.

 

The main concept for the videos is using a comic that Rin produced in collaboration with 2000AD writer Hilary Robinson and artist Nigel Dobbyn. A tale of the fall of mankind through genetic engineering and a malevolent A.I., the story follows the album that Death Ingloria has produced, “The Wolf Onboard”. Despite the band being a one-person outfit, the final result is a surprisingly cohesive beast, and it was my job to bring Dobbyn’s illustrations to life.

This has meant taking each of the seven pages of Dobbyn’s work, splitting them into layers and recreating the backgrounds, elements and figures that are key to the story. Using both the puppet tool and DuIK, I have animated the figures so they have some dynamism as they appear within the action. Some pages have been a challenge as they are relatively stark, meaning that some inventive use of the camera has been necessary, along with creative placement of the lyrics.

 

On top of all the animation for the actual releases, I am also producing versions for her live performance, which uses a round screen back-projected from a short-throw projector. The videos are masked into a circle to fit the screen, which has meant editing some of the original camera angles and positions. I have a tendency to place key items at the side of the frame, so adjusting this was vital. We also decided to place some lyrics within the video, which meant additional thinking about timing, size and placement. Live, I have been running Resolume Arena with a main Mac controlling the timing via MIDI. Initially I used it for projection mapping and masking onto the screen, but given that Rin will eventually be running without this setup from one Mac, I decided to simply mask off the projection at the rendering stage.
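The circular masking itself is simple to describe: every pixel outside the largest centred circle goes transparent. Here is a minimal, pure-Python sketch of that idea (the real renders were masked in the compositing app at the rendering stage, not with a script; the frame size and optional feathered edge here are illustrative):

```python
# Sketch: build a circular alpha mask for fitting rendered frames to a
# round projection screen. Pure Python for clarity; in practice this
# would be an alpha channel applied in the compositing software.

def circular_mask(width, height, feather=0):
    """Return a row-major list of alpha values (0-255): opaque inside the
    largest centred circle that fits the frame, transparent outside."""
    cx, cy = (width - 1) / 2, (height - 1) / 2
    radius = min(width, height) / 2
    mask = []
    for y in range(height):
        for x in range(width):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            if feather:
                # linear falloff over the feather band for a soft edge
                a = max(0.0, min(1.0, (radius - d) / feather))
            else:
                a = 1.0 if d <= radius else 0.0
            mask.append(round(a * 255))
    return mask

mask = circular_mask(8, 8)
print(mask[0], mask[4 * 8 + 4])  # 0 255 — corner transparent, centre opaque
```

Baking the mask into the render like this means the playback machine only has to play a flat video file, with no live masking step.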

You can find more information on Death Ingloria here:

facebook.com/deathingloria
patreon.com/deathingloria

Death Ingloria’s Album launch for “The Wolf Onboard” is on the 16th November 2017 at The New Cross Inn, London. EVENT LINK



Videogame music video

Ok, another idea: create a video game that doubles as a music video. It doesn’t need to be super flashy, in fact, if it has a retro feel that may be even better. Open it up to people on mobile platforms and there’s a chance it might do something a little more viral than just the usual kind of video.

Seems like a couple of people have already explored this hybrid genre:

Qvalias game/video

 

It looks like it might be possible to create something in a program called GameMaker. It’s a pretty interesting way of creating stuff and may be a good way of developing something retro and simple at some speed.

v.1

Some more thoughts

Here’s Linkin Park’s video, where they use similar technology. In the Khaidian video for Martyrdom, the footage looks slightly blocky because I’ve been forced to shoot only full-body shots of the band (due to the 360-degree view) on a low-resolution camera. The Kinect v.1 is low-res compared to the newer Kinect for Xbox One, which would have greatly improved the clarity of the image, but the limitations of RGBDToolkit mean I’m limited to the first iteration.

https://www.fxguide.com/featured/beautiful-glitches-the-making-of-linkin-parks-new-music-vid/

Interestingly, I may end up using the beta for ‘Depthkit’, which supports the Kinect v.2 and would be really interesting to use.

Leap of Faith

This is me demonstrating the setup for the installation, using Ableton Live to control the music and trigger the video at the right times. The Leap simultaneously controls effects in Resolume to reflect the changes in the video.

The player can then control what happens with the remix, essentially creating music and visuals without having any real knowledge of how they did it!

Battling technology.

My first render went a little wrong. I wanted to export it as a PNG image sequence, for its lossless nature, but I accidentally started it as a PNG-encoded .MOV, which is the same codec but in a .MOV wrapper. Unfortunately, some 60 hours in and toward the end of the render, I noticed a problem with some of the animation not triggering. I clicked further along in the timeline and the whole thing crashed. This meant I lost a good portion of work, as I’ve been unable to recover the 9.5 GB of footage. At this point I remembered the other reason why it’s so useful to render as an image sequence.
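That other reason is crash recovery: with an image sequence, every completed frame survives a crash, and you can scan the output folder for gaps and re-render only the missing frames. A small sketch of that check (the `frame_0001.png` naming convention is an assumption, not taken from the actual project):

```python
# Sketch: after a crashed render, find which frames of an image
# sequence are missing so only those need re-rendering. Hypothetical
# filename pattern "frame_NNNN.png".
import re

def missing_frames(filenames, first, last):
    """Return frame numbers in [first, last] absent from the sequence."""
    pattern = re.compile(r"frame_(\d+)\.png$")
    present = {int(m.group(1)) for f in filenames
               if (m := pattern.search(f))}
    return [n for n in range(first, last + 1) if n not in present]

rendered = ["frame_0001.png", "frame_0002.png", "frame_0004.png"]
print(missing_frames(rendered, 1, 5))  # [3, 5]
```

A single-file .MOV, by contrast, is usually unrecoverable if the write is interrupted, which is exactly what happened here.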

Exporting at 4K in H.264 is also a pain in Media Encoder. You have to stick everything on High (profile and level) or your frame size is choked to around 2,200 pixels. Something to watch out for when using 4K in the future.
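That ~2,200-pixel ceiling is consistent with H.264’s level limits: each level caps the frame at a maximum number of 16×16 macroblocks (MaxFS, from Annex A of the H.264 spec), and UHD 4K only fits from Level 5.1 upwards. A small sketch of the check (only three levels listed for brevity):

```python
# Sketch: which H.264 level a given frame size needs. MaxFS values are
# the per-level macroblock-per-frame caps from the H.264 spec, Annex A.

MAX_FS = {"4.2": 8704, "5.0": 22080, "5.1": 36864}  # macroblocks/frame

def min_level(width, height):
    """Smallest listed H.264 level whose MaxFS admits the frame."""
    macroblocks = -(-width // 16) * -(-height // 16)  # ceil-divide each axis
    for level, cap in sorted(MAX_FS.items()):
        if macroblocks <= cap:
            return level
    return None  # frame too large for any listed level

print(min_level(1920, 1080))  # 4.2 — plain 1080p fits easily
print(min_level(3840, 2160))  # 5.1 — UHD needs Level 5.1
```

A UHD frame is 32,400 macroblocks, over Level 5.0’s 22,080 cap, so leaving the encoder on a lower level silently clamps the resolution.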

OCTANE REFERENCE SETTINGS: http://helloluxx.com/news/octane-render-kernel-settings-reference-library/

When I had my presentation, the tutors told me to watch out for it feeling too floaty. I agree, and think that the plexus layers will liven things up, but there are fairly good reasons why you don’t do dramatic or jerky camera movements: guidelines for good VR according to Oculus.

http://www.cineversity.com/vidplaytut/render_virtual_reality_videos_with_cinema_4d_create_youtube_using_octane

https://www.freeflyvr.com/freefly-vr-how-it-works/

http://resolume.com/forum/viewtopic.php?t=4050

 

And on it goes

For the last few days I’ve been going crazy attempting to sort out issue after issue. It’s all a bit of a blur, so this may not be in any order.

Knowing that I was going to use a render farm and wanting to keep render times down, I knew I needed to bake some of my animations. A few of the plugins I use produce MoGraph-type animations; to ensure the farm renders exactly the animation I created, I need to bake these down to individual polygons before sending the files off.

I’ve been using Greyscale Gorilla‘s excellent plug-in called ‘Transform’. It breaks apart your models into polygons or chunks, and in nice new inventive ways. I was having issues with baking the ‘poly mode’ animations. GSG has already produced a tutorial for baking ‘chunk mode’, but nothing for poly. After a lot of research I decided that I would get in contact with Greyscale Gorilla themselves and ask how to tackle this seemingly simple but as far as I was concerned, impossible task. Brilliantly, Chris Schmidt sent back a solution for me.

1. Unhide the GSG layers.
2. Make “PolyFXInstance” editable.
3. Select ALL polygons of this model.
4. Disconnect (uncheck Preserve Groups).
5. Add a Point Cache tag (Character tags menu).
6. Store State.
7. Calculate.

At this point you can turn off the PolyFX and the Effector!

Great stuff!

Part, the second-

Render farms. Ugh. So, I know that I am simply not rendering this myself. After some quick calculations I worked out that it would take at least three months of 24/7 rendering on my own machine to get this thing finished.

Having to produce images at four times the size of 1080p, so that the visuals are high enough resolution while stretched 360 degrees around you, takes some time to process. It entails creating an ‘equirectangular’ projection, as explained in this tutorial: Octane 360 in C4D
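The equirectangular idea itself is just the world-map projection applied to the viewing sphere: each output pixel corresponds to a latitude/longitude around the camera, so a full 360-degree view flattens into a 2:1 frame. A minimal sketch of the mapping (the 4096×2048 frame size and axis convention are illustrative, not the project’s actual settings):

```python
# Sketch: map a 3D view direction to pixel coordinates in an
# equirectangular frame. Convention assumed here: +z straight ahead,
# +y up, unit-length direction vector.
import math

def direction_to_pixel(x, y, z, width, height):
    """Map a unit view direction to (u, v) in an equirectangular frame."""
    lon = math.atan2(x, z)   # -pi..pi, left/right around the sphere
    lat = math.asin(y)       # -pi/2..pi/2, down/up
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# straight ahead lands in the centre of the frame
print(direction_to_pixel(0, 0, 1, 4096, 2048))  # (2048.0, 1024.0)
```

This is why the frames need so many pixels: the whole sphere shares one image, so the part actually in front of the viewer at any moment is only a fraction of the rendered resolution.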

One of the great things about Octane is its use of onboard GPU power; it’s one of the main reasons I use it now. Having purchased a GTX 970 video card, I found that it sped up rendering a lot. The added viewport window is also brilliant, especially as C4D’s perspective wireframe is slooooooow at times. The biggest issue appears to be support from render farms: OTOY usually don’t give out licenses for render farms, as they have aimed their engine at people who want to use their own PC and video cards. It’s totally scalable, which means each additional card adds exactly what that card would be capable of on its own, i.e. two GTX 970s are twice as effective as one, etc…
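That linear scaling makes render-time estimates straightforward. A quick sketch using the post’s own ballpark figure of roughly 90 days on a single card (the card counts here are illustrative, and real multi-GPU setups with mixed cards won’t scale perfectly):

```python
# Sketch of the linear-scaling claim: Octane's throughput adds up
# across GPUs, so N similar cards cut render time roughly by N.
# The 90-day single-card figure is this post's own rough estimate.

def scaled_days(single_card_days, n_cards):
    """Estimated wall-clock days assuming perfectly linear GPU scaling."""
    return single_card_days / n_cards

print(scaled_days(90, 1))  # 90.0 — one card, ~3 months
print(scaled_days(90, 6))  # 15.0 — six similar cards
```

Which is exactly why renting a GPU-heavy server makes sense: stacking cards buys time back almost one-for-one.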

This has meant I’ve had to go shopping for my render-farm requirements. After a LOT of searching I’ve settled on a Polish company called ULTRARENDER. To be fair, they’ve been very helpful and have already gone beyond what they needed to do. The reason ULTRARENDER is different is that you rent a server filled with high-spec video cards rather than very fast processors and RAM. The machine I’m using currently has six GTX 980s and a Tesla, which makes it pretty nippy. It’s not cheap, though: I’m spending around £700 for a week’s rental. Hopefully I’ll have enough time to get at least one sequence finished for my presentation. That would be around 1 min 30 s, which fits nicely within my 3-minute slot.

Mostly, so far, I’ve been installing programs on the server using TeamViewer, which allows remote operation of the server so I can install everything I need. I’ve then moved everything from Dropbox, where it was stored earlier, to the server to render. The problem is that the licenses for various bits of software have been a total pain.

OTOY allow you to disconnect your program from one computer to use it on another, so no problem (other than the hour wait per deactivation/activation).

With Greyscale Gorilla there’s no issue at all: just install!

Maxon, on the other hand, don’t allow me to use my student copy of C4D on any machine other than the one it was originally set up on. I can use it, but the resolution is choked at 800×600 and I can’t save PNG sequences (which I need to). So I’m now uploading an older copy I have to try and make that work… continual installing. This sucks.
