Royal Chant: Hearing Voices Movement
This was fun! Mark & Wade shot some footage and sent it over. I made some visuals in Processing, then ran everything through a Python script to do the neural style transfer. Download scripts and models here. This approach is very compute intensive: I was using 3 computers to render and it still took a couple of days. In retrospect I could have planned better and rendered only the footage I actually needed, but I am not that sort of video put-togetherer person. Anyway, eventually I had a bunch of .movs that I then tried to import into Premiere… which meant another day of Handbrake to .mp4 everything.
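If you're wondering how the 3-computer thing works: it's nothing clever, just dividing the frame list between machines. A minimal sketch (the function name and round-robin split are mine, not from the actual scripts linked above):

```python
def assign_frames(frames, n_workers):
    """Round-robin split of a frame list across render machines,
    so each machine stylizes roughly the same number of frames."""
    return [frames[i::n_workers] for i in range(n_workers)]

# e.g. 12 frames across the 3 machines I was using
frames = [f"frame_{i:05d}.png" for i in range(12)]
for worker, chunk in enumerate(assign_frames(frames, 3)):
    print(worker, chunk)
```

Each machine then just runs the style-transfer script over its own chunk.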
Initially I was rendering at 30 FPS, but watching it gave me a headache; 5 FPS, with the amount of color and content the neural transfer adds to each frame, feels perfect. There are newer approaches that are temporally consistent and look really great, but a bit too slick for RC.
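Going from 30 FPS to 5 FPS also cuts the render time by 6x, since you only stylize every 6th source frame. A quick sketch of that arithmetic (names are mine, purely illustrative):

```python
def frames_to_stylize(total_frames, src_fps=30, target_fps=5):
    """Indices of the source frames to run through style transfer
    when downsampling src_fps footage to target_fps."""
    step = src_fps // target_fps  # 30 / 5 -> keep every 6th frame
    return list(range(0, total_frames, step))

# 2 seconds of 30 FPS footage (60 frames) -> only 10 frames to render
print(len(frames_to_stylize(60)))  # → 10
```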