Three years ago my friend David invited me to participate in his Christmas Experiments project – an advent calendar with one code experiment a day. He gave me about two months' notice, so I had plenty of time, but I spent most of it just trying to have an idea. There were a bunch of false starts until one happy day when I googled ‘christmas gifs’ and found this:
The Way Christmas Makes Me Feel, by Elliot Dear
Half reindeer, half Michael Jackson. Can’t go wrong with that. My idea was to trace the silhouette of the character and then use that as a base to create visualizations in HTML Canvas. And that’s what I did. And people liked it. It was fun!
This year David invited me to join the Christmas Experiments again, but this time I had only 3 weeks. I knew I had to start straight away. The idea had to come fast. And it did. How about a tribute to my old experiment, but this time in 3D? Wow, such brilliant, very technology, much moves.
Next, a quick feasibility check. I downloaded Blender, watched a few tutorials on modeling and rigging, found a couple of generic male models, a couple of reindeer heads and, most importantly, I found this mocap:
TurboSquid: Dance MJ Loop 03 – Pelvic Thrust
OK, so all I had to do was throw all those ingredients in a pan and start cooking. I thought I would have a dancing model in a Three.js scene in a couple of days, and then I could go crazy on the shaders to make some cool visualizations.
I was wrong.
It started well. I learned the basics of modeling in Blender and was able to chop this guy’s head off and replace it with this deer head. He was looking cool. I called him mandeer.
I learned how to rig (following mainly these videos) and started testing some free mocap using the MakeWalk plugin for Blender.
Around this time I showed the prototype to Damien at work and he got interested. We discussed a few ideas for the sound and he was keen to work on it. From that point onward we were a duo. We wrote to David and told him it was going to be a collaboration.
That’s when the problems started. No, not with Damien, he was great. With the mocap. I purchased the file from TurboSquid and tried to convert it from .BIP to .BVH so I could use it in Blender. It didn’t work. It really didn’t work.
The flow was to load the .BIP onto a biped in 3ds Max, then export the animation as .FBX, open it in MotionBuilder, clean the object tree, export it as .BVH, open the rigged model in Blender and load the .BVH onto it. But somewhere in this broken telephone the information was not translated properly, and all I could get on the other side was a cubist, deformed pile of bones. I tried everything. I tried random combinations of export settings, I tried BVHacker, I googled every term imaginable, I read forums with desperate, lonely comments posted in 2008, I waited for the planets to align, I called my mom…
At the same time, Damien and I were clocking insane hours at the office – funnily enough, on another advent calendar project – and there was very little time for anything else. The deadline for our experiment was approaching and we were not ready. We tried to give it a last push on the final day (December 3rd), but it didn’t happen. We missed the deadline. David was sad. We were sad. We ended up going live with a bloody ‘coming soon’ placeholder.
I never really managed to solve the .BIP to .BVH problem. In the end, what worked for me was to create a pose for the biped in 3ds Max, export it as .DAE and import it directly into Blender (skipping MotionBuilder), then rig the character again based on the new pose and adjust the twisted bones one by one, frame by frame. It was laborious, but at least it was getting somewhere.
We ended up going live 16 days later, on the 19th of December. Still crazy busy at work, but trying to progress with the experiment in every spare hour. No more time or energy to go crazy with shaders, unfortunately. I wanted to recreate some visualizations from my 2013 version, like the popping circles and the disco lines – I think they would have looked good in 3D, but they’ll have to wait for next time. What I ended up using was a combination of point lights with Lambert shading and a directional light with a hatching shader.
Christmas Experiments 2016: Billie Deer
To Damien for the partnership, to David and William for being patient with us, to Michael Jackson for the beat, to Elliot Dear for the gif, to Mr.doob and all the amazing people making Three.js, to Ben Houston for tidying up the animation classes, to the ones behind the Blender exporter, to the people that take the time to upload tutorial videos to YouTube, to the guy from TurboSquid that took 72 hours to reply saying that their conversion support doesn’t cover animations, to the people that created a GUI to edit mocap just for Second Life (BVHacker), to the lonely guy that posted a question in 2008 and is still waiting for an answer and to Konami for the code.
Check the experiment here: http://christmasexperiments.com/2016/03/billie-deer