Back in 2013 I worked on Artism, a project that mixed dance with technology. At some point during the live show, the audience would see a dancer moving on the stage and, behind him, a projection of a 3D character performing the exact same moves. The character was morphing between a man and an ape.
At the time I was working on visuals for another part of the show, but I had access to the motion capture file and kept a copy to do something with it one day.
The day arrived when I was invited to submit an experiment for DEVX, part of Digital Design Days 2018 in Milan.
The theme was ‘the monolith’ and the first thing that came to my mind was this scene from 2001: A Space Odyssey with all the apes jumping around a monolith.
Monolith, ape, man, dance. There was something there.
The original mocap was an FBX file and contained only data about the bones; there was no skin, so it couldn't be used directly. It took me a while to figure out the best way to work with the data. A few months earlier I had been looking into ways of optimising Three.js' own JSON format and had even published a small package on npm for it, but I ended up not using it. I used glTF instead.
When I started, I didn't know how the model was going to be rendered. This is what I like about experiments: they don't follow a set path, they just flow in the direction that suits them best as they go. In this case, the coloured lines and ribbons in the final result appeared after several iterations and a lot of other unsuccessful ideas.
On the technical side, I think there are two interesting things to mention: one is how to find the positions of vertices influenced by bones, and the other is how to sort the vertices so that the line segments look pretty.
Transformed skin vertices
Reading the positions of static vertices is straightforward; bones and skin weights, however, were a different story. There are quite a few matrix operations involved. After hitting my head against the wall for a few days, I was saved by the right search term on Google, which led me to this: https://stackoverflow.com/questions/31620194/how-to-calculate-transformed-skin-vertices. Thank you makc3d for unblocking the rest of this experiment.
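In essence, the idea is linear blend skinning done on the CPU: each vertex is transformed by up to four bone matrices, and the results are blended by the skin weights. Below is a minimal, dependency-free sketch of that blend (the function names are mine; in Three.js the per-bone matrices correspond to what the library stores in skeleton.boneMatrices, you would additionally apply the mesh's bindMatrix and bindMatrixInverse, and recent versions expose the whole computation as SkinnedMesh.applyBoneTransform):

```javascript
// Apply a 4x4 column-major matrix (Three.js layout) to a point, with w = 1.
function applyMatrix4(m, [x, y, z]) {
  return [
    m[0] * x + m[4] * y + m[8] * z + m[12],
    m[1] * x + m[5] * y + m[9] * z + m[13],
    m[2] * x + m[6] * y + m[10] * z + m[14],
  ];
}

// Linear blend skinning for a single vertex:
//   skinned = sum_i( weight[i] * boneMatrix[skinIndex[i]] * restPosition )
// where each boneMatrix is boneWorldMatrix * inverseBindMatrix, i.e. the
// kind of per-bone matrix Three.js keeps in skeleton.boneMatrices.
function skinVertex(restPosition, skinIndex, skinWeight, boneMatrices) {
  const out = [0, 0, 0];
  for (let i = 0; i < 4; i++) {
    const w = skinWeight[i];
    if (w === 0) continue; // unused influence slot
    const p = applyMatrix4(boneMatrices[skinIndex[i]], restPosition);
    out[0] += w * p[0];
    out[1] += w * p[1];
    out[2] += w * p[2];
  }
  return out;
}
```

Running something like this for every vertex on every frame yields the deformed positions that the lines and ribbons can then be drawn from.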
Sorted body parts
Once a mesh is defined, it is easy to switch from drawing triangles to drawing lines or line segments. The challenge is to make the lines look good. Segments are drawn between pairs of vertices, so if in our model vertex 0 belongs to the right foot and vertex 1 belongs to the head, a straight line is drawn right across the model. If all the vertices are paired like that, we end up with a convex shape saturated with lines and the body becomes indistinguishable.
One way to improve that is to sort the vertices by their distance from each other in the first frame of the animation. It helps, but it is not enough. The best approach is to create a correspondence between each vertex and the body part it belongs to. Luckily, we can read body parts from the skeleton, and we can check which bone influences which vertex using skinIndex: for a given vertex on a skinned mesh, take the index associated with the strongest skin weight, then look up the bone name for that index. The result is, for example, that vertex 2714 belongs to the pelvis.
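That lookup can be sketched as a small helper (the names are mine; in Three.js the two flat arrays would come from geometry.attributes.skinIndex.array and skinWeight.array, and the bone names from skeleton.bones[i].name):

```javascript
// Group vertex indices by the bone that influences them the most.
// skinIndices and skinWeights are flat arrays with 4 entries per vertex;
// boneNames maps a bone index to its name.
function groupVerticesByBone(skinIndices, skinWeights, boneNames) {
  const groups = {};
  const vertexCount = skinIndices.length / 4;
  for (let v = 0; v < vertexCount; v++) {
    // Find which of the 4 influence slots carries the strongest weight.
    let best = 0;
    for (let i = 1; i < 4; i++) {
      if (skinWeights[v * 4 + i] > skinWeights[v * 4 + best]) best = i;
    }
    const bone = boneNames[skinIndices[v * 4 + best]];
    (groups[bone] = groups[bone] || []).push(v);
  }
  return groups;
}
```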
Now vertices can be grouped by body part and lines can be drawn only within each group. If this technique were applied on its own, this is how it would look:
And once again, I've added the Konami code.
Go ahead and try it. Open the experiment and press:
up up down down left right left right b a
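A key-sequence detector like this can be written in a few lines. This is a generic sketch, not the experiment's actual code; in a browser you would feed it e.key from keydown events:

```javascript
// The sequence from the article: up up down down left right left right b a.
const KONAMI = ['ArrowUp', 'ArrowUp', 'ArrowDown', 'ArrowDown',
                'ArrowLeft', 'ArrowRight', 'ArrowLeft', 'ArrowRight', 'b', 'a'];

// Returns a function that consumes one key name at a time and
// returns true exactly when the whole sequence has just been typed.
function createKonamiDetector(sequence = KONAMI) {
  let progress = 0;
  return function onKey(key) {
    if (key === sequence[progress]) {
      progress += 1;
    } else {
      // On a mismatch, restart (the key might begin a new attempt).
      progress = key === sequence[0] ? 1 : 0;
    }
    if (progress === sequence.length) {
      progress = 0;
      return true;
    }
    return false;
  };
}

// Browser wiring would look roughly like:
// const onKey = createKonamiDetector();
// window.addEventListener('keydown', (e) => { if (onKey(e.key)) toggle(); });
```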
Where is the monolith?
It’s implicit =)
Check the experiment here: http://devx.ddd.it/en/experiment/1/