Something has happened over the course of my life to an art form that used to specialize in, you know, fun. On one hand, big network cartoon shows aimed at kids have been scrubbed politically correct; on the other, those targeting an older audience are still in the business of challenging polite society, but they rely on snarky snideness as their stock-in-trade. Sure, Bugs and Daffy were snarky and snide, but only during respites between manic slappy schtick. In today’s version I’ve been asked to consider condiments as a main course in the name of being edgy. Consider that for the millions of people they entertained, Wile E. Coyote and the Road Runner never made a sound past meep-meep!!! (Emphasis theirs.)
I remember a time before “Photoshop” was a verb and CGI stood for Corrugated Galvanized Iron (OK, I don’t remember that, but I remember the time). In that era, surrealist exaggeration and a world unbound by the laws of physics were the purview of cartooning (and, er, *cough* acid trips *cough*).
In one of my research projects at Disney (one of several that went nowhere), I was looking at ways to heighten the emotional impact of gaming, which itself relies heavily on animation. The core findings of that research are a topic for another discussion, but as I reminisced about what made cartoons so enjoyable to me as a kid, I came to blame another chief culprit: 3D animation.
One of the key ingredients of a fun, wacky, frenetic cartoon is exactly that physics-busting exaggeration. The pinnacle of this can be found in Tex Avery’s [wikipedia] work, especially in the early ’50s at MGM. Some of the key drawing techniques he used were stretching, blurring and, er, dismemberment.
It turns out, all of these are “hard” using 3D animation tools. I put hard in quotes because we all know that 3D art is driven by computers, and it’s not as if a computer would find these three things any more “difficult” than producing the CGI behind Life of Pi — if it were told to do so by the geeks. The problem is that the short but concentrated history of 3D animation tools is a series of tiny, evolutionary steps, each rational enough on its own, but which taken as a whole would require undoing millions of hours of work and even more lines of code.
It has taken three decades since the first days of John Lasseter and crew’s work on bumblebees and desk lamps to settle on a system for describing animated versions of real-world objects: meshes of tiny triangles connected together to form each shape in the scene. This mechanism has been hardened (often out of the necessity to maximize computing resources) to depend on three key properties: 1) the mesh keeps its shape by never independently moving or resizing the triangles, 2) the triangles stay connected by sharing edges with their neighbors, and 3) coloring the mesh (a.k.a. “texturing”) happens on the surface of each triangle and therefore never beyond the bounds of the mesh.
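To make those three properties concrete, here is a minimal sketch of an indexed triangle mesh. The names and layout are mine, purely illustrative, not taken from any particular tool’s file format or API:

```python
# Minimal sketch of an indexed triangle mesh: one shared vertex list,
# triangles stored as index triples into it. Illustrative names only.
vertices = [
    (0.0, 0.0, 0.0),   # v0
    (1.0, 0.0, 0.0),   # v1
    (0.0, 1.0, 0.0),   # v2
    (1.0, 1.0, 0.0),   # v3
]
# Two triangles sharing the edge v1-v2 -- property (2): adjacency comes
# from shared indices, so the surface cannot tear itself apart.
triangles = [(0, 1, 2), (1, 3, 2)]

# Property (1): moving a shared vertex deforms every triangle that
# references it -- individual triangles never move independently.
def move_vertex(verts, i, delta):
    x, y, z = verts[i]
    dx, dy, dz = delta
    verts[i] = (x + dx, y + dy, z + dz)

move_vertex(vertices, 1, (0.0, 0.0, 0.5))  # both triangles now bend

# Property (3): texturing assigns UV coordinates per vertex, so color
# exists only on the triangle surfaces, never outside the mesh.
uvs = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
```

Note how the representation itself enforces the constraints: there is simply no place in this data structure to express a triangle drifting away from its neighbors or paint landing off the surface.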
Behind the scenes, 3D artists may create the body parts of a virtual dog separately in their tools, but they will combine them into one mesh, never to be torn asunder again, before delivery to the texturing team, who don’t even have the means to draw outside the boundaries of the mesh. The actual animation is authored by creating pivoting virtual guides that are mapped to various points of the mesh so that different components of the dog can move and rotate in relation to the pivot. In the parlance of the software tools: bones are connected via joints to form a skeleton, which is then skinned onto the mesh in a process called rigging. In most 3D animations the mesh never changes shape; when the paw lifts and the tail wags, it’s just that large sections of the tiny triangles are changing position relative to adjacent triangles.
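One common way that bone motion gets mapped onto the mesh is linear blend skinning: each vertex carries weights for the bones that influence it, and its posed position is the weighted blend of where each bone would carry it. A toy 2D version, with entirely illustrative names, might look like:

```python
import math

# Toy linear blend skinning in 2D: each vertex is influenced by weighted
# bones; when a bone rotates about its pivot, every vertex it skins
# swings with it. Names are illustrative, not any tool's API.

def rotate_about(point, pivot, angle):
    """Rotate a 2D point around a pivot by `angle` radians."""
    px, py = pivot
    x, y = point[0] - px, point[1] - py
    c, s = math.cos(angle), math.sin(angle)
    return (px + c * x - s * y, py + s * x + c * y)

def skin(vertices, weights, bone_pivots, bone_angles):
    """Blend each vertex's position under every bone that influences it."""
    out = []
    for v, w in zip(vertices, weights):
        x = y = 0.0
        for bone, wt in w.items():
            rx, ry = rotate_about(v, bone_pivots[bone], bone_angles[bone])
            x += wt * rx
            y += wt * ry
        out.append((x, y))
    return out

# A two-bone "tail": root bone pivoting at the origin, tip bone at (1, 0).
pivots = {"root": (0.0, 0.0), "tip": (1.0, 0.0)}
verts = [(0.5, 0.0), (1.5, 0.0)]
# First vertex follows the root bone only; second follows the tip only.
wts = [{"root": 1.0}, {"tip": 1.0}]

# Wag the tip bone 90 degrees: only the outer vertex swings.
posed = skin(verts, wts, pivots, {"root": 0.0, "tip": math.pi / 2})
```

Notice that the vertices only translate and rotate as rigid groups around the joints; nothing in this machinery stretches, smears or detaches them, which is exactly the complaint.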
All three of these hardened attributes run 100% counter to what Avery and Chuck Jones depended on in almost every frame of their cartoons for emotional impact.
But that’s not how all this started. Lasseter was an animator at Disney, an artist, who approached the whole thing from a classic animator’s point of view. (In case you need reminding, he was fired from his job at Disney for being too enthusiastic about and distracted by the prospect of computer-aided animation. That shortsightedness cost Disney a little bit when they ended up buying Lasseter’s Pixar decades later for $7.4 billion.) From his very first 1:50 experimental short funded by George Lucas, the ‘bumblebee’ link above, he insisted that the software allow him to stretch, blur and leave holes.
Consider the iconic Luxo Jr., the ‘desk lamp’ link above, where, sure, the desk lamps are rigid forms that never change shape, but the ball is squishy: you can bounce on it, and it eventually (spoiler:) deflates.
What’s even crazier is that 3D authoring tools in the wild all have rigging features that default to cartoony stretching of the mesh when you move the bones around. This Maya tutorial, uploaded by a high school kid, shows how easy it is to rig a mesh of a cat that stretches with Tom (or was it Jerry?) elasticity. Without any other context, I was a little heartbroken that one of the top comments on the video was “is there a way so you can lock the mesh so it doesn’t stretch?”
Using the open source tool Blender (which is a glorious kind of mayhem itself), I followed a tutorial to create a sphere rigged for stretching. Then, in a few minutes, I created a short, cute animation of it trying to get airborne (the rigging would be hidden in a final version).
The rigging information in this case can be shared using a widely used format called COLLADA, and a few playback environments will “animate” the mesh, meaning they will move parts of the mesh in relation to the joints, but hardly any playback tool will stretch the mesh in the way illustrated above. So my example ends up looking like a spherical marble, never changing shape and therefore losing all expressiveness.
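The frustrating part is that what a stretch-aware player would have to do isn’t exotic. The classic squash-and-stretch deformation is just a non-uniform scale that compensates on the other axes to preserve apparent volume. A hedged sketch of the idea, with illustrative names rather than any player’s actual API:

```python
# Sketch of the squash-and-stretch a playback tool would need: scale the
# mesh along one axis and shrink the other two so apparent volume is
# preserved -- the cartoon deformation that rigid-joint playback skips.
def squash_stretch(vertices, stretch):
    """Stretch vertically by `stretch`, squashing x and z to keep volume."""
    side = 1.0 / (stretch ** 0.5)  # so side * side * stretch == 1
    return [(x * side, y * stretch, z * side) for x, y, z in vertices]

# A unit "ball" sampled at a few points, stretched to twice its height
# mid-leap: it narrows as it elongates instead of staying a rigid marble.
ball = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
leaping = squash_stretch(ball, 2.0)
```

A few lines of per-vertex math, applied at playback time, and the marble becomes a character again; the obstacle is the pipeline, not the computation.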
One of the fancier features of rigging software is applying kinematics to a skeleton to prevent “un-natural” positions. Those un-natural positions used to be the whole point of cartooning. Animators, especially those in gaming, have been going in the opposite direction of old-school cartoonists: desperately seeking “photo-realism” coupled with imitating real-world physics. I can’t help thinking this is a particularly geeky, non-artful enterprise, since it actually puts into practice the years of schooling in math and science that would be denied by almost any other programming job. (The vast majority of programming jobs are “configuration” oriented, like sorting airline prices by lowest fare and embedding up-sell widgets into iPhone games, as opposed to the more glamorous “science-y” work of calculating virtual spatter patterns.)
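To see how a kinematic constraint enforces “naturalness,” consider a toy two-bone inverse-kinematics solve: the elbow angle needed to reach a target falls out of the law of cosines, and a joint limit then clamps it so the arm can never hyperextend or fold through itself. This is a simplified sketch of the general technique, not any rigging package’s implementation:

```python
import math

# Toy 2-bone inverse kinematics in 2D with a joint limit: given a target
# distance, solve the elbow's interior bend angle by the law of cosines,
# then clamp it inside "natural" limits. Purely illustrative.
def solve_elbow(upper, lower, target_dist, min_bend=math.radians(5)):
    """Return the elbow's interior angle for reaching `target_dist`.

    upper/lower are the two bone lengths; the angle runs from 0
    (fully folded) to pi (fully straight).
    """
    cos_elbow = (upper**2 + lower**2 - target_dist**2) / (2 * upper * lower)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # clamp unreachable targets
    angle = math.acos(cos_elbow)
    # The "realism" constraint: the elbow may never lock fully straight
    # nor fold completely back on itself -- exactly the exaggerated poses
    # a cartoon arm lives on.
    return max(min_bend, min(math.pi - min_bend, angle))

# Ask a two-unit arm to reach a target at full extension: the limit
# stops it just short of straight, no matter what the animator wants.
bend = solve_elbow(1.0, 1.0, 2.0)
```

The clamp is doing the “photo-realism” work here; delete it (or widen the limits past physical plausibility) and the same solver happily produces the rubber-hose poses the old cartoonists drew by hand.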
And of course you can technically “draw outside the lines” of a mesh – a “fur” texture is a common example, where hair-like fibers seem to extend past the mesh. But the fact is, in order to do true free-style, Tex Avery-style 3D drawing, dismemberment and blurring included, you will be fighting the tools at every step.
It’s really too bad, because it’s not hard to imagine how much more fun could be added to gaming, which relies so heavily on exactly these tools.