Stylized 3D Animation Pipeline (Maya to Blender)

For a project at Stuttgart Media University, we aimed to achieve a 2D look from 3D animation. Here is a quick breakdown of our look development process and the rendering pipeline we built.

Below is a trailer shot of the animated short, so you can get a feel for the style we were aiming for. Currently, it’s still running at festivals, so it hasn’t been published yet.

First Tests

Since we knew we would be using Maya as our main DCC, we conducted our initial tests entirely within Maya, experimenting with flat-shaded 3D models and Arnold's aiToon shader for outlines to achieve a 2D appearance.
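
To give an idea of what such a test setup looks like, here is a minimal sketch in Maya with MtoA's aiToon shader. The mesh name and shader values are illustrative, not taken from our production scenes:

```python
# Minimal aiToon test setup in Maya (requires the MtoA/Arnold plugin).
from maya import cmds

toon = cmds.shadingNode("aiToon", asShader=True, name="toonTest")
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
               name="toonTestSG")
cmds.connectAttr(toon + ".outColor", sg + ".surfaceShader", force=True)

# Flat base color plus a black edge line.
cmds.setAttr(toon + ".base", 1.0)
cmds.setAttr(toon + ".baseColor", 0.9, 0.6, 0.4, type="double3")
cmds.setAttr(toon + ".edgeColor", 0.0, 0.0, 0.0, type="double3")

# Assign to a test mesh (hypothetical name). Note that aiToon edges only
# render when Arnold's contour filter is enabled in the render settings.
cmds.sets("characterHead_GEO", edit=True, forceElement=sg)
```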

We quickly realized the limitations of this approach: we couldn't manually place lines without adding extra geometry, and we couldn't apply textures other than noise to the lines. This meant we couldn't have 100% control over the outlines. Moreover, the render times were significantly longer than anticipated.

Render Test with aiToon Shading

In this render test, the mouth outline wasn't generated, which meant we'd have to create it by adding extra geometry or altering the character's geometry. The eye outlines also look a bit wonky. Fixing these issues would have to happen on a per-shot basis, as outline creation is influenced by the specific animation.

Blender Grease Pencil

After a bit more testing, we eventually decided to switch to Blender for our outline generation and rendering, since it comes with Grease Pencil and quite powerful 2D animation tools. Blender includes a modifier (Line Art) that creates outlines from 3D models, which can then be baked and used as Grease Pencil strokes. These strokes can be altered using the 2D animation tools.

This is a proof of concept for this workflow:

Possibility of altering the generated Grease Pencil Lines

For this proof of concept, we cached all objects from Maya, imported them into Blender, and assigned shaders. We then moved every group of objects that should have its own outlines into a separate collection and created a Grease Pencil object from each collection. The Line Art modifier of the Grease Pencil object was then baked and reprojected onto a 2D plane to ensure that every outline of the same object has the same thickness, no matter how far it is from the camera (one of the principles we defined during preproduction).
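
As a rough illustration, setting up one of these Grease Pencil objects per collection might look like this in bpy (a minimal sketch against the Blender 3.x API; names are illustrative):

```python
import bpy

def lineart_for_collection(collection):
    """Create a Grease Pencil object whose Line Art modifier draws
    outlines for everything in the given collection."""
    name = collection.name + "_lines"
    gp_data = bpy.data.grease_pencils.new(name)
    gp_obj = bpy.data.objects.new(name, gp_data)
    bpy.context.scene.collection.objects.link(gp_obj)

    # Line Art needs a target layer and a Grease Pencil material.
    layer = gp_data.layers.new("Lines")
    mat = bpy.data.materials.new(name + "_mat")
    bpy.data.materials.create_gpencil_data(mat)
    gp_obj.data.materials.append(mat)

    mod = gp_obj.grease_pencil_modifiers.new("LineArt", "GP_LINEART")
    mod.source_type = 'COLLECTION'
    mod.source_collection = collection
    mod.target_layer = layer.info
    mod.target_material = mat
    return gp_obj
```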

Adding Modifiers to Grease Pencil Object to make Lines look less “technical”

To make the outlines look more painterly and human-made right out of the box, we added a stack of modifiers to the Grease Pencil object, defining noise and imperfections in the length of the generated lines. We also added basic brush textures to introduce even more variance in thickness.
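
A simplified version of that stack as a bpy sketch (the modifier types are Blender's Grease Pencil noise and length modifiers; the factor values are illustrative):

```python
import bpy

def add_imperfection_stack(gp_obj):
    """Roughen generated lines with noise and varied stroke length."""
    noise = gp_obj.grease_pencil_modifiers.new("Noise", "GP_NOISE")
    noise.factor = 0.3            # jitter point positions slightly
    noise.factor_thickness = 0.6  # vary thickness along the stroke
    noise.noise_scale = 0.5
    noise.seed = 1

    length = gp_obj.grease_pencil_modifiers.new("Length", "GP_LENGTH")
    length.mode = 'RELATIVE'
    length.start_factor = -0.05   # trim the stroke start a little
    length.end_factor = 0.05      # let the stroke end overshoot a little
```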

Pipeline Integration

After figuring out the workflow, we knew we would have to integrate it into our pipeline to run it reliably at production scale. The core elements are a publisher in Maya that prepares all the data needed to assemble and update the shots in Blender, scene assembly in Blender, Grease Pencil generation and cleanup in Blender, and submission to our render farm so we could use the output in compositing. We also wanted to be able to run all of these steps in a headless Blender, allowing our animators to generate test renderings straight from Maya. For ease of use, animators can submit these test renders through a simple UI during the publish process.
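
Conceptually, the handoff from a Maya publish to headless Blender is just a command-line call; a minimal sketch (the paths and the assembly script are hypothetical):

```python
import subprocess

# Hand the published data over to a headless Blender for assembly
# and rendering.
subprocess.run(
    [
        "blender",
        "--background",                    # run without UI
        "--python", "/pipeline/blender/assemble_shot.py",
        "--",                              # args after this go to our script
        "--context", "/publish/shot010/context.json",
    ],
    check=True,
)
# Inside assemble_shot.py, the script-specific arguments can be read via
# sys.argv[sys.argv.index("--") + 1:].
```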

This video shows the Pipeline we used in the end. Additional details are described below.

Data generation in Maya

In Maya, we used Pyblish as our publishing framework because it's easy to set up and configure. We implemented plugins to cache every animated object as an Alembic cache and to write out all of the texture assignments to a JSON file. Since we had trouble reading Alembic cameras correctly in Blender, and Blender doesn't support animated Alembic visibility yet, we also baked all of the camera values we would need, as well as the visibility frames, into separate JSON files. Finally, we added a context JSON file describing the shot context, so it could be set automatically during scene assembly in Blender.
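
For illustration, a stripped-down extractor in the spirit of these plugins might look like this (a sketch only: the data keys and family name are assumptions, and error handling is omitted):

```python
import json
import pyblish.api
from maya import cmds

class ExtractAnimationCaches(pyblish.api.InstancePlugin):
    """Cache animated objects as Alembic and record shading assignments."""
    order = pyblish.api.ExtractorOrder
    label = "Extract Alembic + Shading JSON"
    families = ["animation"]

    def process(self, instance):
        start = cmds.playbackOptions(query=True, animationStartTime=True)
        end = cmds.playbackOptions(query=True, animationEndTime=True)

        # Cache every member of the instance as Alembic
        # (assumes the members are full DAG paths).
        roots = " ".join("-root %s" % node for node in instance)
        cmds.AbcExport(j="-frameRange %s %s %s -file %s" % (
            start, end, roots, instance.data["abcPath"]))

        # Record which shading engine each shape uses.
        # "abcPath"/"jsonPath" are hypothetical keys set earlier on publish.
        assignments = {
            shape: cmds.listConnections(shape, type="shadingEngine") or []
            for shape in cmds.ls(list(instance), shapes=True, long=True) or []
        }
        with open(instance.data["jsonPath"], "w") as f:
            json.dump(assignments, f, indent=2)
```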

Scene Assembly in Blender

To set up our scenes in Blender from the data gathered in Maya, we implemented a pipeline built mostly on Blender's Python API (bpy). Our Shot Assembly loaded all animated objects into separate collections, or referenced a separate config to determine their correct collection. This way, Shot Assembly could automatically create a Grease Pencil object for every collection and apply a simple stack of modifiers to it. It also enabled the setup of render layers/view layers, which are mostly based on collections in Blender. Each object received a simple cel shader with either the image texture or the base color (collected from Maya) applied.

Because of the Alembic camera issues mentioned above, our scene assembly created a new camera and applied the baked data as keyframes. To allow more freedom in compositing, overscan was added to the camera. As we didn't have many shots in our production, we decided not to port our Qt interfaces, created for other DCCs, to Blender, and instead used simple menus.
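
The camera rebuild, for example, boils down to applying the baked values as keyframes; a minimal sketch (Blender 3.x, with an assumed JSON layout):

```python
import json
import bpy

def build_camera(json_path):
    """Rebuild the shot camera from camera values baked out in Maya.
    The JSON layout used here is an assumption for illustration, and the
    baked values are assumed to be converted to Blender's axis convention."""
    with open(json_path) as f:
        baked = json.load(f)

    cam_data = bpy.data.cameras.new("shotCam")
    cam_obj = bpy.data.objects.new("shotCam", cam_data)
    bpy.context.scene.collection.objects.link(cam_obj)

    for frame, values in baked["frames"].items():
        frame = int(frame)
        cam_obj.location = values["translate"]
        cam_obj.rotation_euler = values["rotate"]
        cam_data.lens = values["focal_length"]
        cam_obj.keyframe_insert("location", frame=frame)
        cam_obj.keyframe_insert("rotation_euler", frame=frame)
        cam_data.keyframe_insert("lens", frame=frame)

    # Overscan can be faked by proportionally widening the sensor and the
    # render resolution; omitted here for brevity.
    bpy.context.scene.camera = cam_obj
    return cam_obj
```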

Shot Assembly Menu

Grease Pencil in Blender

Once the Blender scene was set up, a few more steps were needed to get closer to our reference and to give artists full control over the generated outlines. As a first step, we implemented an operator that baked every Grease Pencil object at render quality: using a context manager, it raised every Subdivision Surface modifier applied during Shot Assembly to its render resolution for the bake, then lowered it back to its original setting afterwards to keep the viewport fast. Since Grease Pencil objects consist of points, we can manipulate those points in our scripts. Blender already ships with an operator for reprojecting Grease Pencil onto a 2D plane. However, we wanted this reprojection for the full time range, and running the operator consecutively for every frame via the Python API proved to be quite slow, so we decided to rebuild Blender (more on that below) and implement the full-range reprojection in the C++ source code.
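
The context-manager part of that operator could look roughly like this (a sketch; the bake call at the end is Blender 3.x's Line Art bake operator):

```python
import bpy
from contextlib import contextmanager

@contextmanager
def render_quality_subdivisions():
    """Raise all Subdivision Surface modifiers to their render level,
    then restore the original viewport levels afterwards."""
    originals = []
    for obj in bpy.data.objects:
        for mod in obj.modifiers:
            if mod.type == 'SUBSURF':
                originals.append((mod, mod.levels))
                mod.levels = mod.render_levels
    try:
        yield
    finally:
        for mod, levels in originals:
            mod.levels = levels

# Bake every Line Art modifier in the scene at render quality.
with render_quality_subdivisions():
    bpy.ops.object.lineart_bake_strokes_all()
```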

Now that every Grease Pencil object lives on a 2D plane, we are almost ready for cleanup. However, since Blender adds a keyframe on every frame during baking, our cleanup artists would have had to deal with a lot of unnecessary data. Therefore, we needed one more operator: it checked on which frames the meshes were actually moving and deleted all Grease Pencil keys that weren't necessary.
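
The core of that check can be sketched as comparing evaluated vertex positions between frames (the tolerance and structure are illustrative, not our exact implementation):

```python
import bpy

def snapshot(objects, depsgraph):
    """Evaluated vertex positions of all meshes, flattened into one list."""
    points = []
    for obj in objects:
        eval_obj = obj.evaluated_get(depsgraph)
        mesh = eval_obj.to_mesh()
        points.extend(v.co.copy() for v in mesh.vertices)
        eval_obj.to_mesh_clear()
    return points

def moving_frames(objects, frame_start, frame_end, tolerance=1e-5):
    """Frames on which any vertex of the given meshes actually moved."""
    scene = bpy.context.scene
    depsgraph = bpy.context.evaluated_depsgraph_get()
    moving = {frame_start}  # always keep the first frame as a base key
    previous = None
    for frame in range(frame_start, frame_end + 1):
        scene.frame_set(frame)
        current = snapshot(objects, depsgraph)
        if previous is not None and any(
                (a - b).length > tolerance for a, b in zip(previous, current)):
            moving.add(frame)
        previous = current
    return moving

def prune_gp_keys(gp_obj, keep_frames):
    """Delete baked Grease Pencil keyframes outside keep_frames."""
    for layer in gp_obj.data.layers:
        for frame in list(layer.frames):
            if frame.frame_number not in keep_frames:
                layer.frames.remove(frame)
```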

Other than that, we only had a few more convenience operators to add other Grease Pencil materials/strokes.

The entire process described above could also be run in batch mode, allowing the cleanup artists to simply open the prepared scene that was automatically generated upon Maya publish.

Assembled Scene in Blender

Notes on using Blender in a production Pipeline

As this was our first time using Blender in a production pipeline, we encountered a few challenges along the way. One of them was that Blender only allows node names fewer than 64 characters long, which isn't a problem with good naming and namespace conventions. But since Maya and other DCCs don't have this limitation, and we didn't know from the start of the project that we would use Blender, we didn't enforce such a limit in the pipeline from the beginning. This came as a very unpleasant surprise, as a 1:1 mapping from every Maya object to the imported objects in Blender was nearly impossible. We considered hashing all node names in Maya and automatically renaming the nodes on publish, but after realizing that we were using Blender as more than just a headless render engine, there wasn't really a way around rebuilding Blender with longer node name support (changing MAX_ID_NAME and its references in the source code).

Longer node name support, yay (symbolic image)

Moreover, Blender breaks the convention of almost every other DCC by not supporting Qt for interfaces, so we couldn’t simply reuse all of our interfaces. That’s why we decided to keep the entire UI in Blender within menus and logging statements.

We discovered a few more challenges along the way, such as numerous API changes (which make it difficult to find working examples in some documentation) and Blender's incomplete Alembic support (it doesn't read animated Alembic visibility, which we used in rigs). Overall, though, it was a lot of fun to start scripting in Blender!

Pipeline Overview

Overview of the full pipeline we used. Arrows usually denote automated processes.

Outlook

The pipeline we created still has several flaws that could be improved upon in future productions. One major area for improvement is handling animation updates. Currently, there's no way to update the animation and integrate it into the already assembled Blender scene, especially after cleanup artists have started altering Grease Pencil lines. A possible implementation could reuse our checker for mesh changes between frames and compare the updated animation against the currently imported one. This way, we would only have to rebake and reproject the frames where the animation actually changed, preserving all the cleanup work for the rest of the time range.
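
Sketching that idea on top of the vertex-snapshot helper from the cleanup operator above (hypothetical and untested against a real update workflow):

```python
import bpy

def changed_frames(old_objs, new_objs, frame_start, frame_end, tol=1e-5):
    """Frames where the updated animation differs from the imported one,
    i.e. the only frames that would need rebaking and reprojection."""
    scene = bpy.context.scene
    depsgraph = bpy.context.evaluated_depsgraph_get()
    changed = set()
    for frame in range(frame_start, frame_end + 1):
        scene.frame_set(frame)
        old = snapshot(old_objs, depsgraph)  # see the cleanup sketch above
        new = snapshot(new_objs, depsgraph)
        if len(old) != len(new) or any(
                (a - b).length > tol for a, b in zip(old, new)):
            changed.add(frame)
    return changed
```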

There are probably many more cool ideas. If you have any, please let me know.

Credits/Thanks

At this point I want to thank everyone involved in the production of Frogging Hell, especially the main crew credited here:

Crew Credits:

  • Ann-Kathrin Burhop: Modeling, Animation, Production
  • Philipp Hofmann: Rigging, Look Development, Animation, Modeling
  • Chiara Hoffmann-Zeller: Lead Animator, Editing, Modeling
  • Louis Kuschnir: Sound, Animation, 2D Artist, Modeling
  • Robin Lemke: Compositing, Pipeline TD, Look Development, Modeling
  • Louis Reichardt: Director, Animation, 2D Artist, Concept Art, 2D Animatic, Modeling
  • Julian Schmoll: Rigging TD, Pipeline TD, Look Development TD
  • Nathalie Winkler: Lead 2D Artist, Lead Texturing Artist, Lead 2D Effects Artist, Concept Art
  • Magdalena Zygadlo: Animation, Concept Art, 2D Animatic, 2D Artist, 2D Effects, Modeling

Useful Links

Some of the code that builds our pipeline can be found here (warning: some of it contains a lot of spaghetti code):
