

2D Blend Spaces


I have a big interest in creating tools that are user friendly and generally look good. Tools are something I have worked on during most of my game projects at The Game Assembly. For my specialization I decided to add a core feature, 2D Blend Spaces, to an animation editor tool that I developed for our in-house engine.


Blend Spaces are something that I consider to be among the most important systems for an animator to have access to. I wanted to make one within the animation editor tool, since it gives a powerful visualization on a 2D map of how animations blend based on certain values / parameters.


My goal with this specialization is to re-create the interface Unity has for its animation controller tool. The mockup below (created in Unity) is what I ultimately wanted to be able to make in our own engine.

Animation Blending

To start, we add several animations to our blend space and spread them out on the grid. The red dot visualizes what the position (parameter value) currently is in relation to the rest of the animations (light blue dots).

I used Delaunay triangulation, an algorithm that builds a triangle mesh from a given set of points. To decide which animations to blend between, I pick the triangle in which the parameter position currently resides. Then, depending on the distance between the vertices (animations) and the position (parameter value), I give each animation a weight. The weights determine what percentage of each animation to use from the blend space.
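The per-vertex weights described above can be computed as the barycentric coordinates of the parameter position within the chosen triangle. A minimal sketch of that idea, assuming the actual implementation works this way; the `Vec2` type and function name are hypothetical:

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float x, y; };

// Barycentric weights of point p inside triangle (a, b, c).
// Each weight says how dominant that vertex's animation is;
// inside the triangle the weights are in [0, 1] and sum to 1.
static void BarycentricWeights(Vec2 p, Vec2 a, Vec2 b, Vec2 c,
                               float& wa, float& wb, float& wc)
{
    const float d = (b.y - c.y) * (a.x - c.x) + (c.x - b.x) * (a.y - c.y);
    wa = ((b.y - c.y) * (p.x - c.x) + (c.x - b.x) * (p.y - c.y)) / d;
    wb = ((c.y - a.y) * (p.x - c.x) + (a.x - c.x) * (p.y - c.y)) / d;
    wc = 1.0f - wa - wb;
}
```

Because the weights sum to 1, they can be used directly as blend percentages for the three animations at the triangle's corners.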

To deal with cases where the position is outside of the triangulated space, I simply check which triangle is closest to the position and use that. This results in smooth blending and transitions both between triangles and outside the triangle space.
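One way to realize the closest-triangle fallback described above is to project the parameter position onto the nearest triangle edge and evaluate the weights at that projected point. This is a sketch of that assumption, not necessarily how the actual implementation does it; names are hypothetical:

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float x, y; };

// Closest point to p on the segment from a to b.
// Used to clamp an out-of-bounds parameter position onto the
// nearest edge of the closest triangle before computing weights.
static Vec2 ClosestOnSegment(Vec2 p, Vec2 a, Vec2 b)
{
    const float abx = b.x - a.x, aby = b.y - a.y;
    float t = ((p.x - a.x) * abx + (p.y - a.y) * aby) / (abx * abx + aby * aby);
    t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t); // clamp to the segment
    return { a.x + abx * t, a.y + aby * t };
}
```

Evaluating the weights at the clamped position keeps the blend continuous as the red dot leaves and re-enters the triangulated area.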

To optimize, the triangulation only happens once during loading.


Adding a different node type to the animation state machine system, one that handles a set of animations at once instead of a single animation, required me to deal with blending between states in a different way. I handled this by freezing a pose at the beginning of the transition and blending that pose into the other blend state.

I read about this method in a blog post by Unity [1][2] on how they solved interruption sources, and applied it to this blending; the result can be seen below.
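The freeze-and-blend idea can be sketched roughly like this. Bone transforms are simplified to plain floats here for brevity; a real implementation would lerp translations and scales and slerp rotation quaternions. All names are hypothetical:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Simplified pose: one float per bone stands in for a full bone transform.
struct Pose { std::vector<float> boneValues; };

// On transition start, the current pose is snapshotted (frozen) once.
// Each frame, that frozen pose is blended toward the target blend
// space's pose by t, where t goes from 0 to 1 over the transition time.
static Pose BlendPoses(const Pose& frozen, const Pose& target, float t)
{
    Pose out;
    out.boneValues.resize(frozen.boneValues.size());
    for (std::size_t i = 0; i < frozen.boneValues.size(); ++i)
        out.boneValues[i] =
            frozen.boneValues[i] + (target.boneValues[i] - frozen.boneValues[i]) * t;
    return out;
}
```

Freezing the source pose means the outgoing state no longer needs to keep evaluating its own animations during the transition.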


1 second transition blending between two blend space nodes.

Editor Transitioning

What a transition between blend spaces looks like in the editor.
PS: It's not timed with the image to the left 😅.

The code above is how I blend between a pose from the previous blend space and the current blend space I'm transitioning into.

I start by getting the weight, which determines what percentage of the animation I want to run from that specific point in the blend space. I use the weight to get a bone transform, and then blend that value with the interrupted pose. The blendInterp variable determines how far I have transitioned from the interruption pose into the current blend space.

I take that value and blend it with the layer weight to get the final value of the bone transform.
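The two-stage blend described above could be sketched per bone value like this. The name `blendInterp` comes from the text; the rest are hypothetical, and bone transforms are again simplified to floats:

```cpp
#include <cassert>
#include <cmath>

// Final per-bone value in two steps:
// 1) move from the frozen interruption pose toward the blend space pose
//    by blendInterp (0 = still at the interrupted pose, 1 = fully in
//    the current blend space),
// 2) apply the layer weight against the base pose.
static float FinalBoneValue(float interruptedPose, float blendSpacePose,
                            float blendInterp, float basePose, float layerWeight)
{
    const float transitioned =
        interruptedPose + (blendSpacePose - interruptedPose) * blendInterp;
    return basePose + (transitioned - basePose) * layerWeight;
}
```

With a layer weight of 1 the layer fully replaces the base pose; lower weights fade the transitioned result in on top of it.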

Editor Interface

Since Unity uses Immediate Mode GUI, which is visually very similar to the library Dear ImGui, it wasn't much of an issue to begin re-creating their 2D Blend Space interface.


Using Unity's blend space editor as the complete reference for the editor.


This is the result of my implementation.

As seen above, it's very similar to what Unity has, and that's exactly the purpose: to re-create a familiar work environment, which in turn allows users to instantly recognize the workflow of the system. But this also creates expectations of the tool; since it's so similar to Unity's editor, any missing features quickly become apparent.

Part of this specialization was to create a user-friendly interface to work with. 


Visual display of blend weights, which helps the user see how dominant an animation is in comparison to its neighbours.


The grid allows for zooming in and out, letting the user place animations at any distance, depending on the parameter values coming from the code.
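The zoom behavior amounts to a mapping from parameter space to grid pixels, where scaling the pixels-per-unit factor by the zoom keeps points visible whether parameters range over 2.0 or 2000.0. A minimal sketch with hypothetical names:

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float x, y; };

// Maps a parameter-space position onto the grid in screen pixels.
// pixelsPerUnit sets the base scale; zoom multiplies it, so zooming
// out lets very large parameter values still fit on the visible grid.
// Screen Y grows downward, hence the minus on the y axis.
static Vec2 ParamToScreen(Vec2 param, Vec2 gridCenterPx,
                          float pixelsPerUnit, float zoom)
{
    return { gridCenterPx.x + param.x * pixelsPerUnit * zoom,
             gridCenterPx.y - param.y * pixelsPerUnit * zoom };
}
```

The inverse of this mapping is what would turn a mouse position back into a parameter value when dragging points on the grid.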


I finished the groundwork of the blend space in about 2.5 weeks. This allowed me to implement and test it thoroughly in our current game project, a third person shooter. Since it's a feature heavily requested by the animators whose job it is to set up animation controllers, it also gave me testers for the system to get feedback from. Having 2.5 weeks left of the specialization meant I had time to fix all the bugs and implement missing quality-of-life features.

In hindsight I feel like I may have underscoped this specialization project, since it went more smoothly and finished more quickly than expected.

8 Directional Movement

The biggest gain of this implementation is that it allows us to easily implement eight-directional movement and visualize it, which we're using for our aiming stance.

Aim Offset

Having blend spaces allowed us to easily implement an aim offset, which turns the character toward where the camera is looking.

Look Direction

A very subtle thing our animators added was an additive look direction, which takes the angle of the camera and turns the head toward where you're looking, clamped between values so the player doesn't break his neck.

Future Improvements

As all work went into adding a 2D Blend Space with the same features as Unity's, I didn't give the 1D Blend Space a visual editor tool to work with. If our animators only need 1D for something, they just use 2D with a single axis line to blend on (effectively making it 1D). It's not something that's super important for us to have, but it would be a nice feature nonetheless.

Dev Blog

During the production of this feature, I ended each day with a screenshot of my progress and a summary of the day's work. My documentation does, however, end halfway through the project, as the feature was complete and implemented into our third person shooter by then. I spent the following two weeks testing the feature with our animators, and then the final week working on my website.

Week 1 & 2 (Production)


Began researching Unity's Blend Space system for 2D Simple Directional, and toyed around in their editor to see how everything works in practice.

I created a feature branch from our main branch in Perforce P4V and made a scene in our editor to have as a working hub.


The first step was to add a blend space node to the animation editor, inspired by Unity's Blend Tree node, which has the parameter values visualized on the node itself.

The next day I began working on an inspector view for the blend space node.

I created a grid and drew some circles in different colors, all using ImGui's draw list functionality.

I then used the Delaunay triangulation algorithm to create triangles between the points on the grid.

I added a function to see which triangle the red dot belongs to, and drew a highlight of that triangle to make it easier to see.
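A common way to implement such a triangle lookup is a sign test against each edge; the red dot is inside when it lies on the same side of all three edges. This is a sketch of that standard technique, not necessarily the exact function used here; names are hypothetical:

```cpp
#include <cassert>

struct Vec2 { float x, y; };

// Signed area test: which side of edge (a, b) does p lie on?
static float EdgeSign(Vec2 p, Vec2 a, Vec2 b)
{
    return (p.x - b.x) * (a.y - b.y) - (a.x - b.x) * (p.y - b.y);
}

// p is inside triangle (a, b, c) when the three edge signs
// do not mix positive and negative (zero means on an edge).
static bool PointInTriangle(Vec2 p, Vec2 a, Vec2 b, Vec2 c)
{
    const float d1 = EdgeSign(p, a, b);
    const float d2 = EdgeSign(p, b, c);
    const float d3 = EdgeSign(p, c, a);
    const bool hasNeg = (d1 < 0.0f) || (d2 < 0.0f) || (d3 < 0.0f);
    const bool hasPos = (d1 > 0.0f) || (d2 > 0.0f) || (d3 > 0.0f);
    return !(hasNeg && hasPos);
}
```

Running this test over every triangle in the Delaunay mesh finds the one the parameter position currently resides in, and that triangle can then be highlighted.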

I created a class for the points on the grid, so that they can contain data for animations and weights.

The points on the grid required stuff like position data, animation data and a weight counter.

Since I already had a class created for it, I used it as a base for this.

A feature I wanted was to be able to scale the grid by zooming in/out. This is because we don't know how big the parameter values are going to be until we use them.

Since, for example, a movement parameter could be anything from 2.0 to 2000.0, we still want to be able to see the points on the grid.

I began working on a way to get weights against the vertices (blue dots that contain animation data) of a triangle, depending on where the position of the red dot is (the parameter value).

I now have all the data I need to be able to begin working on the system to blend animations.

The next thing was to try it out. For quick results I started by blending between poses to see if it would even work as intended, and as you can see in the images to the left, it works! 😀

Except for some weird artifact caused by our implementation of the FBX SDK, which stretches some of the vertices (like on the crossbow).

The final step was to blend between animation frames, which, since we already have a system for animations, was not that difficult to add.

After this step, all that's left is to clean up the code, optimize it and test it.

Week 3 & 4 (Testing)

These two weeks were spent entirely on improving the tool. I started by merging the feature branch into our main branch on Perforce P4V. Once that was done, the feature was ready to be used, so I informed the animators on my team about it.

I think the biggest thing about developing a tool is that you get "blinded" by it. As soon as it started being used, the animators had tons of feedback and bugs that I needed to work on, things I couldn't find during production. You're not actually done when you're done.

Some of the things that were missing:

  • Blending between normal states and blend spaces

  • Interruption source functionality

  • Keyframe event callback

  • Functionality to blend between points that don't create a triangle

  • Override and Additive blend spaces

Week 5 (Website)

No more work was required on the feature, so I focused on working on the website instead.

Special Thanks

3D Character model by William Utterberg.

Animations & Rig by Jacob Fridholm.

Engine & Rendering systems by William Arnback.
