
Will Anderson interfacing Blender 3D with Open Sound Control

I’m Will Anderson, a Scottish animator & filmmaker. I’ve been making character-driven animated films professionally for a solid 10 years now. These include BAFTA-winning shorts ‘The Making of Longbird’, ‘Have Heart’, and more recently ‘Betty’. I also collaborate with stop-frame animator/filmmaker Ainslie Henderson, making lots of different work for TV, web, game & theatre projects.

Will Anderson, a Scottish animator & filmmaker

That said, I’m not here to bang on about all that; I’m here to talk about Blender, my recent fascination with it, and some of the things I’m finding exciting about interfacing it with my work. Hopefully it comes in useful for artists who are new to it and are looking for a little intro to some of the exciting things it can do.

First things first… I’m not pretending I’m a pro with Blender (I’m actually a total newb), only really jumping in properly in mid-2020, at the height of a Scottish Covid-19 lockdown. A lot of skills are transferable between applications, so having experience with rigging characters and how all that works was definitely a plus, but it shouldn’t intimidate you. Before Blender, I was almost exclusively using 2D animation software called Moho Pro. It’s fantastic, intuitive rigging and animation software that suits my aesthetic of pretty, graphical 2D work perfectly. But we don’t learn by sitting tight and not looking at other options, do we?! Bear in mind I’m coming from 2D here, so everything I outline will be in relation to 2D-looking design. I like to say that I used to make work that’s 2D but looks like 3D; now I say I make 3D work that looks 2D 😉

Moho controller rigs from short film ‘Betty’

Rigging is the secret

Being an animator, and a character animator most of the time, I know all too well that having an efficient rig is totally essential to getting the best performance out of something… and to keeping your sanity, as this is where you will spend most of your time, among the thousands of keyframes, scrubbing back and forth on timelines. Rigging really is the secret here. When I started Blender, I focused solidly on this element, more specifically on rigging a Grease Pencil object. Grease Pencil is the new 2D-looking side of Blender that I highly recommend you explore, as it is very exciting indeed.

I’m not here to get into the specifics of rigging, but I found LevelPixelLevel’s rigging tutorials very useful at this stage: he lucidly and calmly explains the principles of rigging in Blender in simple steps, throwing in total chunks of gold in terms of hints and tricks to make rigging work more efficiently. He also has a whole series specifically on rigging Grease Pencil objects.


Blender rig using Time Offset Modifier

A big part of 2D character work is having control over graphic symbols that swap out (things like mouth shapes, eye blinks, and expressions). You’ll often hear about ‘rigging a head turn’ in 2D… this principle is often used in After Effects, most commonly with the ‘Joysticks ’n Sliders’ plugin. With Moho, it’s easily achieved using Smart Bones. So how do we do this in Blender?

How to rig a head turn in Blender

The Action constraint really helps us out here… Once we’ve attached all our elements to bones on the face, we make four actions for the Left, Right, Up and Down extreme positions, and give each one an Action constraint targeting a controller bone of our choice. Then the controller bone (I’ve positioned a big circle in front of the face for this) gives you an intuitive way to pose your character’s face.

Blender face rig GIF
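If you like to script this sort of thing, here’s a rough sketch of that Action constraint setup using Blender’s Python API. Every name in it (‘FaceRig’, ‘face_ctrl’, the four ‘Face’ actions, ‘head’, ‘jaw’) is a placeholder for whatever your own rig uses, and the ranges will depend on how your controller bone is oriented:

```python
import bpy

# A sketch of wiring Action constraints to one controller bone.
# All names ("FaceRig", "face_ctrl", "Face_L"...) are placeholders.
rig = bpy.data.objects["FaceRig"]

def add_face_action(bone_name, action_name, channel, lo, hi):
    """Play an action on a bone as the controller bone moves."""
    con = rig.pose.bones[bone_name].constraints.new('ACTION')
    con.target = rig                 # controller lives on the same armature
    con.subtarget = "face_ctrl"      # the big circle in front of the face
    con.action = bpy.data.actions[action_name]
    con.transform_channel = channel  # which controller channel drives it
    con.target_space = 'LOCAL'
    con.min, con.max = lo, hi        # controller travel mapped onto...
    con.frame_start, con.frame_end = 1, 10  # ...this slice of the action

# The four extremes: left/right on X, up/down on Y. Signs and ranges
# depend on how the controller bone is oriented in your rig.
for bone in ("head", "jaw"):         # whichever bones the actions animate
    add_face_action(bone, "Face_R", 'LOCATION_X', 0.0, 0.5)
    add_face_action(bone, "Face_L", 'LOCATION_X', -0.5, 0.0)
    add_face_action(bone, "Face_U", 'LOCATION_Y', 0.0, 0.5)
    add_face_action(bone, "Face_D", 'LOCATION_Y', -0.5, 0.0)
```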

Speeding up animation production in Blender

I’m really busy most of the time, and I’ve made A LOT of animation in my time… so I wanted to find out how to speed up the process of animation production. I’ve got a controller now, so how can I move it around more easily? This led me to think about MIDI hardware and OSC (Open Sound Control) applications (editor’s note: TouchOSC*). Reading into this a little, I found some great tutorials by Jimmy Gunawan, where he talks in detail about using face-tracking software and an addon now called ‘AddRoutes’. In short, the addon lets you easily pipe position data from that software into Blender.

* TouchOSC is a modular control surface toolkit for designing and constructing custom controllers that can be used on a multitude of operating systems and devices. TouchOSC can be used on touch-screen mobile devices as well as desktop operating systems using traditional input methods. TouchOSC can communicate with other software and hardware using the MIDI and Open Sound Control protocols in a variety of ways and via many different types of wired and wireless connections simultaneously.

GIF of FaceOSC controls in Blender

This is only scratching the surface, and it’s pretty simple when you break it down… Remember, this is a 2D rig, so I’m only really targeting that main controller bone; the rest is already set up in the facial rig I outlined.
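For a sense of what’s actually flowing over the wire: an OSC message is just an address plus some values. Here’s a hypothetical sender using the python-osc package, standing in for TouchOSC or a face tracker; the address ‘/face/pos’ and port 9001 are placeholders for whatever route you set up in AddRoutes:

```python
# A hypothetical sender standing in for TouchOSC or a face tracker.
# Requires python-osc (pip install python-osc); the address "/face/pos"
# and port 9001 are placeholders for whatever route AddRoutes listens on.
import math
import time

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9001)  # machine running Blender

# Sweep the face controller in a small circle, ~30 messages a second.
for i in range(300):
    t = i / 30.0
    x = 0.3 * math.cos(t)  # left/right
    y = 0.3 * math.sin(t)  # up/down
    client.send_message("/face/pos", [x, y])
    time.sleep(1 / 30)
```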

Using modular OSC/MIDI applications

This is fine, but to be honest I need something with a bit more control, where I can have multiple controls and even use multiple devices to try out some kind of live puppetry experiments. After all, animators are really puppeteers. This is where modular OSC/MIDI applications come in handy. You can easily set up all the different types of controls you want, customise the layouts, and assign them to different parts of the rig. I just assigned them to null objects (empties, in Blender terms), then simply attached/parented my rig’s controllers to these nulls. I used TouchOSC because I like it, but there are free options around. Pretty simple, huh?

TouchOSC app on iPad using AddRoutes addon
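Here’s a rough sketch of that null/empty setup in Blender’s Python API. The actual OSC-to-empty mapping is configured in AddRoutes itself; this only builds the empties and hooks the rig’s controller bones to them, and all the names are placeholders:

```python
import bpy

# Sketch: one empty per TouchOSC control. AddRoutes (or any OSC bridge)
# writes the incoming values onto each empty's location; the rig's
# controller bones then just copy it. All names are placeholders.
rig = bpy.data.objects["FaceRig"]

for part in ("face", "jaw", "eyes"):
    empty = bpy.data.objects.new(f"osc_{part}", None)  # None makes an empty
    bpy.context.scene.collection.objects.link(empty)

    ctrl = rig.pose.bones[f"{part}_ctrl"]
    con = ctrl.constraints.new('COPY_LOCATION')
    con.target = empty  # the bone now follows the OSC-driven empty
```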

As I said earlier, most of this really comes back to the rigging. An efficient rig is at the heart of all this. When you realise that it’s just positional data, you can be creative about how you map it in Blender… maybe a ‘Damped Track’ constraint on another bone gives you the angle you need for the eye direction? I started to see that design directed some of the rigging methods. For example, the way I’ve been designing eyes really only requires an angle to calculate; once you have that, setting up a controller is very straightforward.
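As a sketch of that eye idea (again, every name here is a placeholder):

```python
import bpy

# Sketch: aim an eye bone at an OSC-driven empty with a Damped Track
# constraint, so only an angle ever needs to be computed. Placeholders:
# "FaceRig", "eye_L", "osc_eyes".
rig = bpy.data.objects["FaceRig"]

eye = rig.pose.bones["eye_L"]
con = eye.constraints.new('DAMPED_TRACK')
con.target = bpy.data.objects["osc_eyes"]  # empty moved by the OSC data
con.track_axis = 'TRACK_Y'                 # bone axis that points at it
```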

I’m barely scratching the surface with the interfacing of controls. My advice is: get into rigging. It’s super fun and creative once you work out how it works. I think what the Blender developers are doing on the Grease Pencil side of things is really inspiring right now, and I really think it’s the way forward for making 2D-looking work in a fully functional 3D environment.

This was the first thing we made using the TouchOSC approach:

XMAS IS CANCELLED | adult swim smalls

Ainslie and I shared the body and face of each character and recorded the performances live using the auto-key function.
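For reference, auto-key is a single scene setting, so a live take like that can even be armed from a script; a tiny sketch:

```python
import bpy

# Arm a live take: auto-keying records keyframes as the OSC-driven
# controllers move, and starting playback begins the recording.
bpy.context.scene.tool_settings.use_keyframe_insert_auto = True
bpy.ops.screen.animation_play()
```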

The thing I really enjoy about Blender is how creative you can be with it. You can bring along your own thing, in my case 2D graphic-looking character design, get it into a new space, and start controlling it with methods you hadn’t considered before… I’m so excited to see what will happen.

What will you do with it? 



More about Will Anderson:

BAFTA Film Awards 2013 – Winner, Short Animation
BAFTA Scotland Awards – Award for Animation
BAFTA 2018 – Nominee, Short Animation
British Animation Awards 2018 – Best Short Film
