Building a custom Animoji with ARKit and Blender

This post takes a look at ARKit Face Tracking on the iPhone X, XS, and XR and the 2018 iPad Pro. It is based on my WWDC 2019 Scholarship submission. I am not an artist; all models I created for this post are for illustration purposes only. Instead, I want to focus on the code required to create your own Animoji. The post is divided into four steps:

  1. Basics of ARKit Face Tracking
  2. Creating a character model in Blender
  3. Getting the character into SceneKit
  4. Animating the character

ARKit Face Tracking

ARKit Face Tracking is easy to set up. Simply start an ARFaceTrackingConfiguration on a supported device and ARKit takes care of everything for you.
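A minimal setup could look like this. It is a sketch, assuming a storyboard-connected ARSCNView outlet called sceneView and a view controller name of my choosing:

```swift
import ARKit
import UIKit

class FaceViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking only works on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Pausing the session in viewWillDisappear(_:) frees the camera when the screen goes away.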

When the TrueDepth camera detects a face, ARKit creates an ARFaceAnchor. Make your view controller conform to ARSCNViewDelegate and implement renderer(_:didAdd:for:), where you load and add your character.

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARFaceAnchor else { return }
        // Attach your character model to the node that follows the face.
        node.addChildNode(characterNode)
    }

To receive updated facial expressions, also implement the renderer(_:didUpdate:for:) method.

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        DispatchQueue.main.async {
            guard let face = anchor as? ARFaceAnchor else { return }
            self.process(face.blendShapes)
        }
    }

To extract the facial expressions from a face, use its .blendShapes property. ARKit automatically assigns each tracked facial expression a weight from 0 to 1. Examples of BlendShapeLocations are .eyeBlinkLeft and .jawOpen. The meaning of the weight for each blend shape can be found in the documentation.
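For example, reading a single weight from an ARFaceAnchor (here called face) looks like this:

```swift
// Each weight is an NSNumber between 0.0 (neutral) and 1.0 (expression fully applied).
if let jawOpen = face.blendShapes[.jawOpen] {
    let weight = CGFloat(truncating: jawOpen)
    print("Jaw is \(Int(weight * 100))% open")
}
```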

Creating a character model in Blender

Now that we know the basics of Face Tracking, let’s take a look at how to create an animatable character in Blender.

Neutral model in Blender

Let’s assume you have a simple face model in Blender with a neutral facial expression. Blender supports shape keys: modifications applied on top of the base geometry. In Object Mode, press the plus button in the Shape Keys panel; this creates the neutral Basis key. Now modify your character, for example by closing the left eye. Only move vertices: all shapes need to be topologically identical, that is, have the same number of vertices, edges and faces. Then press the plus button again and name the key, e.g. CloseLeftEye. As you can see, you can assign the shape a weight between 0 and 1 (sounds familiar?). Now create a shape key for every BlendShapeLocation you want to support in ARKit. You can then assign an arbitrary weight to each shape individually, and Blender applies all shape modifications at the same time.

Shape Keys in Blender

Getting the character into SceneKit

Export the model as a .dae file (make sure Include Shape Keys is checked in the export options). For SceneKit to read this file, you need to fix a few properties in it. Thankfully, there is a great tool by JonAllee on GitHub that does this for us.

Then add the file to your .scnassets folder, where you can convert it to an .scn file. In the Xcode editor you can also verify that the shape keys work as expected.
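Loading the converted scene in code could then look like this. The file name Character.scn and the node name character are placeholders for whatever you named them in Blender:

```swift
import SceneKit

// Assumes the converted model lives at art.scnassets/Character.scn
// and that the mesh is called "character" in the scene graph.
let scene = SCNScene(named: "art.scnassets/Character.scn")!
let characterNode = scene.rootNode.childNode(withName: "character", recursively: true)!
```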

Character in SceneKit

Animating the character

In SceneKit, each shape key is referenced by the index at which it was defined in Blender. To make working with them easier, we can create a simple enum that makes the mapping explicit.

    enum Modifier: Int {
        case closeLeftEye = 0
        case closeRightEye
        case openMouth
        case smileLeft
        case smileRight
        case liftLeftBrow
        case liftRightBrow
        case tongueOut
    }

We then use the SCNMorpher class to apply the different keys with their weights.

    func changeModifiers(_ modifiers: [Modifier: CGFloat]) {
        for (modifier, value) in modifiers {
            characterNode.morpher?.setWeight(value, forTargetAt: modifier.rawValue)
        }
    }

All we have to do now is update the weights whenever we receive a new update for our face anchor.

    func process(_ blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
        let modifiers = blendShapesToModifiers(blendShapes)
        changeModifiers(modifiers)
    }

    func blendShapesToModifiers(_ blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) -> [CharacterController.Modifier: CGFloat] {
        var modifiers: [CharacterController.Modifier: CGFloat] = [:]
        // Left and right are swapped on purpose so the character mirrors the user.
        if let leftEye = blendShapes[.eyeBlinkRight] {
            modifiers[.closeLeftEye] = CGFloat(truncating: leftEye)
        }
        if let rightEye = blendShapes[.eyeBlinkLeft] {
            modifiers[.closeRightEye] = CGFloat(truncating: rightEye)
        }
        if let mouthOpen = blendShapes[.jawOpen] {
            modifiers[.openMouth] = CGFloat(truncating: mouthOpen)
        }
        if let smileLeft = blendShapes[.mouthSmileRight] {
            modifiers[.smileLeft] = CGFloat(truncating: smileLeft)
        }
        if let smileRight = blendShapes[.mouthSmileLeft] {
            modifiers[.smileRight] = CGFloat(truncating: smileRight)
        }
        // The brow shapes track lowered brows, so invert them to lift the brows.
        if let browLeft = blendShapes[.browDownRight] {
            modifiers[.liftLeftBrow] = 1 - CGFloat(truncating: browLeft)
        }
        if let browRight = blendShapes[.browDownLeft] {
            modifiers[.liftRightBrow] = 1 - CGFloat(truncating: browRight)
        }
        if let tongue = blendShapes[.tongueOut] {
            modifiers[.tongueOut] = CGFloat(truncating: tongue)
        }
        return modifiers
    }

That’s it. Using SCNMorpher together with Blender’s shape keys and ARKit’s blend shapes is super easy to set up.

If you (like me) are completely untalented at creating 3D models, you can simply download one from the internet. If you are a student, you can use the Autodesk Character Generator for free; it lets you create character models in an RPG-like fashion. It even supports shape keys!