Since it was first announced, I have been interested in experimenting with the iPhone X’s fancy new TrueDepth front-facing camera. As soon as I got my hands on one, I downloaded the Unity ARKit Plugin and started digging into the new face tracking APIs. The creepy grey mask in the example project immediately reminded me of Andross, the final boss from Starfox (SNES). I found this video of the final battle from Starfox and thought it would make an awesome face tracking experience. This all coalesced just days before the VR Austin Jam 2017 was set to begin, giving me the perfect idea for my Jam entry.

I knew going into the weekend that the secret to a successful hackathon is limiting scope, so I decided to focus on getting the face-tracked Andross rig working first while my dev partner, Kenny Bier, focused on game mechanics. Luckily, Jeff Arthur (Banjo’s talented 3D artist) supplied me with the low-poly Andross model, Starfox’s Arwing, and the door-like Andross projectiles before the Jam began, so I had assets to work with.

This Unity blog post got me started by explaining at a high level how to access the iPhone X user’s face position, rotation, and blend shape properties. Basically, you run an ARKit session with an ARKitFaceTrackingConfiguration, subscribe to the FaceUpdated event, and read the blend shape values inside that handler from the ARFaceAnchor anchorData.blendShapes dictionary.

// Fields and callbacks inside a MonoBehaviour (uses UnityEngine.XR.iOS from the Unity ARKit Plugin
// and System.Collections.Generic for the blend shape dictionary)
private UnityARSessionNativeInterface m_session;
private ARFaceAnchor mAnchorData;
private Dictionary<string, float> currentBlendShapes;
private int mouthOpenInt;
private float jawOpenAmt;
private float l_eyeOpenAmt;
private float r_eyeOpenAmt;
public GameObject andross; // low-poly Andross head with the blend shapes attached

void Awake()
{
    m_session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
}

void Start()
{
    Application.targetFrameRate = 60;
    ARKitFaceTrackingConfiguration config = new ARKitFaceTrackingConfiguration();

    config.alignment = UnityARAlignment.UnityARAlignmentGravity;
    config.enableLightEstimation = true;

    if (config.IsSupported)
    {
        m_session.RunWithConfig(config);
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }
}

void FaceUpdated(ARFaceAnchor anchorData)
{
    mAnchorData = anchorData;

    currentBlendShapes = anchorData.blendShapes;
    // look up the index of the "MouthOpen" blend shape on the imported mesh
    mouthOpenInt = andross.GetComponent<SkinnedMeshRenderer>().sharedMesh.GetBlendShapeIndex("MouthOpen");

    // Open Mouth
    currentBlendShapes.TryGetValue("jawOpen", out jawOpenAmt);
    andross.GetComponent<SkinnedMeshRenderer>().SetBlendShapeWeight(0, jawOpenAmt * 100);

    // Left Eye Blink
    currentBlendShapes.TryGetValue("eyeBlink_L", out l_eyeOpenAmt);
    andross.GetComponent<SkinnedMeshRenderer>().SetBlendShapeWeight(1, l_eyeOpenAmt * 100);

    // Right Eye Blink
    currentBlendShapes.TryGetValue("eyeBlink_R", out r_eyeOpenAmt);
    andross.GetComponent<SkinnedMeshRenderer>().SetBlendShapeWeight(2, r_eyeOpenAmt * 100);
}
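
The FaceAdded and FaceRemoved callbacks just need to exist for the subscriptions above; I haven’t reproduced mine here, but a minimal sketch that simply toggles the Andross object when the face anchor appears or disappears would look something like this:

void FaceAdded(ARFaceAnchor anchorData)
{
    // a face was found; show Andross and cache the anchor
    andross.SetActive(true);
    mAnchorData = anchorData;
}

void FaceRemoved(ARFaceAnchor anchorData)
{
    // tracking lost; hide Andross until a face comes back
    andross.SetActive(false);
}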

Once you have the iPhone X blend shape hooks in place, you route them to the corresponding blend shapes on your own imported model. As a test, I imported a fully rigged model from the Unity Asset Store and got the mouth flapping.

*IMPORTANT: ARKit’s blend shape values operate from 0-1 but your mesh’s blend shape weights operate from 0-100, so remember to multiply by 100 or you won’t see any animations*
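
To keep that scaling in one place, a small helper along these lines works; the ApplyBlendShape name and faceRenderer parameter are just placeholders for illustration, not code from the jam project:

// Maps a 0-1 ARKit blend shape coefficient onto a 0-100 mesh blend shape weight.
// faceRenderer is assumed to be the SkinnedMeshRenderer on the imported model.
void ApplyBlendShape(SkinnedMeshRenderer faceRenderer, string arkitKey, string meshShapeName)
{
    float coefficient;
    if (currentBlendShapes.TryGetValue(arkitKey, out coefficient))
    {
        int index = faceRenderer.sharedMesh.GetBlendShapeIndex(meshShapeName);
        if (index >= 0)
        {
            faceRenderer.SetBlendShapeWeight(index, coefficient * 100f); // 0-1 -> 0-100
        }
    }
}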

Next, I had to rig my own model to be driven by these values.

I knew very little about creating blend shapes going in, but I found this article that explains it fairly well. Normally, you would rig an entire face before creating the blend shapes so that the animation renders realistic muscle movement. However, due to the low-poly nature of my Andross face, I could skip the rigging step and just manipulate the individual vertices by hand. I created three blend shapes: left eye closed, right eye closed, and mouth open.

Once I exported the face mesh out of Maya with the blend shapes attached and imported it into Unity, I could manipulate the blend shape weights in the editor. 

After swapping out some variables, I replaced my example face rig with Andross and got my first retro game boss animoji working as intended.

I wanted all the user input to rely on facial expressions, such as opening your mouth to fire and closing your eyes to turn ‘invisible’, allowing the Starfox bullets to pass through Andross without hurting him. So all I had to do was trigger functions off the blend shape weight values (and gate firing with a coroutine so that a million projectiles weren’t streaming out of Andross’s mouth!).
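
That gameplay glue isn’t in the snippet above, but the pattern is roughly the following sketch; the jawOpenThreshold, fireCooldown, and FireProjectile names are placeholders rather than the actual jam values:

// (inside the same MonoBehaviour; the coroutine needs System.Collections for IEnumerator)
private bool canFire = true;
public float fireCooldown = 0.5f;     // placeholder fire rate
public float jawOpenThreshold = 0.5f; // placeholder "mouth is open" cutoff

void Update()
{
    // fire while the mouth is open, throttled by the coroutine below
    if (jawOpenAmt > jawOpenThreshold && canFire)
    {
        StartCoroutine(FireWithCooldown());
    }
}

IEnumerator FireWithCooldown()
{
    canFire = false;
    FireProjectile(); // placeholder: spawn a door projectile from Andross's mouth
    yield return new WaitForSeconds(fireCooldown);
    canFire = true;
}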

The bulk of the time spent after this was just creating the *game* part of it: randomizing enemy flight paths, firing projectiles, health systems, placing UI elements (all retro 2D assets created by the lovely/talented Kaci Lambeth), game over/win conditions and generally attempting to make it fun. After squashing a litany of bugs and balancing gameplay, Starfox AR was ready… to make people look strange in public!

Download here: https://rigelprime.itch.io/starfox-ar