A newly released iOS app from Epic Games lets developers record facial expressions that can be imported directly into the Unreal Engine, using little more than an iPhone’s front-facing camera. The Live Link Face app, which is available to download now from the App Store, can stream this facial animation data in real time directly onto characters in Unreal Engine, which Epic hopes will make facial capture “easier and more accessible” for creators.
Epic’s app works by building on a couple of Apple’s existing technologies, including its ARKit augmented reality platform, and the TrueDepth camera that Apple introduced with the iPhone X in 2017. It’s the same technology that powers Apple’s Animoji and Memoji, which map your facial expressions onto cartoon avatars. Only now it can be used to animate avatars in the engine powering many of the world’s most popular games.
Epic isn’t the first company to have thought of using Apple’s technology as an animation tool. It didn’t take long for developers to start making facial capture apps after the iPhone X’s release, and we also saw it used to generate facial expressions for a Walking Dead augmented reality game in 2018. But having the functionality built directly into the Unreal Engine, which is used by millions of developers around the world, could give it a much larger reach, and make it much easier for artists to use it in their work.
Epic says that its Live Link Face app can work in situations ranging from an artist’s home office to a soundstage with multiple actors in motion capture suits wearing head-mounted rigs, and that it can scale between those setups. The app can also be controlled remotely, so Epic says you could set it up to simultaneously start recording on multiple iPhones with a single command. Once imported, the facial animation data can be adjusted in-engine.
If you’re a developer who’s interested in trying out the app for yourself, Epic’s documentation for the feature can be found here.