Epic’s latest tool can animate hyperrealistic MetaHumans with an iPhone

MetaHuman Animator lets Unreal Engine users quickly capture a performance and apply the resulting animation to a MetaHuman character. Epic claims the process can be completed in ‘minutes.’

Today, Epic is releasing a new tool designed to capture an actor’s facial performance using a device as simple as an iPhone and apply it to a hyperrealistic “MetaHuman” in the Unreal Engine in “minutes.” The feature, dubbed MetaHuman Animator, was detailed at the Game Developers Conference in March but is now available for developers to try out for themselves. Epic has also released a new video today produced by one of its internal teams to show what the tool is capable of.

While Epic’s short film shows off some impressively subtle facial animation, the big benefit the company is emphasizing is the speed with which MetaHuman Animator produces results. “The animation is produced locally using GPU hardware, with the final animation available in minutes,” the company’s press release reads. That has the potential not only to save studios money by making performance capture more efficient but also, Epic argues, to let them experiment and be more creative.

[Image: The performance data being captured from a camera. Image: Epic Games]

“Need an actor to give you more, dig into a different emotion, or simply explore a new direction?” Epic’s press release asks. “Have them do another take. You’ll be able to review the results in about the time it takes to make a cup of coffee.” Facial animation can be applied to a MetaHuman character “in just a few clicks,” Epic says, and the system is even smart enough to animate a character’s tongue based on the performance’s audio.

Performance capture using iPhones has been possible in the Unreal Engine since at least 2020, with the launch of Epic’s Live Link Face iOS app, but it’s now combined with the high level of detail promised by Epic’s MetaHuman technology. As well as working on the iPhone 12 and later (which can capture both video and depth data), Epic says MetaHuman Animator can also be used with “existing vertical stereo head-mounted camera [systems] to achieve even greater fidelity.”

Epic says the Blue Dot short film released today should give some idea of what its animation tool is capable of. It was produced by Epic Games’ 3Lateral team and stars actor Radivoje Bukvić delivering a monologue based on a poem by Mika Antić. Although Epic says it’s possible to tweak animation post-capture, it claims only “minimal interventions” were made on top of MetaHuman Animator’s performance capture to achieve these results.

If you want to learn more, Epic has released an instructional video on how to use the tool. Documentation is also available via the MetaHuman hub on the Epic Developer Community.