The best use for AI eye contact tech is making movie stars look straight at the camera

It’s stupid. It’s funny. And I like it.

Anton Chigurh from No Country for Old Men — not someone you want to make eye contact with.
Image: via Twitter / Daniel Hashimoto

Over the past few years, a bunch of tools have been released that use AI to edit video calls in real time so that the caller is making eye contact with the camera. FaceTime can do it. Microsoft Teams can do it. And Nvidia Broadcast can do it, too. (Provided, in each case, you have the necessary hardware or software.)

This tech comes with a bunch of interesting questions, of course. Like: is constant unbroken eye contact good or a bit creepy? Are these tools useful for people who don’t naturally like eye contact? Or is this all just the thin edge of a wedge labeled, for the sake of argument, “the increasing use of AI to create a more polished digital version of ourselves is contributing to an increased sense of alienation and loss of identity”?

My answer to that last question is: “yeah, probably.” But forget that high-brow trash for now, because here’s the stupidest and best use case of this technology yet: editing movie scenes so actors make eye contact with the camera.

Master of the form is Daniel Hashimoto, a VFX specialist known for his Action Movie Kids channel on YouTube and weekly podcast on the industry. Here’s some of his best work, via his Twitter:

Chatting via Twitter DM, Hashimoto told us he created the clips simply by routing his web browser’s video through Nvidia’s Broadcast software, which offers the eye contact feature and does all the processing in real time. Even as a visual effects expert, he says he was astonished by the results.

“From a technical perspective, the technology is incredible. The tracking is real-time, and the lighting and color matching is very impressive,” said Hashimoto. “I even discovered that the effect animates in and out, and respects a person’s head direction, so it will correct the eyes, when they are looking just off screen, but respects if a person is clearly turning their head and looking at something else. There’s no doubt in my mind that every videoconferencing tool will have some version of this within the next little while.”

He notes that some people misinterpreted his edits (particularly the one of a scene from The Bear, second above) as an attempt to “improve” the original, but stressed that “this couldn’t be further from the truth.”

“As I mentioned online, this show, this performance, and this scene are transcendent — and it was pretty sacrilegious to have used it,” said Hashimoto. He jokingly added that if anyone was offended by the video, they could call into his live show tomorrow to talk about it.

With that in mind, here’s one last edit — with a very appropriate subject matter: