Binance executive claims scammers made a deepfake of him


There’s no hard evidence that deepfakes were used, though

Illustration by Alex Castro / The Verge

Patrick Hillmann, chief communications officer at the world’s largest crypto exchange, Binance, claims scammers made a deepfake of him to trick contacts into taking meetings.

Writing in a blog post titled “Scammers Created an AI Hologram of Me to Scam Unsuspecting Projects,” Hillmann claims that a “sophisticated hacking team” used video footage of interviews and TV appearances to create the fake. Says Hillmann: “Other than the 15 pounds that I gained during COVID being noticeably absent, this deep fake was refined enough to fool several highly intelligent crypto community members.”

The only direct evidence Hillmann offers for the claim is a screenshot of a conversation with an anonymous individual who claims to have had a Zoom call with Hillmann. Hillmann denies it, and his interlocutor responds: “they impersonated your hologram.”

When reached for comment by The Verge, Hillmann said: “I have only seen a still capture of the supposed deep fake which was shared by one of the teams that took part in a call. We are not sharing that now because we have been advised by the team investigating the matter that specific details of the deep fake being made public might result in further copycats.”

Hillmann added that at least four groups reached out to him to say they’d had a call with someone using his likeness, and that Binance’s cyber investigations team was looking into the case. “Our hope is to prevent other projects from falling prey to these types of scams,” he said over email. “There is no way for us to know the extent of this scam because it was perpetrated completely outside our ecosystem on platforms like Telegram and LinkedIn.”

Fears of deepfake scams have, so far, outstripped real-world damage

Although there has been much discussion of deepfakes’ potential to impersonate people in video calls, there have been no definitively confirmed cases to date. Audio deepfakes have been used to impersonate people over the phone, and video deepfakes have been shared on social media to boost scams (a recent example used a deepfake of Elon Musk, a common target for impersonation in crypto scams). But it’s not clear whether the technology, in its most accessible form, is yet sophisticated enough to sustain an impersonation during a live call. Indeed, experts say the simplest way to tell if you’re talking to a deepfake is to ask the person to turn their head, as the machine learning models used to create deepfakes generally don’t capture a face in profile.

Meanwhile, fear of the threat of deepfakes is much more widespread. In 2021, for example, European politicians claimed they’d been tricked by a deepfake video call of a Russian dissident. However, reporting by The Verge revealed that the incident was the work of Russian hoaxers who used only makeup and deceptive lighting to impersonate their target.

On the other hand, the world of cryptocurrency is certainly rife with scams based on impersonation. These are usually lower-tech, relying on stolen photos and videos to populate fake social media profiles, but given the highly technical communities that follow crypto, it’s not implausible that someone would try their hand at a more sophisticated plot. The potentially lucrative proceeds of crypto scams also make individuals like Hillmann extremely attractive targets for impersonation: a deepfake of a crypto exec could be used to boost confidence in a scam project or to seed information that would move the market in a desired direction.

Update, August 24th, 1:50PM ET: The story has been updated with comment from Patrick Hillmann.