
Frank Ocean fans are getting scammed with fake AI-generated songs

An AI song scammer wreaked havoc on a Frank Ocean fan community by selling fake songs, according to a Motherboard report.



Image: ANGELA WEISS / AFP via Getty Images

Fans of Frank Ocean lost thousands of dollars after a scammer sold them fake AI-generated songs, according to a report from Motherboard. It was only a matter of time.

The scammer made off with around $13,000 CAD by selling a collection of bogus “leaked” tracks by the artist to a Discord server of his fans. To build trust, the scammer reportedly shared one real unreleased Frank Ocean song on Discord and then jumped at the opportunity when they saw that fans couldn’t tell the difference between the real track and a fake one.

“We now live in a world where nobody knows if a song is made by the artist or by a robot,” a member of the Discord told Motherboard.

This is one of the most urgent problems AI-generated songs pose for artists and labels: can they control how an artist’s likeness and voice are used in music made with AI models?

It’s not clear whether training AI models on a corpus of an artist’s work is copyright infringement, but it’s the argument that music labels, news publishers, and Getty Images are making against generative AI companies. Widespread access to the technology is so new that there isn’t a legal precedent yet, and it could take years to get there. But there are laws about how a person’s identity and likeness can be used to make money.

Tricking people into buying fake songs you claim are Frank Ocean’s, meanwhile, is almost certainly illegal, though exactly how depends on where you live — I recently wrote about the legal tangle of the right of publicity in the context of the AI-generated Drake songs that surfaced last month.

The case of the fake Frank Ocean songs hits squarely on something legal experts have pointed out: AI-generated songs that are convincingly good could hurt artists by cutting into their ability to profit from their own music. And bad songs could be harmful, too, if fans are fooled by the tracks and start to think the artist is on the decline. Meredith Rose, senior policy counsel at Public Knowledge, told me she expects the legal fight over AI music could prompt more uniform laws around an individual’s right to control how their image is used.

Is AI-generated music good for listeners? Definitely not if you’re getting taken for a ride by a scammer with the name “Mourningassasin”! But there could be a market for music that sounds like your favorite artist. In 2021, composer and musician Holly Herndon released a voice model trained to sound like her. Grimes recently said she’d split royalties 50/50 with anyone who makes an AI-generated song using her voice that goes viral. Who hasn’t wished for more music from bands who’ve broken up or artists — like Frank Ocean — who haven’t released music in a while?

Then again, part of what I enjoy about following artists is watching how their work progresses — what changes and what stays the same, what themes they choose to focus on, and how they respond to a changing world. If that includes experimenting with AI models in their music, that’s exciting and interesting to me. But I’m not pining for tracks that rely on the voice of someone recognizable while delivering a worse listening experience — the lyrics and style of the AI-generated songs out there are not exactly giving “smash hit.”