Instagram asks suspected bots to verify themselves with video selfies

It’s unclear how widespread the verifications are

Illustration by Alex Castro / The Verge

Instagram is asking some users to provide a video selfie showing multiple angles of their face to verify that they’re a real person, according to screenshots posted to Twitter by social media consultant Matt Navarra. The social network has long struggled with bot accounts, which can leave spam messages, harass people, or be used to artificially inflate like or follower counts. A follow-up tweet from Meta-owned Instagram says it’s asking suspicious accounts to verify that they’re human, not bots.

According to XDA Developers, the company started testing the feature last year but ran into technical issues; Instagram itself says it “introduced video selfies more than a year ago.” Multiple users have recently reported being asked to take a video selfie to verify their existing accounts.

Bettina Makalintal, another writer on Twitter, posted a screenshot of the help screen for the step where you actually take the video selfie. It reiterates that Instagram looks at “all angles of your face” to prove that you’re a real person, and it’s further evidence that the verification screen is showing up for multiple people.

I made several attempts at setting up a sketchy-looking Instagram account and was never presented with the video challenge. Instagram posted on Twitter that accounts exhibiting suspicious behavior (such as quickly following a ton of accounts) could be asked to take a video selfie. The company also reiterated that the feature doesn’t use facial recognition and said that Instagram teams review the videos. Instagram says that “one of the ways” video selfies are used is to help curtail bots, leaving the door open for other uses.

The move may surprise some, given Meta’s recent announcement that it would be shutting down Facebook’s Face Recognition feature. As the company has since reiterated, though, it was only shutting down that specific Facebook feature, not ending Meta’s use of facial recognition as a whole. Nevertheless, the message from Instagram is that the video selfie feature won’t use face recognition at all and that the video will be deleted after 30 days.

Meta’s promise not to store or post the data may not reassure users who are already distrustful of Meta / Facebook. People may remember when a bug let attackers use nothing more than a DM to access Instagram users’ supposedly private birthday info (which you’ll soon be required to provide to use the app). Of course, Instagram never promised to delete that birthday info the way it says it will with the video selfie, but it’d be hard to blame people (especially minors or those who want to stay anonymous) for feeling uncomfortable providing that data if they’re asked.

Updated November 17th, 1:20AM ET: Added information tweeted by the Instagram Comms account and updated the headline.

Updated November 18th, 12:11PM ET: Clarified Instagram’s timeline for video selfies being released, and how it says they’re used.