
Apple removed Telegram from the App Store over distribution of child pornography


Telegram was forced out of the App Store while it handled the situation


Last week, Apple mysteriously removed secure messaging app Telegram and its more efficient counterpart, Telegram X, from the App Store for “inappropriate content,” a move many users found curious because it came without any concrete explanation. Now, thanks to 9to5Mac confirming the authenticity of an email exchange between a Telegram user and Apple marketing chief Phil Schiller, we know the “inappropriate content” was in fact child pornography being distributed via Telegram’s mobile apps.

“The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps,” Schiller wrote to the user. “After verifying the existence of the illegal content, the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”

Telegram was unavailable in the App Store for roughly 24 hours

Distribution of child pornography is among the most serious offenses on the web, and both the users who share such material and the platforms that host it can be held responsible for preventing those images and videos from spreading. Nearly every major social network and tech platform uses digital protections to block child pornography from being posted and to detect it immediately upon distribution. That’s typically done by matching file hashes against databases of known material compiled by federal law enforcement, which lets platforms detect and track those files as they move across networks.
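As a rough illustration of how that hash matching works, here is a minimal sketch in Python. This is a simplification: production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, and real hash lists come from NCMEC and law enforcement. The hash value and function names below are hypothetical placeholders.

    import hashlib
    from pathlib import Path

    # Hypothetical hash list: real platforms match against databases of
    # known-image hashes; the entry here is a placeholder (the SHA-256
    # of an empty file), not a real database value.
    KNOWN_ILLEGAL_HASHES = {
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def sha256_of_file(path: Path) -> str:
        """Compute a file's SHA-256 digest, streaming in chunks to bound memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def is_known_illegal(path: Path) -> bool:
        """Return True if the file's hash matches a known-bad entry."""
        return sha256_of_file(path) in KNOWN_ILLEGAL_HASHES

Note that an exact cryptographic hash like this fails if a file is altered by even a single byte, which is why deployed systems rely on perceptual hashing instead.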

Telegram, however, appears not to have been as prepared in this case, prompting Apple to take the entire app down while the messaging company figured out how to remedy the situation. “We were alerted by Apple that inappropriate content was made available to our users and both apps were taken off the App Store,” Telegram CEO Pavel Durov said in a statement last week. “Once we have protections in place we expect the apps to be back on the App Store.” It took roughly one day for Telegram and Telegram X to return to the App Store.

The company has faced similar issues in the past over terrorist content, and it has been harshly criticized by governments for failing to grapple with how criminals use its end-to-end encrypted chat features. After Indonesia threatened to ban the app in July of last year over ISIS propaganda, Telegram created a special team to moderate content in the country.

Here’s Schiller’s email in full, via 9to5Mac:

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).

The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.

We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.

I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.