Apple has a porn problem, and it's about to get worse

Adult content in Vine and Twitter apps raises questions only Cupertino can answer

On Sunday, a number of news outlets ran stories covering the rise of easily accessible pornography on the new video-sharing app Vine, causing a firestorm of debate online. The New York Times' Nick Bilton tweeted that pornographic material was discoverable thanks to simple hashtags such as #porn.

Vine doesn't have a porn problem — Apple has a policy problem

But the truth is that Vine doesn't have a problem with porn, at least not one that other social media apps don't share. Apple has a problem: its App Store's puritanical, unevenly enforced policies for adult content. Vine is just today's example.

The Twitter-owned app and service launched last week to much fanfare, mostly due to its ingenious editing functions, which allow users to stop and start a video recording. Vine is also notable as one of Twitter's first major departures from its core social networking business. The iOS-only app was prominently featured by Apple as an "Editor's Pick" in its App Store the day it launched.

That pornography or nudity would find its way into a popular social app focused on image or video sharing is hardly news; the larger question is how Apple will handle this flare-up. The company recently pulled a popular photo-sharing application called 500px from its App Store, citing the discovery of "pornographic images and material." Apple offered this statement:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We've asked the developer to put safeguards in place to prevent pornographic images and material in their app.

A cursory search of #porn and related hashtags within the Twitter iOS app unearths a cornucopia of adult material, yet Apple has taken no action against that app. Nor is pornography on Twitter and similar apps a recent development; Twitter in particular has long been used for such sharing. Yet Apple has made much of its tight partnership with Twitter, building native Twitter functionality into iOS as part of a recent software update.

The situation draws even more attention to the vague and sometimes confusing rules of Apple's App Store guidelines, and showcases the sporadic and often unusual criteria the iPhone maker uses to decide the fates of applications. As the market share of Apple's iPads and iPhones has grown, the company has come under increasing fire over interpretations of its own rules regarding offensive or objectionable content.

Twitter responded to inquiries about Vine with the following statement:

Users can report videos as inappropriate within the product if they believe the content to be sensitive or inappropriate (e.g. nudity, violence, or medical procedures). Videos that have been reported as inappropriate have a warning message that a viewer must click through before viewing the video.

Uploaded videos that are reported and determined to violate our guidelines will be removed from the site, and the user that posted the video may be terminated.

According to Vine's terms of service, the violations below are the ones the company says it will take more decisive action on; notably, "inappropriate" content is not listed as a catalyst. Rather, it's content which:

  • Impersonates another person or entity in a manner that does or is intended to mislead, confuse, or deceive others;
  • Violates the rights of a third party, including copyright, trademark, privacy, and publicity rights;
  • Is a direct and specific threat of violence to others;
  • Is furtherance of illegal activities; or
  • Is harassing, abusive, or constitutes spam.

We've reached out to Apple for comment, and will update this post with more information as we get it.