Can Mark Zuckerberg fix Facebook before it breaks democracy? That’s the headline on Evan Osnos’ 14,000-word profile of the Facebook CEO, after two years’ worth of scandals, in the New Yorker. That question is maybe unanswerable — what would it mean to fix Facebook, or democracy? — but the article does a better-than-average job at exploring its contours.
Osnos had the chance to interview Zuckerberg several times, even breaking (banana) bread with the CEO at his home. The New Yorker excels at buoying its narrative profiles with bright, colorful anecdotes, and Osnos’ tales of Zuckerberg’s board-game dominance are delightful:
A few years ago, he played Scrabble on a corporate jet with a friend’s daughter, who was in high school at the time. She won. Before they played a second game, he wrote a simple computer program that would look up his letters in the dictionary so that he could choose from all possible words. Zuckerberg’s program had a narrow lead when the flight landed. The girl told me, “During the game in which I was playing the program, everyone around us was taking sides: Team Human and Team Machine.”
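The anecdote doesn’t say how Zuckerberg’s program actually worked, but the core of any such solver is a multiset check: a word is playable if the rack contains at least as many of each letter as the word requires. A minimal sketch in Python (the word list and function name here are illustrative, not anything from the profile):

```python
from collections import Counter

def playable_words(rack, dictionary):
    """Return every dictionary word that can be built from the rack's letters."""
    rack_counts = Counter(rack.lower())
    return [
        word for word in dictionary
        # Subtracting Counters drops non-positive counts, so an empty
        # difference means the rack covers every letter the word needs.
        if not Counter(word.lower()) - rack_counts
    ]

# Tiny illustrative word list; a real solver would load a full Scrabble dictionary.
WORDS = ["cat", "act", "tact", "cart", "taco"]

print(playable_words("tacor", WORDS))  # "tact" is excluded: it needs two t's
```

From there, picking the highest-scoring playable word is just a matter of sorting by Scrabble letter values.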
The bulk of Osnos’ profile consists of stories that are well known to readers of this newsletter, though perhaps not the New Yorker readership at large. (The 2016 election, check; former employees lamenting the platform’s effects on society, check; Cambridge Analytica, check; Alex Jones, check.) Among the reporting Osnos draws on is my own — he includes quotes from Zuckerberg’s former pollster, Tavis McGinn, from a piece I did in February.
The profile is at its best in its conclusion, which sums up the CEO’s perspective — and the task ahead of him — as well as anyone I’ve seen do it:
The caricature of Zuckerberg is that of an automaton with little regard for the human dimensions of his work. The truth is something else: he decided long ago that no historical change is painless. Like Augustus, he is at peace with his trade-offs. Between speech and truth, he chose speech. Between speed and perfection, he chose speed. Between scale and safety, he chose scale. His life thus far has convinced him that he can solve “problem after problem after problem,” no matter the howling from the public it may cause.
At a certain point, the habits of mind that served Zuckerberg well on his ascent will start to work against him. To avoid further crises, he will have to embrace the fact that he’s now a protector of the peace, not a disrupter of it. Facebook’s colossal power of persuasion has delivered fortune but also peril. Like it or not, Zuckerberg is a gatekeeper. The era when Facebook could learn by doing, and fix the mistakes later, is over. The costs are too high, and idealism is not a defense against negligence.
Facebook, and Zuckerberg, derive much of their confidence from earlier times that the company seemed to be on the brink of oblivion: the initial rollout of the News Feed; the introduction of the privacy-obliterating Beacon product; the botched IPO and improbably successful transition to becoming a mobile software company.
But as Osnos notes, the challenges faced by the company today are different both in scale and in kind. In Myanmar, the Philippines, Sri Lanka, and Germany, Facebook has been credibly linked to outbreaks of violence. Its ability to moderate the vast amount of content it ingests in a way that balances speech and safety has been mixed at best. As Osnos writes: “These are not technical puzzles to be cracked in the middle of the night but some of the subtlest aspects of human affairs, including the meaning of truth, the limits of free speech, and the origins of violence.”
Historically we have not asked tech companies to resolve these issues before they expand into new countries, even when they have no understanding of the political situation and their moderators do not speak the local language. If we have learned anything from Facebook’s travails, it is surely that they should have. But what’s done is done.
There was a time when seeing new interviews with Zuckerberg made me feel jealous. But increasingly I see the limits of asking tech CEOs questions about their work. A profile of a Zuckerberg, or a Jack Dorsey, or an Evan Spiegel, always seems to be circling around the question: is this person basically a good guy? I suspect that’s one reason they sit for profiles like this: they are basically good guys, and a magazine writer who visits them at their homes will see this and report it back to the world at large.
But lately I find myself less interested in reading tech CEOs perform their thoughtfulness. During the Alex Jones deplatforming drama, I wrote that Twitter’s dithering was frustrating because the company so often substitutes thinking for action. Dorsey gave several interviews during this time, and I read them all, and I learned almost nothing. It isn’t that the questions were bad, or that Dorsey sidestepped them. It’s that what he thinks is ultimately less consequential than what he does.
Facebook, to its credit, has taken many more actions to shore up its platform over the past two years than has Twitter. (Twitter, an admittedly smaller company with far fewer resources, has taken many of the same steps Facebook has, but only months later, as if in slow motion.) And yet influence campaigns still unfold, hate spreads virally, and the platform lurches from one crisis to another.
I understand the value, from Facebook’s perspective, of regularly putting forward Zuckerberg to affirm that he is working on the problem. But I can’t help but feel like we knew that already.
Maybe tech platforms can be “fixed,” or maybe they can’t. But either way, it’s not an oral exam. And we ought not to treat it like one.
Pranav Dixit and Ryan Mac turn in a richly reported story about how misinformation spread on WhatsApp led to the murders of five men who had come to a tiny Indian town to trade. Facebook has taken a handful of actions designed to reduce the virality of encrypted WhatsApp messages, but in the meantime Indians are going town to town performing plays in an effort to get people to stop lynching strangers:
Ever since the Rainpada lynching, these students have traveled through the district at least a couple of times each week to perform a 16-minute street play, whose Marathi title loosely translates to “Don’t Play With People’s Lives, Don’t Fall for Rumors.” Tonight’s performance is in Dhule at a government-run hostel for girls from neighboring tribal hamlets like Rainpada.
“It’s important to sensitize young people who have moved out of their villages to towns like Dhule about misinformation,” Shahaji Shinde, who runs a Dhule-based NGO called the Navanirmiti Sanstha, told BuzzFeed News. “They are the ones who use Android phones and WhatsApp. They are the channels through whom misinformation reaches their villages.”
Twitter finally did the right thing, after exhausting every other option:
After weeks of equivocation, Twitter permanently suspended the accounts of Infowars and its founder Alex Jones on Thursday, following similar moves by other large tech companies, including Apple, Facebook, YouTube, and Spotify. The decision came after a series of provocations from Jones that Twitter deemed in violation of its “abusive behavior” rules.
The incident that inspired Twitter to action appears to have been a series of tweets containing a nine-minute Periscope video of Jones and his camera operators confronting CNN reporter Oliver Darcy. In the video, Jones lambastes Darcy as “the equivalent of like the Hitler Youth” and accuses him of “smiling like a possum that crawled out of the rear end of a dead cow.”
A day after Twitter banned Jones, Apple — which had previously tossed him a lifeline — followed suit. (People had speculated Apple kept Jones in the App Store so as not to draw attention to its duopoly in app distribution; it appears that the real answer may have been that they simply hadn’t yet observed his bad behavior on his app.)
Apple confirmed the app’s removal to BuzzFeed News, but declined to comment, pointing to its App Store Review Guidelines. The company said Infowars would not be permitted to return to the App Store.
The first clause of those guidelines explicitly rejects “defamatory, discriminatory, or mean-spirited content, including references or commentary about religion, race, sexual orientation, gender, national/ethnic origin, or other targeted groups, particularly if the app is likely to humiliate, intimidate, or place a targeted individual or group in harm’s way.”
Max Fisher and Katrin Bennhold have another good case study on how the YouTube algorithm spread misinformation and hate speech. “How many steps are there on YouTube’s algorithm from news story to fever swamp? ‘Only two,’ Mr. Serrato said. ‘By the second, you’re quite knee-deep in the alt-right.’”
Mr. Serrato scraped YouTube databases for information on every Chemnitz-related video published this year. He found that the platform’s recommendation system consistently directed people toward extremist videos on the riots — then on to far-right videos on other subjects.
Users searching for news on Chemnitz would be sent down a rabbit hole of misinformation and hate. And as interest in Chemnitz grew, it appears, YouTube funneled many Germans to extremist pages, whose view counts skyrocketed.
Can the government regulate search or news algorithms? Jared Schroeder investigates:
In the Herald case, the paper refused to comply with the law. Its editors argued the law violated the First Amendment because it allowed the government to compel a newspaper to publish certain information. The Supreme Court resoundingly agreed with the Herald. Justices explained that the government cannot force newspaper editors “to publish that which reason tells them should not be published.”
What’s important in 2018, however, isn’t simply that the law was unconstitutional. Justices used the decision to highlight that the government cannot compel expression. At the same time, they recognized that newspapers are businesses as well as journalistic endeavors. In one passage, the majority opinion explains that a “privately owned newspaper” has only two responsibilities: to publish information that is of interest to “a sufficient number of readers—and hence advertisers—to assure financial success; and, second, the journalistic integrity of its editors and publishers.”
Tony Romm and Brian Fung find a curious omission in Attorney General Jeff Sessions’ plans to meet with states about tech companies’ outsized influence:
Democratic attorneys general from key states said they have not yet been invited by the Justice Department to its upcoming review of tech companies, prompting criticism that the Trump administration’s inquiry is a politically charged attack on the tech industry.
U.S. Attorney General Jeff Sessions first announced on Wednesday that he was gathering state attorneys general to examine whether companies like Facebook, Google and Twitter are “intentionally stifling the free exchange of ideas” online. The goal of the meeting, the Justice Department said in a statement at the time, is to follow up on a hearing that had just taken place on Capitol Hill with Facebook and Twitter and consult with “a number” of states to figure out if Silicon Valley’s conduct is “hurting competition.” The meeting is tentatively scheduled for Sept. 25 in Washington.
Snap was forced to remove Al Jazeera’s Discover channel in some parts of the world, Ahmed Zidan reports:
Search for “Al-Jazeera” on Snapchat, and the first result that comes up is a ubiquitous publisher channel in the app’s famed vertical layout. That is, unless you are in Saudi Arabia, the United Arab Emirates (UAE), or Bahrain. Users in these countries are instead offered a list of stores and restaurants that bear a similar name to the broadcaster.
A clue to what happened to Al-Jazeera’s Discover channel can be found in Snapchat’s latest transparency report, released in May. The report reveals how in the second part of 2017 the company received–and complied with–its first governmental removal requests, all of which came from those countries. A spokesperson for Snap Inc., the parent company of the app, later confirmed to CPJ via email that the requests all related to Al-Jazeera Arabic Discover Publisher Story, and that the company removed the channel “to comply with local laws.”
Imran Khan, who for all intents and purposes was Evan Spiegel’s No. 2, is out at Snap, Sarah Frier and Nate Lanxon report. “There is never a perfect time to say goodbye,” Khan notes. And this is true. But there are also better times to say goodbye — such as, for example, the time right after you turn the company around. Or show signs of progress. Or really any time other than the literal nadir of the company’s existence, which is arguably right now???
I am not about to apologize for using one of these groups to purchase my large and vicious komodo dragon, Balthazar. But just so you know:
A wildlife monitoring group says research it has conducted since 2016 has found a sharp increase in the number of people belonging to Facebook groups in Thailand where endangered animals are bought and sold.
The monitoring network TRAFFIC said its researchers found 1,521 animals for sale online in 12 Facebook groups in Thailand in less than a month of monitoring in 2016. Follow-up research on the same 12 groups showed that at least nine were still active in July this year, with one becoming secret, and their overall membership had increased to 203,445 from 106,111.
Last week Pew released data showing that large numbers of Americans had deleted Facebook — but that barely seems to be evident in the company’s reported usage numbers. Leonid Bershidsky wonders what gives — and in so doing, makes this sharp point:
Investors and advertisers should take into account polls such as Pew’s along with Facebook’s user numbers. The combination could be a better indication of true activity levels than the company’s bot-distorted estimates. And even if significant numbers of users are tempted to say they use Facebook less than they actually do, that’s significant information for advertisers. If users feel guilty about logging on to Facebook and don’t have a positive experience, they might be less likely to buy things advertised on it.
No one understands how the News Feed works, despite Facebook’s efforts to explain it, according to more new data from Pew. (Also: adults who say they do understand why posts are included in their News Feed are lying! At the level of the individual post, it is unknowable to anyone who doesn’t work at the company, and maybe not to anyone who works there, either.)
When asked whether they understand why certain posts but not others are included in their news feed, around half of U.S. adults who use Facebook (53%) say they do not – with 20% saying they do not understand the feed at all well. Older users are especially likely to say they do not understand the workings of the news feed: Just 38% of Facebook users ages 50 and older say they have a good understanding of why certain posts are included in it, compared with 59% of users ages 18 to 29.
Ousted Facebook co-founder Eduardo Saverin has sympathy for his old co-workers:
“It’s difficult to see a company that’s so close to my heart going through a point of public scrutiny,” said Saverin, who now lives in Singapore and invests in startups through his B Capital. “I have tremendous faith in Mark and the team that he’s built around him and, frankly, the intentions that Facebook was born out of from day one.”
At least three of Facebook’s 34 fact-checking partners have been trolled, doxxed or threatened for working with the social media company, Daniel Funke reports:
“The attacks, they were just relentless, they were persistent,” said Yvonne Chua, co-founder of Philippine fact-checking project Vera Files, which joined Facebook’s project in April. “This is not the first time these institutions or individuals have been attacked by the trolls. But this was more sustained — every day for more than two weeks, three weeks.”
Is Pinterest slow and smart or slow and lazy? Damned if I know, but the company now has 250 million users a month, reports Erin Griffith in this profile of Pinterest and its sincerely charming CEO, Ben Silbermann:
The company is on track to top $700 million in revenue this year, a 50 percent increase over last year, according to a person familiar with the company. There is wide speculation that it will go public next year.
If Pinterest continues its trajectory, it could change the narrative of what it takes to build a successful company in Silicon Valley, a meaningful feat at a time when the start-up world is seeking new templates for leaders. If it doesn’t, it’ll serve as another example of wasted potential, or worse, a cautionary tale.
One of the absolute weirdest things about the Cambridge Analytica scandal — and there were a lot — is that Joseph Chancellor, who helped harvest millions of user profiles that were then sold to the data-analytics company, has worked at Facebook since 2015. Well, he doesn’t any more. If you were present in the office when a group of people had cake for Joseph and wished him well on his future endeavors, and could describe the mood of said gathering, please get in touch!
Fathin Ungku says Facebook’s first Asian data center will open in 2022.
How badly are things going for Alex Jones? Well, for one thing he’s running a sale!
The best-selling Survival Shield X-2 nascent iodine drops were discounted 40 percent, to $23.95, while Alpha Power, a product marketed as boosting testosterone and vitality to “push back in the fight against the globalist agenda,” was half off, at $34.95.
Mat Smith asked to see his Tinder history and, under GDPR, was given an HTML file. I would never do this in a million years!!
In early Tinder chats, I used some mildly explicit words – nothing too racy, just enough to make me hate myself. Context-free, I used the word “horny” 23 times.
There’s a new feature-phone version of WhatsApp for India. It’s a move designed to expand WhatsApp’s dominance in India, and yet more evidence that growth remains the company’s prime directive, no matter what it might be leading to.
Ever since it turned out that YouTube is a nightmare hell pit of ad-supported extremism, platforms have sought to reassure advertisers that their campaigns will not run alongside terrible things. Facebook has taken a new step in this regard:
According to an announcement being made Sept. 10, advertisers can now see where their ads might appear on in-stream videos, including those in Watch and Instant Articles, before, during or after a campaign, so they can block specific publishers or content categories from those placements. They can also review and update from a list that Facebook regularly adds to.
Those lists, however, only let advertisers block publishers they’re aware of, which means their ads could still appear against unsuitable content from publishers they don’t know.
Data scientist Eliza Khuner said Facebook’s family leave policies aren’t nearly as liberal as they’re made out to be:
I love my job, but I love my baby even more. When I told Facebook I wanted to work from home part-time, HR was firm: You can’t work from home, you can’t work part-time, and you can’t take extra unpaid leave. In mid-July, with the heartache of a break-up, I sent my resignation letter. I also wrote another note describing my agonizing choice, saying that Facebook could and should do better for families. I posted it internally, in a group for Facebook employees worldwide. I wondered if anyone would read it.
My phone started buzzing. More than 5,500 Facebook employees reacted in support. Hundreds commented, telling me I wasn’t alone. Mothers shared how they struggled to perform at work and be there for their kids, and how sad they were to miss the special moments. Fathers said they felt the strain of not being with their children. People with no kids chimed in with their support.
Siva Vaidhyanathan’s entire op-ed, which laments the number of fake accounts created on Facebook and suggests it’s a much higher percentage of the user base than previously reported, is based on a misunderstanding. When Facebook says that it deleted 1.3 billion accounts, it did so in almost all cases before those accounts were ever counted as active users. Bad take!
Researcher Christopher A. Bail says Twitter’s stated intention to disrupt echo chambers is likely to polarize the country further:
We surveyed more than 1,200 Twitter-using Republicans and Democrats about their political views. Then we paid half of them to follow for one month a bot we created that retweeted messages from elected officials and other opinion leaders from the other political party.
Instead of reducing political polarization, being exposed to opposing ideas increased it. Republicans who followed a Democratic bot for one month expressed social policy views that were substantially more conservative at the conclusion of the study. Democrats who followed a Republican bot exhibited very slight increases in liberal attitudes about social issues, but those effects were not statistically significant.
And finally ...
In February I told you about Ben Grosser’s Twitter Demetricator, which strips all engagement metrics from Twitter so that all tweets appear to be equal. (As if, Ben!) Today Ben wrote to introduce me to “Safebook,” a new piece of browser-extension agitprop that removes literally everything from Facebook in order to make it safe. Says Ben:
Given the harms that Facebook has wrought on mental health, privacy, and democracy, what would it take to make Facebook “safe?” Is it possible to defuse Facebook’s amplification of anxiety, division, and disinformation while still allowing users to post a status, leave a comment, or confirm a friend? With Safebook, the answer is yes! Safebook is Facebook without the content, a browser extension that hides all images, text, video, and audio on the site. Left behind are the empty containers that frame our everyday experience of social media, the boxes, columns, pop-ups and drop-downs that enable “likes,” comments, and shares. Yet despite this removal, Facebook remains usable: you can still post a status, scroll the news feed, “watch” a video, Wow a photo, or unfriend a colleague. With the content hidden, can you still find your way around Facebook? If so, what does this reveal about just how ingrained the site’s interface has become? And finally, is complete removal of all content the only way a social media network can be “safe?”
If you can answer any of Ben’s questions, email me!
Talk to me
Send me tips, comments, questions, and Zuckerberg board game anecdotes: firstname.lastname@example.org.