A new bipartisan bill, introduced on Wednesday, could mark Congress’ first step toward addressing algorithmic amplification of harmful content. The Social Media NUDGE Act, authored by Sens. Amy Klobuchar (D-MN) and Cynthia Lummis (R-WY), would direct the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine to study “content neutral” ways to add friction to content-sharing online.
The bill instructs researchers to identify ways to slow the spread of harmful content and misinformation, whether by asking users to read an article before sharing it (as Twitter has done) or through other measures. The Federal Trade Commission would then codify the recommendations and mandate that social media platforms like Facebook and Twitter put them into practice.
“For too long, tech companies have said ‘Trust us, we’ve got this,’” Klobuchar said in a statement on Thursday. “But we know that social media platforms have repeatedly put profits over people, with algorithms pushing dangerous content that hooks users and spreads misinformation.”
For years, Democrats have pursued ways to address misinformation online, while Republicans have criticized these efforts as threats to free speech. But sparked by testimony from Facebook whistleblower Frances Haugen in 2021, members of both parties have started working together to find ways to regulate algorithms, addressing both children’s safety and misinformation. Lummis’ support of the bill signals a significant step forward in this process.
“The NUDGE Act is a good step toward fully addressing Big Tech overreach,” Lummis said in a statement on Thursday. “By empowering the [NSF] and [NASEM] to study the addictiveness of social media platforms, we’ll begin to fully understand the impact the designs of these platforms and their algorithms have on our society. From there, we can build guardrails to protect children in Wyoming from the negative effects of social media.”
Last March, Reps. Anna Eshoo (D-CA) and Tom Malinowski (D-NJ) first introduced their Protecting Americans from Dangerous Algorithms Act, which also focused on algorithmic amplification. Unlike Klobuchar’s bill, the House measure would amend Section 230 of the Communications Decency Act to remove any legal immunity platforms have when they’re found to have amplified content that violates civil rights.
Removing Section 230 liability protection has been the largest hurdle facing lawmakers seeking to address harmful algorithmic amplification. Tech and public interest groups like Public Knowledge have already come out in support of the Klobuchar measure, noting that its absence of 230 changes makes it one of the better models for algorithm regulation.
“Public Knowledge supports this legislation because it encourages informed decision-making to address a known problem: the promotion of misinformation,” Greg Guice, director of government affairs at Public Knowledge, said in a statement on Thursday. “Most importantly, the bill does all of this without tying compliance to Section 230 immunity.”
There’s little time left for Congress to pass tech legislation before the midterm elections heat up later this year. In an interview with The Verge last month, Klobuchar was optimistic about lawmakers’ ability to pass broad, bipartisan bills before the end of the year.
Speaking on the Social Media NUDGE Act, Klobuchar said, “This bill will help address these practices, including by implementing changes that increase transparency and improve user experience.” She continued, “It’s past time to pass meaningful reforms that address social media’s harms to our communities head-on.”