Facebook is shrinking fake news posts in the News Feed

Photo by Michele Doying / The Verge

Facebook is making fake news posts in the News Feed harder to see by shrinking the size of the links to content that has been verified by third-party fact-checkers as inaccurate, according to a report by TechCrunch. “We reduce the visual prominence of feed stories that are fact-checked false,” a Facebook spokesperson confirmed to the news site.

According to screenshots shared by TechCrunch, content deemed to be inaccurate shows up on mobile as a headline and image in one small row, while accurate news links feature a large picture and are considerably more noticeable.

Facebook detailed the new strategy to combat misleading information at the Fighting Abuse @Scale conference in San Francisco, where Michael McNally, director of News Feed integrity, and data scientist Lauren Bose spoke. The two described Facebook’s ecosystem approach to tackling fake news, including removing fake accounts and assets like clusters of fraudulently created Pages, banning ads on malicious pages, and limiting the distribution of false posts.

Facebook will also use AI combined with flagged user reports to moderate content, prioritizing articles for fact-checking with falsehood prediction scores generated through machine learning. “We use machine learning to help predict things that might be more likely to be false news, to help prioritize material we send to fact-checkers (given the large volume of potential material),” a Facebook spokesperson told TechCrunch. But moderation will always be a problem, as we’ve pointed out before, especially on a huge platform where posts are constant. In December, Facebook announced it would no longer use Disputed Flags to identify false news and would rely on Related Articles instead.
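As a rough illustration of the prioritization step described above, here is a minimal sketch: articles receive a predicted-falsehood score from some model, and the highest-scoring ones are sent to human fact-checkers first. All names here (`Article`, `falsehood_score`, `prioritize_for_fact_checking`) are hypothetical stand-ins, not Facebook's actual systems.

```python
# Hypothetical sketch: rank articles by a machine-learned falsehood score
# and queue the riskiest ones for the limited pool of human fact-checkers.
from dataclasses import dataclass
import heapq

@dataclass
class Article:
    url: str
    falsehood_score: float  # 0.0 (likely true) .. 1.0 (likely false)

def prioritize_for_fact_checking(articles, capacity):
    """Return the `capacity` articles most likely to be false,
    highest predicted-falsehood score first."""
    return heapq.nlargest(capacity, articles, key=lambda a: a.falsehood_score)

queue = prioritize_for_fact_checking(
    [Article("a.example", 0.12),
     Article("b.example", 0.91),
     Article("c.example", 0.55)],
    capacity=2,
)
# With fact-checker capacity for two articles, b.example and c.example
# are reviewed first; a.example waits.
```

The point of such a scheme is triage: the model never decides truth on its own, it only orders the queue so scarce human review is spent where false news is most probable.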


There is no solution for fake news at present. The genie is out of the bottle, and it’s destroying our society from the inside.


Even that tiny thumbnail is a bit too much for fraudulent news articles. As a user, I should also have the option to universally hide all articles friends share from sources with unreliable accuracy. I’ll still follow the sources I trust, but I definitely don’t need to be reminded how naive/ignorant some of the people I went to school with are.

You know FB only shows you content from people you connect with or whose posts you interact with. If these people are truly that naive/ignorant, unfollow them and your feed will clear up on its own.

You know FB only shows you content from people you connect with or whose posts you interact with.

FB will reach pretty far into friend-of-a-friend territory. Plus, it ignores the context of interactions. My wife will see political crap from people she only interacts with as part of a dog rescue.

I tend to just hide news sources from appearing in my news feed. The issue is that more and more keep popping up. A universal option to hide content from sources with unreliable accuracy would be far more convenient, and much easier than unfollowing every naive/ignorant friend I may actually get along with but who isn’t smart enough to tell fraudulent articles from legitimate ones.

I think adding an individual setting would be a very good approach. It wouldn’t actually change anything unless it was opt in (which would get some people up in arms), but would be good PR.

I still can’t figure out why they don’t just put a bright red "Note: This link has been independently verified as fraudulent." message below these and leave it at that.

Why show them at all?

Cynical reason – dwell time on the page increases FB revenue, so you scrolling past things is beneficial to them.

Not-so-cynical reason – they are honouring freedom of speech in a way, and your right to consume what you wish, with a warning about believing it, so you can make an informed decision, even if you still go with what they think is fake.

UX and visual/attention prominence really play a big part in the spread of fake news. This is a welcome change, but there’s no saving Facebook from its slow death.

I see that they are trying to do the right thing, but what they should be doing is:

  1. Warn the person posting the link that it has already been flagged as inaccurate before it is sent, preferably with a link to a page that lists the fact-checker flagging it and the reasoning for why it is flagged. That way, unknowing people have the option to not continue spreading inaccuracies. Trolls won’t care, but we can’t do much about them anyway. If the link is flagged at a point in time after it was posted, send a message to the sender informing them of the flag. That way, reasonable people can reach out to their friends, say they have now been informed, and ask them to stop spreading the inaccuracies or even delete the original post altogether.
  2. Hide the links in the post with a bright red message to the reader: "This post contains one or more links that have been hidden because they have been flagged as leading to inaccurate material. The sender has been notified of this. If you want to see the link despite this warning, you can unhide the flagged links in the post." By letting the reader know this, they can start to see whether one of their friends is constantly spreading inaccuracies despite being informed that what they are spreading is probably not true. It will help the recipient unfriend or unfollow that person if they feel they don’t want to be fed inaccuracies. If one of my "friends" were to constantly spread lies knowingly, s/he wouldn’t be a friend of mine much longer.

If they make the inaccuracies harder to read, they already acknowledge that the content is troublesome and they know it. Go the whole way and say so, explicitly, to everyone.

People are free to say almost anything they want (that is, within the letter of the law), but the platforms should police themselves and flag, not delete, inaccuracies that are not illegal. Spreading lies is much easier than disproving them, and a platform that has too many inaccuracies spread through it will lose its value to regular people (aka non-trolls/non-extremists). It is in the interest of these platforms to police themselves.
