In its latest attempt to combat the sharing of propaganda, fake news and made-up stories, Facebook is going to start showing users more information about where their news is coming from. As part of a News Feed update, the social network will now provide more context around the links people see, giving users access to information from the publisher’s Wikipedia page, a link to follow that publisher’s Facebook Page, and other related links.
To that end, Facebook is testing a new button that will pull up more information about a story’s publisher and some context on the article. In an example, Facebook showed how tapping the button — which appears as a small “i” — on an Associated Press story about the recent eclipse pulls up a card explaining what The Associated Press is, related stories about the eclipse, and a heat map of where in the world the article is being shared, along with which of your friends have shared it.
The new feature will allow users to get information on the source of a news article with a single click without leaving Facebook and its news feed.
A Facebook blog post signed by product managers Andrew Anker, Sara Su and Jeff Smith reads, "We are testing a button that people can tap to easily access additional information without needing to go elsewhere." In some cases, if that information is unavailable, Facebook "will let people know, which can also be helpful context." The post adds, "Helping people access this important contextual information can help them evaluate if articles are from a publisher they trust, and if the story itself is credible."
Facebook will do this all automatically, which means humans won’t be compiling this additional information. The hope is that people will use the info to better understand where their news is coming from, and won’t be fooled by phony or ill-intentioned publishers.
The move is the latest by Facebook to stem the flow of fake news, hoaxes and disinformation after a series of revelations showing how unverified news went viral on social networks during the 2016 US election, in many cases as a result of Russian-led efforts. Meanwhile, Russia has threatened to ban Facebook in 2018 as a precaution ahead of its own national elections.
News of the “i” button came a day after US senators said they would ask executives from Facebook, Google and Twitter to testify at a November 1 hearing on Russian efforts to manipulate internet platforms during the election campaign, and on how the companies plan to curb misinformation and manipulation.
The button is just a test for now, so it’s not appearing for everyone. Facebook also says that “this is just the beginning of the test,” which seems to imply that it plans to expand the kind of information that pops up when you hit the little button. It would also help if Facebook refocused its product on being a network for people to tell their friends about moments from their lives, instead of an endless jumble of links in identical boxes.