Full Fact, a UK charity founded in 2010, has been employed by Facebook to help deal with the spread of fake news.
Fake news has become a part of our lives, and many people find it harder and harder to weed out, such is the sophistication of the ‘fake news’ makers.
A study recently published in the journal Science revealed that Facebook users aged over 65 shared more than seven times as much fake news content as users aged between 18 and 29.
Full Fact is a fact-checking service. Facebook said it would focus on misinformation that could damage people’s health or safety, or undermine democratic processes.
Full Fact will review stories, images and videos and rate them based on accuracy.
Facebook said it was working “continuously” to reduce the spread of misinformation.
Sarah Brown, training and news literacy manager at Facebook, said:
“People don’t want to see false news on Facebook, and nor do we. We’re delighted to be working with an organisation as reputable and respected as Full Fact to tackle this issue. By combining technology with the expertise of our fact-checking partners, we’re working continuously to reduce the spread of misinformation on our platform.”
Full Fact explained that users can flag up content they think may be false and its team will rate the stories as true, false or a mixture of accurate and inaccurate content.
It will only be checking images, videos and articles presented as fact-based reporting. Other content, such as satire and opinion, will be exempt.
If something is found to be fake, it will appear lower in the news feed but will not be deleted.
Full Fact says it will be tackling “everything from dangerous cancer ‘cures’ to false stories spreading after terrorist attacks, to fake content about voting processes ahead of elections.”
“This isn’t a magic pill. Fact-checking is slow, careful, pretty unglamorous work and realistically we know we can’t possibly review all the potentially false claims that appear on Facebook every day. But it is a step in the right direction.”
Facebook has come under scrutiny from politicians in both the UK and the US, accused of helping to spread misinformation: fake news that could affect the way people vote in elections or referendums.
The Brexit referendum and the 2017 general election were both found to have been tarnished by fake news, and social media firms have been threatened with regulation if they fail to do something about the issue.
The social network now works with fact-checkers in more than 20 countries.
In the UK, Full Fact will initially be the sole fact-checking partner. Will Moy, the charity’s director, welcomed Facebook’s decision, saying:
“Fact-checking can take hours, days or weeks, so nobody has time to properly check everything they see online. But it’s important somebody’s doing it because online misinformation, at its worst, can seriously damage people’s safety or health.”
“There’s no magic pill to instantly cure the problem, but this is a step in the right direction. It will let us give Facebook users the information they need to scrutinise false or misleading stories themselves and hopefully limit their spread – without stopping them sharing anything they want to.”
The BBC reported: “To illustrate the problem that fact-checking services and social networks face, the BBC has learned that a video went live last weekend falsely suggesting that smart meters emit radiation levels that are harmful to health. The original video had more than four million views in four days before being taken down.”
Sacha Deshmukh, chief executive at campaigners Smart Energy GB, said: “Smart Energy GB welcomes the news today from Facebook to take action on the phenomenon of misleading videos on its platform in the UK.
“But for this to work effectively, Facebook must guarantee that the speed of action will match the speed with which such misleading stories spread.
“Facebook provided an environment in which, over 72 hours, the video attracted more than four million views.
“This is clearly unacceptable and, moving forward, the challenge for Facebook is whether their new system will be nimble enough to swiftly fact-check misleading content and protect their users.”