Facebook is continuing its efforts to fight fake news and stop the spread of false information on the platform. Today, the company announced a number of new initiatives to reduce activity by bad actors on the site, from expanding its fact-checking operations to more countries to using machine learning to find foreign Pages that share hoaxes for financial gain.
“With more than a billion pieces of content posted every day, we know that fact-checkers can’t review every story one-by-one. So, we are looking into new ways to identify false news and take action on a bigger scale,” writes Facebook product manager Tessa Lyons on the Facebook news blog.
Facebook will extend its third-party fact-checking program to 14 more countries. Its fact-checking of photos and videos, which determines whether an image has been manipulated or edited to convey false information, is being extended to four new countries.
Facebook’s fact-checking partners will also start leveraging Schema.org’s ClaimReview, an open standard for marking up fact-check ratings in a machine-readable format.
“This will make it easier for fact-checkers to share ratings with Facebook and help us respond faster, especially in times of crisis,” writes Lyons.
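ClaimReview works by embedding structured data in a fact-checker’s article, so a platform can ingest the verdict automatically rather than parsing prose. As an illustration only (the URLs, organization name, and rating values below are hypothetical, not drawn from Facebook’s announcement), a minimal JSON-LD snippet might look like:

```json
{
  "@context": "https://schema.org",
  "@type": "ClaimReview",
  "url": "https://example-factchecker.org/checks/viral-photo",
  "claimReviewed": "A viral photo shows event X happening",
  "itemReviewed": {
    "@type": "Claim",
    "appearance": {
      "@type": "CreativeWork",
      "url": "https://example.com/original-post"
    }
  },
  "author": {
    "@type": "Organization",
    "name": "Example Fact Checker"
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": 1,
    "bestRating": 5,
    "worstRating": 1,
    "alternateName": "False"
  }
}
```

Because the rating, the claim, and where it appeared are all expressed as structured fields, a platform receiving this markup can act on a fact-check as soon as it is published, which is what makes the "respond faster" claim plausible.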
Until now, Facebook has used ratings from fact-checkers to identify Pages and domains that habitually share false news. The company says it is now beginning to use machine learning to find and demote foreign Pages that spread hoaxes to people in other countries specifically for financial gain, similar to the clickbait farms identified in Macedonia two years ago.
In addition to expanding its efforts to more countries and using machine learning more extensively, Facebook will launch a new website connected to the independent research commission it announced in April, which is focused on how social media impacts elections. It is also hiring staff and establishing procedures to make sure the initiative remains fully independent.
“We’re currently working with the commission to develop privacy-protected data sets, which will include a sample of links that people engage with on Facebook,” writes Lyons. “Over time, this externally-validated research will help keep us accountable and track our progress.”
Initially, Facebook CEO Mark Zuckerberg dismissed suggestions that the spread of fake news on the platform had influenced the 2016 elections. He has since reversed course, acknowledging the company was slow to respond to the threat of Russian interference and apologizing for its failure to stop coordinated misinformation campaigns during recent political events.
Since having its feet held to the fire after the Cambridge Analytica scandal, Facebook has gotten much more serious about the spread of false information on the platform, bulking up its efforts to fight fake news and be more transparent with users.