Facebook to tackle organised misinformation campaigns
Facebook has announced new steps to fight ‘information operations’, campaigns led by governments and organisations that are designed to sway public opinion
On April 27, social media giant Facebook released a landmark report on its fight against misinformation, in which it said governments and other organisations were leading well-funded and highly organised efforts to influence public opinion and make political gains. Facebook said such “information operations” encompassed both the circulation and promotion of “false news, disinformation, or networks of fake accounts aimed at manipulating public opinion”.
Alongside those findings, the company set out a two-part plan to crack down on the fake accounts used to drive such efforts. The first step is to identify them using machine learning techniques that, according to Reuters, are as sophisticated as the methods used by government intelligence agencies. Facebook’s security teams then review the flagged profiles and delete them where necessary.
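Facebook has not published the details of those models, but the broad shape of such a system, scoring accounts on behavioural signals and routing the highest-risk profiles to human reviewers, can be illustrated. The sketch below is a hypothetical Python example only: the features, training data, and review threshold are invented for illustration and bear no relation to Facebook’s actual detection pipeline.

```python
# Hypothetical sketch: a behavioural classifier for suspicious accounts.
# All features, data, and thresholds are illustrative assumptions,
# not Facebook's actual system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# One row per account:
# [account_age_days, friend_count, posts_per_day, duplicate_text_ratio]
X_train = np.array([
    [1200, 340,  0.8, 0.05],   # long-lived account, organic activity
    [2500, 810,  1.2, 0.02],
    [ 900, 150,  0.5, 0.10],
    [   3,  15, 45.0, 0.90],   # days-old account, mass duplicate posting
    [   7,  40, 60.0, 0.85],
    [   2,   5, 30.0, 0.95],
])
y_train = np.array([0, 0, 0, 1, 1, 1])  # 1 = likely fake/coordinated

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Score an unseen account. High scores would feed a human-review queue
# (the report's second step) rather than trigger automatic deletion.
candidate = np.array([[5, 20, 50.0, 0.88]])
score = model.predict_proba(candidate)[0, 1]
if score > 0.5:  # review threshold is an arbitrary assumption
    print(f"Flag for security-team review (score={score:.2f})")
```

In practice such systems draw on far richer signals, such as network structure and coordination patterns, but the two-stage design of automated scoring followed by human review mirrors the process the report describes.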
“While information operations have a long history, social media platforms can serve as a new tool of collection and dissemination for these activities”, Facebook said in the report. “Through the adept use of social media, information operators may attempt to distort public discourse, recruit supporters and financiers, or affect political or military outcomes.”
The scale of the problem is somewhat limited across the service as a whole: between September and December 2016, less than 0.1 percent of interactions on the site were part of such information operations, Facebook said. However, the problem becomes more pronounced where there is an unfolding political contest. For example, in a security note on April 12, Facebook announced it had recently cracked down on 30,000 accounts linked to information campaigns in France.
The new report matters to Facebook’s efforts to retain user confidence, which has been a particular headache since the 2016 US presidential election. The site’s algorithms and moderators act as a filter between everything posted to the site and what appears in users’ news feeds. Many came to see this filtering as a problem last year, arguing it produced an ‘echo chamber’ effect in which users encountered only news and opinions that aligned with their own views. Consequently, many called for Facebook to redefine itself as a ‘media service’ rather than just an internet company. The new report is a major step in the company’s efforts to avoid that redefinition.
“Societies will only be able to resist external information operations if all citizens have the necessary media literacy to distinguish true news from misinformation”, the report said, suggesting the company remains unwilling to shoulder the entire responsibility for moderating content. It has already announced plans to flag news stories it deems legitimate, and intends to invest in education initiatives such as the News Integrity Initiative to help people become more discerning online in future.