Combating Disinformation and Fake News: The Role of Sentiment Analysis

Mr Nikos Dimakopoulos

Senior Project Manager

Social media has revolutionized the way people communicate and share information. However, its rise has also brought an increase in disinformation and fake news (D&FN). By disseminating false or misleading information, D&FN can cause serious harm: it can sway public opinion, generate confusion, and weaken trust in established sources of information. Its influence on public opinion is evident. False information spreads swiftly on social media because individuals often accept what they see online and rarely make the effort to verify it [1]. According to a report by the Oxford Internet Institute [2], D&FN can dramatically influence public opinion and even election outcomes. One of the most visible examples of its detrimental impact occurred during the 2016 US Presidential election, when Russian operatives used social media platforms to spread false information about the candidates in an attempt to influence the result. The disinformation spread by these operatives very likely affected the outcome, and its impact may still be felt today, with many individuals continuing to believe in bogus conspiracy theories.

Sentiment analysis could be useful in addressing this issue: it is a technology that uses natural language processing and machine learning to assess the emotional tone of a text. This is particularly relevant to detecting D&FN, because false information is frequently crafted to elicit an emotional response from the reader. By assessing the sentiment of social media posts, sentiment analysis can uncover patterns in the language used by those propagating D&FN. For example, a message that is emotionally charged, with words such as “outrage,” “fear,” or “anger,” may be flagged as potentially containing disinformation. Beyond detection, sentiment analysis can also assess the effectiveness of initiatives aimed at preventing the spread of false information. For example, if a fact-checking article is published in response to a false claim, sentiment analysis can measure how the public’s emotional response to the claim changes after the publication of the fact-checking piece. In short, sentiment analysis is a valuable tool for combating D&FN, both by identifying patterns in the language of those spreading disinformation and by measuring the effectiveness of countermeasures. Developing effective strategies against D&FN is crucial to ensuring that the public has access to accurate and trustworthy information, especially in today’s increasingly polarized and divisive social and political environment.
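To make the two uses described above concrete, the following is a minimal, illustrative sketch in Python of a lexicon-based approach: a hand-made emotion lexicon scores each post, strongly negative posts are flagged as potentially emotionally manipulative, and the mean score before and after a fact-check is compared. The lexicon, weights, threshold, and sample posts are all invented for illustration; a production system (such as the FERMI Sentiment Analysis module) would rely on trained NLP models rather than a fixed word list.

```python
# Toy emotion lexicon: negative weights for emotionally charged words,
# positive weights for calm/reassuring language. Purely illustrative.
EMOTION_LEXICON = {
    "outrage": -0.9, "fear": -0.8, "anger": -0.8,
    "shocking": -0.7, "disaster": -0.6,
    "calm": 0.4, "reassuring": 0.5, "verified": 0.3,
}


def emotional_charge(text: str) -> float:
    """Average lexicon weight of the matched words; 0.0 if none match."""
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    hits = [EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0


def flag_post(text: str, threshold: float = -0.5) -> bool:
    """Flag a post whose average emotional charge is strongly negative."""
    return emotional_charge(text) <= threshold


def mean_charge(posts: list[str]) -> float:
    """Mean emotional charge over a collection of posts."""
    return sum(emotional_charge(p) for p in posts) / len(posts)


# Use 1: detection of an emotionally charged post.
post = "Outrage and fear spread after this shocking claim!"
print(flag_post(post))  # True

# Use 2: measuring an intervention by comparing sentiment
# before and after a fact-checking article is published.
before = ["Fear and anger everywhere!", "What a disaster, outrage!"]
after = ["A calm, reassuring and verified explanation emerged."]
print(mean_charge(before) < mean_charge(after))  # True
```

The design choice here is deliberate simplicity: a transparent word-weight lookup makes the detection logic easy to inspect, at the cost of missing context, negation, and sarcasm, which is why real deployments use machine-learned sentiment models instead.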

As the spread of D&FN is a growing problem that significantly impacts public opinion, the EU has been taking a multifaceted approach to combat it, which includes working with social media platforms, establishing a Rapid Alert System, investing in media literacy programs, and collaborating with fact-checking organizations. These efforts represent an important step in the fight against disinformation and its impact on public opinion. The EU has also funded several research and innovation projects to fight the spread of D&FN. One such innovation action is the FERMI project [GA No. 101073980], which aims to develop a framework to detect and monitor how D&FN spread, and to identify relevant security countermeasures for different locations and segments of society. ITML is the technical coordinator of the project, leading the integration of several technologies from the technology partners. ITML also leads the development of the Sentiment Analysis module of the FERMI platform, an important step towards addressing the impact of D&FN on public opinion.

To find out more about FERMI and its partners, follow us on Twitter, YouTube and Mastodon, and connect with us on LinkedIn.
This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement No. 101073980.

[1] Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094–1096. doi:10.1126/science.aao2998

[2] Oxford Internet Institute. Social media manipulation by political actors now an industrial scale problem prevalent in over 80 countries – annual Oxford report.