In today’s digital age, disinformation and fake news (D&FN) pose significant challenges. With the internet serving as a primary information source for millions, the accuracy of online content is often questionable. Social networks, news sites, and web platforms are essential for communication and learning; however, they can also disseminate propaganda disguised as credible information. This is particularly concerning in the context of religious extremism, violent radicalization, migration, and terrorism, where such content can mislead the public, undermine trust, and foster fear and division, potentially leading to violence and hate crimes. Nearly two-thirds of EU citizens read news online, and over half search for health information online, highlighting the broad impact of D&FN on society.
Governments and institutions recognize the severity of D&FN and have developed strategies to combat it. In particular, Law Enforcement Agencies (LEAs) have taken on the responsibility of monitoring social media and analyzing the impact of false information related to crime. Because false information spreads rapidly and many of its sources are anonymous, this is a challenging task that demands significant resources and expertise in media literacy, data analysis, and civil security. Differing legal and cultural contexts further complicate international cooperation against D&FN, underlining the need for comprehensive strategies.
The EU-funded FERMI project stands at the forefront of the fight against D&FN, employing a holistic, cross-disciplinary methodology to create a robust framework for analyzing and mitigating its impact. FERMI is designed to thoroughly analyze D&FN and their sources, taking into account the wide range of socioeconomic factors that influence both the dissemination of such incidents and their subsequent effects on society. By leveraging a range of innovative technological developments, FERMI aims to empower EU LEAs to detect and monitor the spread of disinformation, equipping them with the AI-powered tools and knowledge needed to combat D&FN. This proactive approach enables effective countermeasures tailored to specific needs related to political extremism, dangerous behaviors, and public health crises.
FERMI’s technologies will be validated through three key use cases:
Use Case 1: Addresses political interference by far-right extremists, utilizing AI tools to detect D&FN about migration issues and predict their impact on crime, helping authorities make informed decisions.
Use Case 2: Focuses on managing the spread of D&FN during health crises, specifically COVID-19, by analyzing the link between D&FN and social unrest to predict and mitigate risks of contagion and violence.
Use Case 3: Examines the economic impacts of left-wing extremism and related terrorist activities, incorporating police expertise into digital solutions to monitor extremist propaganda and its economic consequences.
Within the FERMI project, ITML leads the development of the Sentiment Analysis module, a crucial component for understanding the impact of D&FN on public opinion. ITML also serves as the project’s technical and innovation coordinator, overseeing the integration of the various technological outcomes into the use cases to be validated.
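To give a flavor of what sentiment analysis means in this context, the sketch below shows a minimal lexicon-based scorer. This is purely illustrative: the actual FERMI module is not public, and production systems typically rely on trained machine-learning models rather than hand-picked word lists. The word lists and the function name here are assumptions made for the example.

```python
# Minimal lexicon-based sentiment scorer -- an illustrative sketch only.
# The word lists below are tiny, hand-picked examples, not a real lexicon.
POSITIVE = {"trust", "safe", "credible", "accurate", "help"}
NEGATIVE = {"fear", "fake", "violence", "hate", "mislead"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]; below 0 suggests negative sentiment."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("Fake stories spread fear and hate"))  # -1.0
print(sentiment_score("Credible reporting builds trust"))    # 1.0
```

Aggregating such scores over many posts about a topic is one simple way to estimate how D&FN shifts public opinion over time, although real deployments would need far more robust models and multilingual support.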
This project has received funding from the European Union’s Horizon Europe Research and Innovation programme under grant agreement No 101073980.
Find more information about the project: https://fighting-fake-news.eu/