Fake news has reshaped political landscapes, particularly in democracies where public opinion plays a critical role in elections. Donald Trump’s return to power in the 2024 U.S. presidential election offers a pertinent example of how misinformation can be weaponized for political gain. Building on the tactics he first employed in 2016, Trump once again dominated public discourse by delivering sensational narratives and labeling critical or opposing media as “fake news.”
This strategy, while controversial, proved remarkably effective. Trump’s campaign capitalized on economic anxieties, particularly inflation, while spreading unverified claims about his opponent’s policies and past actions. By framing himself as the champion of “truth” against a corrupt establishment, he energized his voter base and manipulated media outlets into amplifying his narratives, often without adequate fact-checking.
His ability to provoke emotional reactions, from indignation to passionate support, demonstrated a deep understanding of the modern digital information landscape, where engagement often matters more than accuracy.
Trump’s 2024 victory shows the dangers of disinformation in modern democracies. It illustrates how strategically deployed misinformation can divide societies, weaken trust in institutions, and shape political outcomes. As misinformation continues to evolve, understanding and combating its influence is essential to safeguarding democracy and ensuring informed public discourse.
The fight against misinformation relies on three key pillars: First, we must understand its causes and effects, using education to help people recognize and counter misleading information. Next, implementing fact-checking tools is essential for verifying the accuracy of information and stopping the spread of false claims. Finally, supporting independent journalism guarantees access to reliable news and diverse perspectives, which are vital for maintaining an informed society.
What is fake news, and why can it mislead us?
Fake news refers to false and sensationalist information presented as legitimate journalistic reporting. It targets people’s fears and uncertainties, making it easy to believe and spread. When faced with unknown or frightening topics, individuals are more likely to fall into the trap of misinformation or conspiracy theories.
This issue is not new; even in 1807, Thomas Jefferson warned about the corruption of truth in the press, highlighting that doubts about media reliability have existed for centuries. In today’s digital age, the internet has amplified this challenge, creating an endless stream of information that is not always accurate or meaningful.
The Iraq chemical weapons lie: a case of state-sponsored misinformation
In 2003, Colin Powell presented false evidence to the UN Security Council to justify U.S. intervention in Iraq. “Look at the image on the left; the two arrows supposedly prove that these bunkers store chemical weapons.” This was a state lie used to justify the United States’ entry into war; six weeks later, the U.S. and its allies launched an offensive against Saddam Hussein’s regime.
The George Washington University: Chemical Munitions Stored at Taji
The social media paradox: connection vs. misinformation
“With Facebook, share and stay connected with your loved ones.”
Social media were created to connect people around the world and let them share ideas easily. It seemed as though everyone had the chance to speak and learn from others. Today, platforms like Facebook, Twitter, and TikTok still thrive on freedom of expression, allowing anyone to publish content.
However, this also has a negative impact on public opinion, as online news stories often play on emotions. People are tempted to share them quickly, spreading them around the world. The accessibility these platforms give us has thus increased distrust toward traditional media and fueled the success of disinformation sites. Social media and advanced technologies make it increasingly difficult to detect false information, especially as the forms of misinformation evolve over time.
To counter this, citizens must adopt critical thinking and fact-checking habits to identify reliable information and resist manipulative narratives designed to exploit emotional and cognitive biases.
The business of fake news: profits and power
Social media platforms operate on algorithms designed to maximize user engagement. These algorithms analyze user behavior, preferences, and interactions to deliver content that keeps individuals on the platform longer. The more users engage, the more data is collected, creating a feedback loop that allows platforms to improve their targeting capabilities.
This data-driven approach is the foundation of the advertising model. By learning users’ habits and preferences, platforms can sell highly targeted ad placements to businesses and political groups. Advertisers are willing to pay more for these personalized ads because they are more likely to reach the right people and achieve their goals.
Fake news often plays into this system because of its ability to attract attention. Controversial content generates higher engagement rates: more clicks, shares, and views. In turn, social media algorithms amplify this type of content because it keeps users interacting with the platform. This cycle makes the spread of misinformation not only possible but profitable, for both content creators and the platforms hosting it.
All in all, this model maximizes profits and power by monetizing user attention and personal data, making misinformation a lucrative tool for revenue and influence.
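To make the mechanism described above concrete, here is a minimal, purely illustrative Python sketch of engagement-based ranking. The post fields and weights are invented for illustration; real platform ranking systems are far more complex and proprietary. The key point it demonstrates is that accuracy plays no role in the score, so emotionally charged content rises to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    # Hypothetical fields; real platforms track many more signals.
    title: str
    clicks: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Illustrative weights: shares and comments count more than clicks,
    # since they keep users on the platform and spread content further.
    # Note that truthfulness appears nowhere in this formula.
    return post.clicks * 1.0 + post.comments * 3.0 + post.shares * 5.0

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by engagement: accuracy plays no role in the ranking.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Balanced policy analysis", clicks=120, shares=4, comments=10),
    Post("Outrageous unverified claim", clicks=90, shares=60, comments=40),
])
print([p.title for p in feed])
# The sensational post wins despite fewer clicks, because it is shared
# and commented on far more.
```

Even in this toy model, the better-documented article loses to the sensational one, which is exactly the feedback loop that makes misinformation profitable.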
Impact on our democracies
Disinformation is a serious danger to democracy because it undermines the freedoms that define these systems.
Hidden identities and no responsibility
Digital platforms allow people to stay anonymous, letting them spread false information without taking responsibility for their actions.
Echo chambers and filter bubbles
Anonymity encourages group behavior in which individuals lose their sense of self and adopt collective ideologies, often strengthening echo chambers. This is often referred to as a “filter bubble.” It creates a “state of intellectual isolation,” driven by algorithms that consistently suggest content aligning with the user’s beliefs and opinions. As a result, users come to believe that everyone agrees with them, while being kept away from differing opinions.
“It is not the essence of a democracy to be exposed only to the opinions one already holds.”
- Julia Jäckle, student at the Graduate Institute Geneva, author of a study on social media and democracies.
Shaping identities and legitimizing falsehoods
Erving Goffman’s concept of “self-presentation” explains how individuals actively shape and perform identities to fit social contexts and meet group expectations. In the case of online misinformation, when people shape their online personas to match their group’s beliefs, they risk legitimizing false information within their social circles.
In the digital world, the absence of face-to-face interaction allows people to remain behind the scenes, making it easier for them to express thoughts or emotions they might not share in person. This dynamic also allows them to express feelings that may not match reality, further blurring the lines between authentic emotions and the carefully crafted images they present of themselves.
In the end, why is disinformation so attractive to consumers like you and me?
All in all, disinformation often uses emotions to grab attention. Studies show that false information spreads faster than the truth because sensational content appeals to people’s interests and emotions. As a result, disinformation blurs reality and undermines the foundations of democracy by polarizing societies, encouraging extremism, and eroding trust in factual discourse.
How to recognize and fight against fake news as a citizen?
Pre-bunking to build immunity
An effective strategy to combat misinformation is pre-bunking, often referred to as a “vaccine” against fake news. This method involves exposing individuals to small doses of false information, accompanied by explanations of common manipulation techniques, to strengthen their ability to resist them.
Derived from the psychological inoculation theory developed by William McGuire during the Cold War, pre-bunking aims to prepare citizens to critically analyze the information they receive.
Before the Indonesian elections in February 2024, Jigsaw and Google launched a pre-bunking campaign in collaboration with popular influencers. This initiative focused on three common disinformation tactics: using emotions to mislead, taking information out of context, and making false accusations, which experts found were common during elections. The campaign used short, impactful videos on platforms like YouTube, TikTok, and Instagram to educate users on how misinformation exploits emotions to distort perceptions. Reaching 57.6 million users, including over half of voters aged 18 to 34, this proactive approach demonstrated its effectiveness by improving participants’ ability to identify false information by approximately 5%.
The pre-bunking method is not without criticism. Some argue that exposing people to small amounts of misinformation repeatedly might cause unintended effects, like making them more doubtful of accurate information online. Even so, the purpose of pre-bunking is to build mental resilience, helping people spot manipulation tactics and think critically about the messages they see.
Fighting false information with simple fact-checking reflexes
Understanding the Source:
Always ask, “Who is speaking?” Identify the individual or organization behind a statement and consider their motives. If the source is anonymous, be extra cautious, as it becomes harder to verify credibility.
Similarly, “Who is filming?” The perspective of the person behind the camera can influence how events are portrayed.
Always question, “Who published the information?” Check whether the source is reliable and explore the “About” section of websites for additional context. Make sure the website is authentic and not a fake copy of a trusted news site. On social media, remember that verification badges can now be purchased and don’t always guarantee reliability.
When content cites a journalist, ask, “Who is this journalist?” A quick search can reveal their credentials, expertise, and past work, providing insight into the reliability of the information.
Evaluating the Content:
Start by asking, “What is in this article?” Read the entire piece rather than relying solely on headlines or excerpts, which can sometimes be misleading.
Ask yourself, “What does the site look like?” The professionalism of a website can often reflect its credibility. Also, verify if the story has been covered by other reputable media to confirm its accuracy.
Getting the Context:
Pay attention to details like, “When was the information published?” The publication date is crucial to ensure the content is current and relevant.
For videos, ask, “Where was this created?” Understanding the location and context can help interpret the situation accurately.
If you still have doubts: use simple fact-checking tools
Reverse Image Search
To check if a photo has been used in a different context or time, you can use tools like TinEye or Google Images. Simply upload the image or copy and paste its URL into the tool. It will display where the image has appeared online, allowing you to verify the dates and websites. For example, a photo of a protest might actually be from years ago and a completely different country.
Google verification on your phone
Short tutorials available here:
Google reverse image: https://www.youtube.com/watch?v=p5e9wTdAulA
TinEye (In French): https://www.youtube.com/watch?v=dKQZEXdyVjk
Video Verification:
To find the original source of a video, use the YouTube Data Viewer by Amnesty International. Copy the video’s YouTube link and paste it into the tool. It will display key information, such as the original upload dates and metadata, which you can cross-check with other sources. If the video isn’t on YouTube, search for relevant keywords from its title or content. This process helps verify the video’s authenticity.
Amnesty International YouTube DataViewer
Tutorial available here: https://www.youtube.com/watch?v=KGqEc1Bgzio
Dedicated Platforms:
You can check if information online is true using tools like Hoaxy or Decodex. With Hoaxy, you just type a keyword or the title of the information to see how it spreads online and if it comes from trustworthy sources. Decodex lets you check websites or claims to see if they are reliable or questionable. For example, if someone posts an interesting statistic on social media, you can use Decodex to check if it’s true before sharing it.
Also, services like AFP Factuel, the fact-checking team of Agence France Presse (AFP), share verified articles, especially about popular topics.
Short tutorials available here:
Décodex (In French): https://www.dailymotion.com/video/x5abn8h
Hoaxy available: https://www.youtube.com/watch?v=YoO7DDMvhUQ
Combating Misinformation and Strengthening Democracy: Exploring Switzerland’s Future Initiatives
National governments and the European Union have been actively addressing the critical issue of misinformation and its impact on democracy, often through legislative measures. One such measure consists of supporting independent media to guarantee a broad representation of opinions.
Strong public media are vital for democracy because they provide diverse perspectives, reflecting different cultures and opinions, which is essential for an informed society. Their independence protects them against commercial or political pressure, maintaining high-quality, unbiased reporting.
As disinformation spreads on social media, public media are a trusted source of truth, fighting against manipulation and echo chambers. This is especially important in Switzerland, where people vote directly on political decisions. Independent and well-funded media help ensure citizens can make informed choices. If public media are weakened, the quality and variety of information may suffer, putting democracy at risk.
In Switzerland, for instance, the UDC party has introduced the “200 Francs is enough” initiative, which seeks to reduce the annual radio and TV license fee from 335 to 200 francs. This fee is essential for public media to survive. The initiative has sparked significant debate over the financial and cultural implications of such a reduction.
Proponents argue that it would reduce the financial pressure on households and businesses, particularly small and medium-sized enterprises (SMEs), while opponents fear it would weaken public media, reduce content diversity, and negatively impact linguistic minorities.
In response, the Federal Council has proposed a counter-project, suggesting a gradual reduction of the fee to 300 francs by 2029 while attempting to limit the financial impact on SRG SSR. However, this compromise raises concerns among media actors.
They warn that such reductions could compromise the quality and independence of public media, weakening their essential role in Switzerland’s direct democracy.
This debate highlights the challenge of balancing financial responsibility with the need to protect reliable information, a fundamental pillar of democratic strength in the digital era.
The power of social media is unprecedented in the history of capitalism.
In conclusion, we should not underestimate the impact of false information. Its simplicity can make it attractive, causing us to see problems in clear-cut black-and-white terms. But the truth is often found in the many shades of gray we tend to overlook. Understanding these details is key to seeing the whole picture and fighting misinformation effectively.
Fake news represents a profound challenge to democratic societies, damaging trust in institutions, polarizing public opinion, and even influencing economic markets. From manipulated images in political propaganda to algorithm-driven misinformation campaigns targeting businesses, disinformation exploits the speed and scale of digital platforms for profit and power. Both private industry and the public sector are affected by this issue and need to work together.
As citizens of our democracy, we also must stay rational and adopt a different approach to recognizing and combating fake news.
First, understanding the roots of disinformation is crucial: this involves examining its causes and impacts, particularly the role of artificial intelligence and algorithms in amplifying misleading narratives.
Second, pre-bunking and fact-checking tools are essential for equipping individuals with the skills to identify and verify information. Initiatives that teach citizens to analyze sources, context, and motives behind content help build resilience against manipulation.
Finally, supporting independent journalism ensures the availability of reliable information and diverse perspectives.
Every click, share, and comment shapes the digital ecosystem. Be aware that you, as a citizen, can do something to contribute to a well-informed world.
Samantha Cotter
Sources:
Goffman, E. (1959). La mise en scène de soi : La présentation de soi dans la vie quotidienne. Paris : Les Éditions de Minuit. (Titre original : The Presentation of Self in Everyday Life).
Comment les réseaux sociaux menacent nos démocraties ? Géopolitis 20 janvier 2019
CTRL+Z (vidéo) – Le pré-bunking, autre remède contre les fakes news
Le pré-bunking sur la désinformation électorale en Indonésie
Psychological inoculation improves resilience against misinformation on social media
From Thomas Jefferson to John Norvell, 11 June 1807
Google verification on your phone
Tutorial of Google reverse image
Amnesty International YouTube DataViewer