In an era of information overload, distinguishing between truth and falsehood is becoming increasingly difficult. The digital age has ushered in a wave of disinformation campaigns, employing subtle and sophisticated manipulation techniques to shape public perception and erode trust in democratic institutions. Unlike misinformation, which may stem from honest mistakes, disinformation is intentional, designed to deceive, provoke, and manipulate (Wardle & Derakhshan, 2017).
Those behind disinformation campaigns use various covert strategies aimed at undermining democracy. In this article, we dissect some of these nuanced and sometimes hidden tactics, some of which exploit our very beliefs, values, concerns, and personal data on social media.
Common techniques
Before delving into the more subtle disinformation strategies, it is important to briefly touch on the more common forms of disinformation that the average reader may have heard about. Some of the most prevalent tactics as outlined by the United Nations High Commissioner for Refugees (UNHCR) include:
- Fabricated Content – Completely false information designed to mislead.
- Manipulated Content – Genuine images or text altered to fit a misleading narrative.
- Imposter Content – Fake sources impersonating reputable institutions.
- Misleading Content – Facts presented deceptively to influence perception.
- False Context – Accurate information deliberately placed in an incorrect context.
- Satire and Parody – Humorous but misleading content, often mistaken for real news.
- False Connections – Mismatched headlines, visuals, or captions that distort meaning.
- Sponsored Content – Advertising disguised as editorial content to push an agenda.
- Propaganda – Information engineered to control attitudes, values, and beliefs.
- Reporting Errors – Genuine mistakes by news agencies that can be exploited.
The rise of Artificial Intelligence (AI) has further enabled disinformation through synthetic media—AI-generated or manipulated content designed to deceive (Chesney & Citron, 2019). One of the most notorious examples is the deepfake, in which AI is used to fabricate realistic-looking videos or audio clips, often for nefarious purposes such as political manipulation, financial fraud, or character assassination. Similarly, speech synthesis can create artificial voices that mimic real individuals, making fake statements appear authentic.
Social Media Manipulation: Deceptive Digital Strategies
According to the UNHCR, social media platforms have also become a battleground for disinformation operatives who exploit human psychology and platform algorithms to spread false narratives. Common deceptive techniques include:
- Sockpuppetry – Fake online identities used to manipulate public opinion or circumvent bans.
- Sealioning – Persistently questioning individuals in bad faith to derail discussions.
- Astroturfing – Orchestrated campaigns that disguise sponsors as grassroots supporters.
- Catfishing – Deceptive online identities used for scams, fraud, or emotional manipulation.
These tactics make it increasingly difficult for the public to discern authentic discourse from orchestrated deception, thereby distorting democratic debates and silencing dissenting voices (Benkler, Faris, & Roberts, 2018).
The hidden manipulation tactics
Apart from the more common strategies outlined above, there are subtler, more nuanced methods by which nations, groups, organisations, and individuals churn out disinformation campaigns. It is these hidden tactics that the Centre for Intelligence and Strategic Analysis (CISA – Ghana) wishes to draw attention to.
The Emotional Manipulation Trap
One of the most effective ways disinformation spreads is by targeting human emotions. Content designed to evoke fear, anger, or excitement is more likely to go viral, as people react impulsively rather than critically analysing the information. Delphine Colard, a Strategic Communication Expert at the European Parliament, noted in the documentary series “How Disinformation Works” that disinformation spreaders “push content on you that raises your fear, anger, or disgust in order to provoke a reaction without thinking.”
This emotional manipulation is particularly dangerous as it encourages rapid sharing, amplifying falsehoods and deepening social divisions. Strong language, sensational headlines, and dramatic visuals serve as triggers, making individuals less likely to verify the information before spreading it further.
Polarisation: Exaggerating Differences to Create Division
Polarisation is another insidious strategy used to disrupt democratic discourse. It involves emphasising the most extreme viewpoints while suppressing moderate perspectives, thereby making it seem as though society is irreparably divided. This technique erodes the possibility of compromise and fosters hostility between different social and political groups.
Sarah Anhborg, another European Parliament expert, warned in the same documentary series that artificial polarisation can lead to real-world divisions, where opposing sides demonise each other instead of engaging in constructive dialogue. The ultimate goal is to paralyse democratic decision-making and prevent societies from reaching consensus on important issues.
Flooding: Overloading the Information Space
Flooding is a deliberate tactic aimed at drowning out factual discourse by bombarding the information space with conflicting or irrelevant messages. This approach is particularly evident during crises, elections, or conflicts, where disinformation actors flood social media with contradictory narratives to sow confusion and distrust.
As Colard explains, “The real goal is not to convince us about one particular story but to confuse us,” making people question the very notion of objective truth. This erosion of trust in the media and institutions leads to public apathy, where citizens disengage from democratic participation altogether.
The Bias Trap: Exploiting Preconceived Notions
Disinformation is most effective when it aligns with pre-existing beliefs. Disinformation campaigns leverage algorithms and data analytics to push false narratives tailored to specific audiences. When people encounter information that confirms their biases, they are less likely to scrutinise it critically, making them more susceptible to manipulation (Benkler et al., 2018).
Miscontextualisation: Twisting Truths to Mislead
Another subtle form of manipulation involves presenting real information out of context. A genuine photograph may be repurposed to depict an event it was never related to, or a speech may be edited to completely change its original meaning. By blurring the line between fact and fiction, this strategy can deceive even the most discerning individuals (Chesney & Citron, 2019).
Silencing Dissent: Targeting Critical Voices
Beyond spreading falsehoods, disinformation campaigns also actively work to suppress truth-tellers. Journalists, activists, and political dissidents are often subjected to harassment, cyberattacks, and character assassination to deter them from speaking out. Some tactics include:
- Flooding social media with negative comments to intimidate individuals.
- Using AI-generated deepfakes to discredit opponents.
- Hacking personal accounts and leaking sensitive information.
The intended effect is self-censorship, where individuals withdraw from public discourse out of fear, thereby weakening democratic debate and decision-making.
Protecting Democracy from Disinformation
Given the increasing sophistication of disinformation tactics, vigilance is crucial. Here are steps individuals can take to safeguard against manipulation:
- Verify Before Sharing – Fact-check information from multiple reputable sources.
- Be Wary of Emotional Triggers – Question highly emotional or sensational content.
- Recognise Manipulation Techniques – Educate yourself on common disinformation strategies.
- Support Free Press and Critical Voices – Defend journalists and activists against silencing tactics.
- Demand Accountability from Tech Companies – Advocate for stronger regulations to curb the spread of disinformation on social media.
Conclusion
Disinformation campaigns are not just about spreading lies—they are about manipulating public perception, eroding trust in institutions, and undermining democracy. The ability to recognise and resist these subtle manipulation techniques is essential to preserving a well-informed and engaged citizenry. By staying vigilant and critically assessing the information we consume, we can help protect democratic integrity from the growing threat of disinformation. It is also vital that governments, organisations, and individuals implement the recommendations made by security experts at the close of CISA’s high-level international conference, held at the Lancaster Hotel in Accra on Thursday, 7 November and Friday, 8 November 2024, on the theme “New Paradigms for Ensuring Peace and Security in Africa – The need for closer collaboration with non-governmental security and intelligence organisations.”
Those recommendations encapsulate a multi-faceted approach to tackling mis/disinformation, as listed below:
- Government & Media Collaboration – Media Commissions should work with traditional media to ensure credible reporting and encourage fact-checking on social media.
- Social Media Partnerships – African governments must engage platforms like WhatsApp, Facebook, and X to curb fake news, especially during elections.
- Intelligence & Cybersecurity Initiatives – Agencies should create content and algorithms to counter misinformation on trending issues.
- Capacity-Building – Security and intelligence agencies need better tools and training to track and combat disinformation effectively.
- Personnel Accountability – Sanctions must be imposed on security personnel misusing social media to spread false information.
- Homegrown Solutions – Africa should develop its own digital strategies rather than rely on potentially compromised Western technologies.
- Legal Framework & Prosecution – A robust legal system must be in place to swiftly prosecute perpetrators of disinformation.
- Community Involvement – Traditional, religious, and opinion leaders should be engaged to help combat misinformation.
- Official Government Accounts – Verified government social media accounts should serve as sources of truth and public verification.
Disinformation has become part and parcel of geopolitics and geoeconomics. Superpowers leverage it for purposes that serve their interests, and in certain cases it has been weaponised by foreign powers against democracy in parts of the world where their interests have come under threat. It is therefore imperative for African countries to take CISA’s recommendations seriously, safeguarding their democracies against weaponised disinformation so as to avert a derailment of the modest progress chalked up over the past two decades.
References
Anhborg, S. (2024). Remarks in “How Disinformation Works” [documentary series]. European Parliament.
Benkler, Y., Faris, R., & Roberts, H. (2018). Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Oxford University Press.
Chesney, R., & Citron, D. (2019). “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security.” California Law Review.
Colard, D. (2024). Remarks in “How Disinformation Works” [documentary series]. European Parliament.
CISA – Ghana. (2024). High-Level International Conference on Peace and Security in Africa. Accra, Ghana.
UNHCR. (2023). Understanding Disinformation and its Impact on Democratic Institutions. United Nations High Commissioner for Refugees.
Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Council of Europe.