Generations of casual lying, PsyOps, propaganda and truth-bending have culminated in an era not just of information, but of normalized prevarication
In a previous article on fake news and misinformation/disinformation, readers here were challenged to read everything critically.
Cybercriminals, trolls, influential corporations and even governments can make use of eight common techniques to control narratives, spread propaganda, and even condition large groups of people to feel strongly about a cause or “belief” (a form of subliminal programming) without ever realizing it.
In this continuation of that article, let us explore other techniques used in social media, mass media and even everyday activities to “win friends and influence people” for various dishonest agendas. Note that these techniques may or may not be used consciously: the people who abuse them may simply be imitating what they have absorbed from years of exposure to such effective tactics, whether in arguments, in persuading prospects, or in handling their own children.
Consume information with caution
Have you read through arguments and heated debates on social media or in the office? Both sides have various ways to defend themselves, cast the other party in a bad light, or implore those on their side to back their story, factual or otherwise.
Similarly, everything we read and see in the (online/broadcast) news, social media groups, advertisements, documentaries and opinion pieces by “experts” is like a “call to action”: supplying a set of carefully crafted statistics, testimonials, claims, opinions and supposed studies to back the narrative and gain our support.
However, watch out for logical fallacies, fancy hyperbole, humor and sensationalism, and cavalier wordplay designed to catch you off guard. To guide you along, here is a short list of techniques (with examples) used to slowly win you over!
- Out-of-context quotes: Quoting someone’s words without context to change their meaning or support a different case. Example: A politician’s speech reduced to a misleading soundbite that opponents’ supporters exploit to plant lies that linger long after the fake news itself has faded.
- Cherry-picking data/supporting evidence: Using specific data points that support a particular narrative while ignoring contradictory data. Example: Citing a single study, or a select group of studies, to support a controversial theory while ignoring the broader consensus of multiple groups. In scientific and academic research, there have been notable cases of authorities abusing official data for political convenience and to control opposition. The 2003 invasion of Iraq, driven by monumental government errors, highlights how the abuse of data can become a weapon of mass destruction in the information age.
- Selective omission: Presenting only certain facts while omitting others — to create a biased narrative. Example: Highlighting negative aspects of an event without mentioning positive outcomes or mitigating factors.
- Use of disingenuous “experts”: Presenting biased experts to lend credibility to false claims. Example: Using experts with conflict-of-interest ties — to support an argument.
- Appeals to emotions: Using emotionally charged language or imagery to evoke strong reactions and overshadow rational analysis. Example: Viral videos designed to provoke anger or fear.
- Manipulating or intentionally misinterpreting published headlines: Quoting the headlines of published articles while intentionally misinterpreting the words, changing some of them, or concealing that the story is old and no longer up-to-date, in order to support an argument or spread misinformation. Example: Taking a sensational article published five years ago and posting it online today in 2024 without a disclaimer that it is an old/outdated story, or worse, adding some meaningless garble to make it support one’s argument or proposal. This is sometimes called “misleading amplification” or “headline manipulation”.
- Astroturfing: Creating fake grassroots movements or public support to give the impression of widespread agreement with government policies. Example: Organizing pro-government rallies or social media campaigns that appear to be initiated by ordinary citizens but are actually orchestrated by public agencies or related supporters. Note: Bodies in power, such as governments, renowned academic institutions and even international non-governmental organizations, have been implicated in astroturfing because of their access to resources such as scientists, experts, the mass media and people of influence.
- Echo chambers and filter bubbles: Leveraging algorithms and/or discussion group ground rules to create environments where users are only exposed to information that reinforces their beliefs (see the sketch after this list). Example: Social media groups that consistently share information that is cherry-picked or tweaked to support the group’s ideology, even when some of it is outdated or unverifiable.
- Planted trolls and harassers: Planting accomplices on adversaries’ turf to disrupt discussions, spreading mis/disinformation through harassment and inflammatory comments. Example: Coordinated troll attacks that derail conversations and cause internal conflict and noise.
- Whataboutism: Deflecting criticism by pointing to the flaws or misdeeds of others. Example: Responding to allegations of corruption by highlighting corruption in other administrations.
- False attribution: Attributing false statements or actions to credible sources. Example: Engaging “experts” aligned with one’s own causes to issue authoritative statements calculated to impress the opposition’s supporters.
- Abuse of the “conspiracy theorist” label: Callously accusing adversaries of being conspiracy theorists to discredit genuine concerns or opposition. Gaslighting protestors by labeling them conspiracy theorists ridicules and discredits them, prompting readers who lack critical-thinking skills to dismiss the protestors’ claims outright. This tactic not only undermines legitimate dissent but also bolsters the credibility of the liars when they employ similar gaslighting in the future. Example: Numerous well-known scenarios once dubbed conspiracy theories have turned out to be true, yet widespread (albeit eroded) trust in the mass media makes people shut out all reason and critical thinking the moment a group is branded as conspiracy theorists.
- Citing or creating clickbait content: Creating sensationalist content to generate clicks and ad revenue. Example: Articles with outrageous titles leading to low-quality content.
- Bot networks: The use of automated accounts to amplify false narratives. Example: Bot armies liking and sharing disinformation to increase visibility.
- Doxxing and personal attacks: Publishing an opponent’s or target’s guarded private information to intimidate or discredit them. Example: Releasing personal details of journalists to deter them from their work.
- Fabricated documents: Creating or altering documents to spread false information. Example: Forged letters purportedly from government officials. Astroturfing tactics can also leverage biased documentation created by legitimate authorities (i.e., the supporting documents are genuinely official testimonies, but harbor conflict-of-interest agendas and biases).
- False balance: Presenting fringe theories as though they were as credible as established facts, or, conversely, dismissing fringe theories even when they prove better at explaining a growing number of anomalies. Example: False balance can cut both ways: one side of an argument (usually about an outlier incident that defies traditional explanation) presents an overwhelming deluge of “established facts” faster than the opposition can come up with fringe theories or evidence that has not yet gained momentum. Yet the winning theory may belong to those on the fringes: heliocentrism, the Big Bang theory and plate tectonics were all fringe ideas at a time when scientists were failing to explain observed phenomena using the prevailing “facts” and knowledge.
- Manipulated search engine results: Using SEO techniques to ensure intentionally misleading information appears at the top of search results. Example: Optimizing misleading pages and content farms so that they outrank reputable sources for popular search terms.
- Temporal illusion: Manipulating perceptions of time to distort the understanding of events. Example: Reporting events out of chronological order and/or without attributing the time of occurrence, to create a false narrative, or to dig up old incidents as current events to incite fear/panic/emotions in the present day.
- Electorate manipulation and propaganda: Use (usually by political parties or governments) of a mix of astroturfing, cherry-picked facts, gaslighting, and abuse of the moral high ground to manipulate public perception. Example: Governments justifying controversial actions by framing them as necessary for national security, holding off dissent until key “national objectives” are achieved and only then relenting to protests.
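To make the echo chamber/filter bubble mechanism above more concrete, here is a minimal, hypothetical Python sketch of an engagement-driven feed. Every name in it (Post, rank_feed, the interest-update numbers) is invented for illustration and does not correspond to any real platform’s recommendation system; the point is simply that ranking purely on predicted agreement, and then learning from whatever was shown, tends to narrow the feed over time.

```python
# Hypothetical sketch only: not any real platform's algorithm.
from dataclasses import dataclass
import random

@dataclass
class Post:
    topic: str  # e.g. a stance, ideology or talking point
    text: str

def engagement_score(post: Post, user_interest: dict) -> float:
    """Predict engagement purely from how much the user already likes the topic."""
    return user_interest.get(post.topic, 0.0)

def rank_feed(posts: list, user_interest: dict, k: int = 3) -> list:
    """Show only the k posts the user is most likely to engage with."""
    return sorted(posts, key=lambda p: engagement_score(p, user_interest), reverse=True)[:k]

def simulate(days: int = 30) -> dict:
    random.seed(1)
    topics = ["stance_A", "stance_B", "neutral"]
    # The user starts out only mildly favoring stance_A.
    user_interest = {"stance_A": 0.4, "stance_B": 0.3, "neutral": 0.3}
    for _ in range(days):
        posts = [Post(random.choice(topics), "...") for _ in range(20)]
        for post in rank_feed(posts, user_interest):
            # Each shown post reinforces interest in its topic, which makes
            # similar posts rank even higher the next day.
            user_interest[post.topic] += 0.05
    return user_interest

if __name__ == "__main__":
    # After a month, stance_A dominates and opposing views rarely surface.
    print(simulate())
```

Running this with any recent Python 3 interpreter prints interest scores in which one stance steadily crowds out the others, which is the filter bubble effect described above.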
Recommended reading on misinformation and disinformation techniques
- “The Misinformation Age: How False Beliefs Spread” by Cailin O’Connor and James Owen Weatherall: Explores how misinformation spreads and why it is so pervasive, examining the social dynamics and network effects that make it difficult to combat.
- “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy” by Cathy O’Neil: Discusses how data algorithms and their misuse can create and propagate false narratives, impacting society significantly.
- “Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics” by Yochai Benkler, Robert Faris, and Hal Roberts: Provides an in-depth analysis of how misinformation and disinformation have infiltrated political discourse, focusing on the US context.
- “Information Wars: How We Lost the Global Battle Against Disinformation and What We Can Do About It” by Richard Stengel: Stengel shares his experience fighting disinformation during his tenure at the US State Department, offering insights into the global impact of false information.
- “Trust Me, I’m Lying: Confessions of a Media Manipulator” by Ryan Holiday: A firsthand account of how media can be manipulated, providing an insider’s perspective on the techniques used to spread misinformation.
- “Propaganda” by Edward Bernays: A classic text on the principles of propaganda and public relations, explaining how public opinion can be shaped and controlled.
- “The Shallows: What the Internet Is Doing to Our Brains” by Nicholas Carr: This book examines how the internet affects our cognitive abilities, including how we process and perceive information, contributing to the spread of misinformation.
- “So You’ve Been Publicly Shamed” by Jon Ronson: Jon Ronson explores the impact of public shaming in the internet age, touching on how misinformation can fuel online mob behavior.
- “Hate Crimes in Cyberspace” by Danielle Keats Citron: An examination of online harassment and how disinformation can be weaponized to attack individuals and groups.
- “The Attention Merchants: The Epic Scramble to Get Inside Our Heads” by Tim Wu: This book traces the history of advertising and media manipulation, revealing how attention is captured and exploited in the digital age.
- “The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think” by Eli Pariser: Explains how algorithms create echo chambers and filter bubbles, leading to a polarized and misinformed public.
- “Post-Truth” by Lee McIntyre: Discusses the rise of “post-truth” culture, where objective facts are less influential than appeals to emotion and personal beliefs.
- “Blur: How to Know What’s True in the Age of Information Overload” by Bill Kovach and Tom Rosenstiel: A guide to critical thinking and media literacy, helping readers discern truth from misinformation in a world of information overload.
- “The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters” by Tom Nichols: Examines the growing disdain for expertise and how it contributes to the spread of misinformation.
- “Deepfakes and the Infocalypse: What You Urgently Need to Know” by Nina Schick: An exploration of deepfake technology and its implications for misinformation and trust in media.