Generations of casual lying, PsyOps, propaganda and truth-bending have culminated in an era not just of information, but also of normalized prevarication.

However, watch out for logical fallacies, fancy hyperbole, humor/sensationalism and cavalier wordplay designed to catch you off guard. To guide you along, here is a short list of techniques (with examples) used to slowly win you over!

  1. Out-of-context quotes: Quoting someone’s words without context, to change the meaning or support a different case. Example: A politician’s speech reduced to a misleading soundbite that his opponents’ supporters exploit to create indelible lies that linger even after the fake news fades away.
  2. Cherry-picking data/supporting evidence: Using specific data points that support a particular narrative while ignoring contradictory data. Example: Citing a single study or group of studies to support a controversial theory while ignoring the broader consensus of multiple groups. In scientific and academic research, there have been notable examples of authorities abusing official data for political convenience and to control opposition. The 2003 invasion of Iraq, driven by monumental government errors, highlights how the abuse of data can become a weapon of mass destruction in the information age.
  3. Selective omission: Presenting only certain facts while omitting others — to create a biased narrative. Example: Highlighting negative aspects of an event without mentioning positive outcomes or mitigating factors.
  4. Use of disingenuous “experts”: Presenting biased experts to lend credibility to false claims. Example: Using experts with conflict-of-interest ties to support an argument.
  5. Appeals to emotions: Using emotionally charged language or imagery to evoke strong reactions and overshadow rational analysis. Example: Viral videos designed to provoke anger or fear.
  6. Manipulating or intentionally misinterpreting published headlines: Quoting the headlines of published articles but intentionally misinterpreting the words, or even changing some of the words, or not revealing that it is an old story that is no longer up-to-date, in order to support an argument or to spread misinformation. Example: Taking a sensational article published five years ago and posting it online today in 2024, without adding a disclaimer that it is an old/outdated story, or worse, adding some meaningless garble to make it support one’s argument or proposal. This is sometimes called “misleading amplification/headline manipulation”.
  7. Astroturfing: Creating fake grassroots movements or public support to give the impression of widespread agreement with government policies. Example: Organizing pro-government rallies or social media campaigns that appear to be initiated by ordinary citizens but are actually orchestrated by public agencies or related supporters. Note: Bodies in power, such as governments, renowned academic institutions and even international non-governmental organizations, have at times been implicated in astroturfing due to their access to resources such as scientists, experts, the mass media and people of influence.
  8. Echo chambers and filter bubbles: Leveraging algorithms and/or discussion group ground rules to create environments where users are only exposed to information that reinforces their beliefs (a toy illustration of the mechanism follows this list). Example: Social media groups that consistently share information that is cherry-picked or tweaked to support the group’s ideology, even when some of the information is outdated or unverifiable.
  9. Planted trolls and harassers: Planting accomplices on adversaries’ turf to disrupt discussions, spreading mis/disinformation through harassment and inflammatory comments. Example: Coordinated troll attacks that derail conversations and cause internal conflict and noise.
  10. Whataboutism: Deflecting criticism by pointing to the flaws or misdeeds of others. Example: Responding to allegations of corruption by highlighting corruption in other administrations.
  11. False attribution: Attributing false statements or actions to credible sources. Example: Engaging “experts” aligned with one’s own causes to issue authoritative statements calculated to impress the opposition’s supporters.
  12. Abuse of the “conspiracy theorist” label: Callously accusing adversaries of being conspiracy theorists to discredit genuine concerns or opposition. Additionally, gaslighting methods that label protestors as conspiracy theorists can ridicule and discredit them, causing readers who lack critical-thinking skills to dismiss the protestors’ claims. This tactic not only undermines legitimate dissent but also bolsters the credibility of the liars when they employ similar gaslighting in the future. Example: Numerous well-known scenarios dubbed conspiracy theories have turned out to be true, but the widespread (albeit eroded) trust in the mass media has made people shut out all reason and critical thinking the moment they see a group being branded as conspiracy theorists.
  13. Citing or creating clickbait content: Creating sensationalist content to generate clicks and ad revenue. Example: Articles with outrageous titles leading to low-quality content.
  14. Bot networks: The use of automated accounts to amplify false narratives. Example: Bot armies liking and sharing disinformation to increase visibility.
  15. Doxxing and personal attacks: Publishing the guarded private information of an opponent/quarry to intimidate or discredit individuals. Example: Releasing personal details of journalists to deter them from their work.
  16. Fabricated documents: Creating or altering documents to spread false information. Example: Forged letters purportedly from government officials. Astroturfing tactics can also leverage biased documentation created by legitimate authorities (i.e., the supporting documents are genuinely official testimonies, but harbor conflict-of-interest agendas and biases).
  17. False balance: Presenting fringe theories as if they were as credible as established facts. Conversely, dismissing fringe theories even when they prove viable in explaining growing numbers of anomalies. Example: False balance can arise when one side of an argument (usually around an outlier incident that defies traditional explanation) presents an overwhelming deluge of “established facts” faster than the opposition can put forward fringe theories or facts that have not yet gained momentum. Yet the winning theory could belong to those on the fringes: heliocentrism, the Big Bang theory and plate tectonics were all fringe ideas when scientists were failing to explain observed phenomena with their prevailing “facts” and knowledge.
  18. Manipulated search engine results: Using SEO techniques to ensure intentionally misleading information appears at the top of search results. Example: Using keyword stuffing and link farms to push misleading pages above authoritative sources in search rankings.
  19. Temporal illusion: Manipulating perceptions of time to distort the understanding of events. Example: Reporting events out of chronological order and/or without attributing the time of occurrence, to create a false narrative, or to dig up old incidents as current events to incite fear/panic/emotions in the present day.
  20. Electorate manipulation and propaganda: Usage (usually by political parties or governments) of a mix of astroturfing, cherry-picking of facts, gaslighting, and abuse of the moral high ground to manipulate public perception. Example: Governments justifying controversial actions by framing them as necessary for national security, delaying dissent until key “national objectives” are achieved before relenting to protests.
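To make technique 8 (echo chambers and filter bubbles) more concrete, here is a minimal, purely illustrative Python sketch of an engagement-driven recommender. Everything in it (the hypothetical `articles` pool, the `recommend` function and the single “stance” score per article) is a toy simplification, not any real platform’s algorithm; it only shows how repeatedly recommending whatever sits closest to a user’s past clicks can shrink the range of viewpoints that user sees.

```python
# Minimal, illustrative sketch only (hypothetical names and numbers throughout).
# Articles are reduced to a single "stance" score in [-1.0, 1.0]; the recommender
# keeps surfacing whatever sits closest to the user's past clicks.

import random

random.seed(42)

# A pool of 200 articles with stances spread across the whole spectrum.
articles = [round(random.uniform(-1.0, 1.0), 2) for _ in range(200)]

def recommend(history, pool, k=5):
    """Return the k articles whose stance is closest to the user's average exposure."""
    if not history:
        return random.sample(pool, k)          # cold start: a genuinely mixed feed
    centre = sum(history) / len(history)       # the user's current "bubble centre"
    return sorted(pool, key=lambda stance: abs(stance - centre))[:k]

history = []   # stances of the articles the user has clicked so far
bias = 0.3     # a mild initial leaning (hypothetical)

for step in range(10):
    feed = recommend(history, articles)
    anchor = history[-1] if history else bias
    clicked = min(feed, key=lambda s: abs(s - anchor))   # the user clicks the most familiar item
    history.append(clicked)
    spread = max(feed) - min(feed)                        # how diverse the feed still is
    print(f"step {step:2d}  feed spread = {spread:.2f}  clicked stance = {clicked:+.2f}")
```

Running this, the printed feed spread typically collapses from roughly the full spectrum on the cold-start step to a few hundredths thereafter; the same reinforcement loop, at platform scale and with real engagement signals, is what the echo-chamber and filter-bubble critique points at.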

Recommended reading on misinformation and disinformation techniques

  1. “The Misinformation Age: How False Beliefs Spread” by Cailin O’Connor and James Owen Weatherall: Explores how misinformation spreads and why it is so pervasive, examining the social dynamics and network effects that make it difficult to combat.
  2. “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy” by Cathy O’Neil: Discusses how data algorithms and their misuse can create and propagate false narratives, impacting society significantly.
  3. “Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics” by Yochai Benkler, Robert Faris, and Hal Roberts: Provides an in-depth analysis of how misinformation and disinformation have infiltrated political discourse, focusing on the US context.
  4. “Information Wars: How We Lost the Global Battle Against Disinformation and What We Can Do About It” by Richard Stengel: Stengel shares his experience fighting disinformation during his tenure at the US State Department, offering insights into the global impact of false information.
  5. “Trust Me, I’m Lying: Confessions of a Media Manipulator” by Ryan Holiday: A firsthand account of how media can be manipulated, providing an insider’s perspective on the techniques used to spread misinformation.
  6. “Propaganda” by Edward Bernays: A classic text on the principles of propaganda and public relations, explaining how public opinion can be shaped and controlled.
  7. “The Shallows: What the Internet Is Doing to Our Brains” by Nicholas Carr: This book examines how the internet affects our cognitive abilities, including how we process and perceive information, contributing to the spread of misinformation.
  8. “So You’ve Been Publicly Shamed” by Jon Ronson: Jon Ronson explores the impact of public shaming in the internet age, touching on how misinformation can fuel online mob behavior.
  9. “Hate Crimes in Cyberspace” by Danielle Keats Citron: An examination of online harassment and how disinformation can be weaponized to attack individuals and groups.
  10. “The Attention Merchants: The Epic Scramble to Get Inside Our Heads” by Tim Wu: This book traces the history of advertising and media manipulation, revealing how attention is captured and exploited in the digital age.
  11. “The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think” by Eli Pariser: Explains how algorithms create echo chambers and filter bubbles, leading to a polarized and misinformed public.
  12. “Post-Truth” by Lee McIntyre: Discusses the rise of “post-truth” culture, where objective facts are less influential than appeals to emotion and personal beliefs.
  13. “Blur: How to Know What’s True in the Age of Information Overload” by Bill Kovach and Tom Rosenstiel: A guide to critical thinking and media literacy, helping readers discern truth from misinformation in a world of information overload.
  14. “The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters” by Tom Nichols: Examines the growing disdain for expertise and how it contributes to the spread of misinformation.
  15. “Deepfakes and the Infocalypse: What You Urgently Need to Know” by Nina Schick: An exploration of deepfake technology and its implications for misinformation and trust in media.