The Tactics Of Disinformation

An amazing self-own from the US Government’s Cybersecurity and Infrastructure Security Agency (CISA)

In this essay I provide an overview of disinformation tactics summarized on the US Government’s Cybersecurity and Infrastructure Security Agency (CISA) website, CISA.gov. I have modified that document for use by individuals and organizations wishing to counter the mis-, dis-, and mal-information (MDM) constantly emanating from governments, corporations (such as big pharma), non-governmental organizations (such as the Gates Foundation), and astroturf organizations (such as the Center for Countering Digital Hate).

For more information on CISA, please see our prior Substack essay, “How a ‘Cybersecurity’ Agency Colluded with Big Tech and ‘Disinformation’ Partners to Censor Americans”: The Weaponization of CISA, an interim staff report for the Committee on the Judiciary and the Select Subcommittee.

In addition to these organizations, the information summarized below also applies to individuals spreading MDM, whether they are working for such agencies or acting on their own personal agendas.

All of the disinformation tactics described below are being used against the health sovereignty movement, the freedom (or liberty) movements, those pushing back against UN Agenda 2030, the approved climate agenda, the WEF, and the UN, and the movement resisting globalization (the NWO).

Tactics of Disinformation

Disinformation actors include governments, commercial and non-profit organizations as well as individuals. These actors use a variety of tactics to influence others, stir them to action, and cause harm.

Understanding these tactics can increase preparedness and promote resilience when faced with disinformation.

Disinformation actors use a variety of tactics and techniques to execute information operations and spread disinformation narratives for a variety of reasons. Some may even be well-intentioned but ultimately fail on ethical grounds.

Each of these tactics is designed to make disinformation actors’ messages more credible, or to manipulate their audience to a specific end. They often seek to polarize their target audience across contentious political or social divisions, making the audience more receptive to disinformation.

These methods can and have been weaponized by disinformation actors. By breaking down common tactics, sharing real-world examples, and providing concrete steps to counter these narratives with accurate information, the Tactics of Disinformation listed below are intended to help individuals and organizations understand and manage the risks posed by disinformation.

Any organization or its staff can be targeted by disinformation campaigns, and all organizations and individuals have a role to play in building a resilient information environment.

All of this is yet another aspect of the Fifth Generation Warfare (or propaganda/PsyWar) technologies, strategy, and tactics which are being routinely deployed on all of us by our governments, corporations, and various non-state actors.

Disinformation Tactics Overview

Cultivate Fake or Misleading Personas and Websites: Disinformation actors create networks of fake personas and websites to increase the believability of their message with their target audience. Fake expert networks use inauthentic credentials (e.g., fake “experts”, journalists, think tanks, or academic institutions) to lend undue credibility to their influence content and make it more believable.

Create Deepfakes and Synthetic Media: Synthetic media content may include photos, videos, and audio clips that have been digitally manipulated or entirely fabricated to mislead the viewer. Artificial intelligence (AI) tools can make synthetic content nearly indistinguishable from real life.

Synthetic media content may be deployed as part of disinformation campaigns to promote false information and manipulate audiences.

Devise or Amplify Conspiracy Theories: Conspiracy theories attempt to explain important events as secret plots by powerful actors. Conspiracy theories not only impact an individual’s understanding of a particular topic; they can shape and influence their entire worldview.

Disinformation actors capitalize on conspiracy theories by generating disinformation narratives that align with the conspiracy worldview, increasing the likelihood that the narrative will resonate with the target audience.

Astroturfing and Flooding the Information Environment: Disinformation campaigns will often post overwhelming amounts of content with the same or similar messaging from several inauthentic accounts.

This practice, known as astroturfing, creates the impression of widespread grassroots support or opposition to a message, while concealing its true origin. A similar tactic, flooding, involves spamming social media posts and comment sections with the intention of shaping a narrative or drowning out opposing viewpoints.

Abuse Alternative Platforms: Disinformation actors may abuse alternative social media platforms to intensify belief in a disinformation narrative among specific user groups. They may seek to take advantage of platforms with fewer user protections, less stringent content moderation policies, and fewer controls to detect and remove inauthentic content and accounts than mainstream social media platforms.

Exploit Information Gaps: Data voids, or information gaps, occur when there is insufficient credible information to satisfy a search inquiry. Disinformation actors can exploit these gaps by generating their own influence content and seeding the search term on social media to encourage people to look it up.

This increases the likelihood that audiences will encounter disinformation content without any accurate or authoritative search results to refute it.

Manipulate Unsuspecting Actors: Disinformation actors target prominent individuals and organizations to help amplify their narratives. Targets are often unaware that they are repeating a disinformation actor’s narrative or that the narrative is intended to manipulate.

Spread Targeted Content: Disinformation actors produce tailored influence content likely to resonate with a specific audience based on its worldview and interests. These actors gain insider status and grow an online following that can make future manipulation efforts more successful.

This tactic often takes a “long game” approach of spreading targeted content over time to build trust and credibility with the target audience.

Actions You Can Take

Although disinformation tactics are designed to deceive and manipulate, critically evaluating content and verifying information with credible sources before deciding to share it can increase resilience against disinformation and slow its spread.

  • Recognize the risk. Understand how disinformation actors leverage these tactics to push their agenda. Be wary of manipulative content that tries to divide.
  • Question the source. Critically evaluate content and its origin to determine whether it’s trustworthy. Research the author’s credentials, consider the outlet’s agenda, and verify the supporting facts.
  • Investigate the issue. Conduct a thorough, unbiased search into contentious issues by looking at what credible sources are saying and considering other perspectives. Rely on credible sources of information, such as government sites.
  • Think before you link. Slow down. Don’t immediately click to share content you see online. Check the facts first. Some of the most damaging disinformation spreads rapidly via shared posts that seek to elicit an emotional reaction that overpowers critical thinking.
  • Talk with your social circle. Engage in private, respectful conversations with friends and family when you see them sharing information that looks like disinformation. Be thoughtful about what you post on social media.

Cultivate Fake or Misleading Personas and Websites

Description: Disinformation actors create networks of fake personas and websites to increase the believability of their message with their target audience. Such networks may include fake academic or professional “experts,” journalists, think tanks, and/or academic institutions.

Some fake personas are even able to validate their social media accounts (for example, a blue or gray checkmark next to a username), further confusing audiences about their authenticity. Fake expert networks use inauthentic credentials to make their content more believable.

Disinformation actors also increase the credibility of these fake personas by generating falsified articles or research papers and sharing them online. Sometimes, these personas and their associated publications are intentionally amplified by other actors.

In some instances, these materials are also unwittingly shared by legitimate organizations and users. The creation or amplification of content from these fake personas makes it difficult for audiences to distinguish real experts from fake ones.

Adversaries have also demonstrated a “long game” approach with this tactic by building a following and credibility with seemingly innocuous content before switching their focus to creating and amplifying disinformation.

This lends a false credibility to campaigns.

Example: During the course of the COVIDcrisis, I have had a number of misleading personas and websites (including Substack authors) target me. It is extremely disturbing to see fragments of my CV, my life, and my peer-reviewed papers dissected, re-configured, and even modified to target me, evidently because some government and/or organization perceives my ideas to be dangerous.

There is one person on Twitter with almost 100,000 followers who has literally posted thousands of posts targeting me. At one point, Jill took screenshots of all of these posts and placed them in a summary document.

She gave up the project at around page 1,500; the file was too big to handle easily. Every day for years, this disinformation actor has posted two to three hit pieces on me, mixed with other content. He has used every single one of the tactics outlined above.

Clearly, he is being paid by an organization or government. He self-defines as an independent journalist, and has a long and well documented history of cyberstalking and spreading falsehoods. Many of his followers do not question his authenticity.

One of the followers recently even attacked me for having worked on development of the Remdesivir vaccine! These posts get passed around as authentic information, and there is nothing I can do.

These fabricated information fragments then get spread as if they were true, and other influencers report on these posts as if they were real.

The cycle goes around and around. The end result is not only intentional damage to myself and my reputation, but also that the whole resistance movement (whatever that is) gets delegitimized.

Which is a “win” for those chaos agents who are pushing this disinformation.

This is taken from a longer document. Read the rest at substack.com

Header image: Security Info Watch

Please Donate Below To Support Our Ongoing Work To Defend The Scientific Method

PRINCIPIA SCIENTIFIC INTERNATIONAL, legally registered in the UK as a company incorporated for charitable purposes. Head Office: 27 Old Gloucester Street, London WC1N 3AX. 


Comments (1)

  • KittyJ: “Rely on credible sources of information, such as government sites.” Uh, I don’t think so necessarily. Government resources can run afoul of the truth as easily as any other.
