How Disinformation Bots are Born
Ryan McBeth
17 min, 43 sec
An in-depth analysis of a Twitter account suspected of being a Russian disinformation bot, detailing the process of bot creation and the disinformation kill chain.
Summary
- Ryan McBeth investigates a Twitter profile suspected to be a Russian bot and uncovers a sophisticated disinformation operation.
- The process of bot creation involves three phases: crafting, connecting, and deceiving, which aim to establish credibility, attract followers, and manipulate public opinion.
- The investigation includes technical data analysis, pattern of life analysis, and examination of the account's tweets to determine its authenticity.
- The account exhibits behavior characteristic of disinformation bots, such as dormant periods, strategic tweet timing, and inconsistencies in language and content.
Chapter 1
Ryan McBeth introduces himself and sets the stage for the investigation of a suspected Russian bot account on Twitter.
- Ryan McBeth, a retired anti-armor infantryman with a background in computer science and cybersecurity, investigates the Twitter account @capCoronado.
- The account is suspected of being a disinformation bot, and McBeth details his qualifications and expertise in tracking such entities.
Chapter 2
McBeth explains the concept of bots, the types of disinformation spreaders, and outlines the disinformation kill chain.
- Bots can be automated accounts or humans directed to follow a script; the latter are sometimes referred to as trolls.
- Disinformation spreaders also include 'sopus,' users who spread disinformation unknowingly or to support a cause.
- The disinformation kill chain has three phases: seeding, harvesting, and amplification, each serving a specific purpose in spreading disinformation.
Chapter 3
McBeth narrates the birth of a seeding bot and its three-phase strategy: crafting, connecting, and deceiving.
- McBeth discovers the bot's three-phase strategy to establish itself on Twitter and spread disinformation.
- The crafting phase involves creating the account, leaving it dormant, and posting initial tweets to build credibility.
- In the connecting phase, the account builds connections within its target community.
- In the deceiving phase, the account may be sold and repurposed to push disinformation.
Chapter 4
McBeth conducts a detailed analysis of the account's tweet history and identifies patterns indicative of a disinformation bot.
- The account exhibits bursts of high tweet frequency followed by dormant periods, with most tweets occurring on Mondays.
- A pattern of life analysis suggests the bot is dormant during times when the target audience is likely asleep.
- The account's content includes inconsistencies and characteristics that suggest it is not operated by a native English speaker.
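
The pattern of life analysis McBeth describes is essentially timestamp bucketing: counting tweets by weekday and hour and looking for long dormant gaps. A minimal sketch of that idea, assuming a hypothetical list of ISO-8601 tweet timestamps rather than McBeth's actual data or tooling, might look like this:

```python
from collections import Counter
from datetime import datetime, timezone

# Hypothetical sample: timestamps exported from a tweet archive or API scrape.
timestamps = [
    "2022-03-07T09:15:00+00:00",
    "2022-03-07T09:42:00+00:00",
    "2022-03-14T10:05:00+00:00",
    "2022-06-20T08:30:00+00:00",
    "2023-01-09T11:20:00+00:00",
]

parsed = sorted(datetime.fromisoformat(ts).astimezone(timezone.utc) for ts in timestamps)

# Day-of-week distribution: a heavy skew toward one weekday (e.g. Monday)
# can indicate scheduled or shift-based posting rather than organic use.
by_weekday = Counter(dt.strftime("%A") for dt in parsed)

# Hour-of-day distribution: a consistent "quiet window" hints at the time
# zone in which the operator actually sleeps.
by_hour = Counter(dt.hour for dt in parsed)

# Dormancy: long gaps between consecutive tweets suggest an account that
# was parked after creation and reactivated later.
gaps = [(later - earlier).days for earlier, later in zip(parsed, parsed[1:])]
long_gaps = [g for g in gaps if g > 30]

print("Tweets per weekday:", dict(by_weekday))
print("Tweets per UTC hour:", dict(by_hour))
print("Gaps longer than 30 days:", long_gaps)
```

On a real tweet history, a skewed weekday count, a recurring overnight quiet window, and multi-month gaps would be the kinds of signals McBeth points to as evidence of inauthentic, operator-driven behavior.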
Chapter 5
McBeth dissects the tweets for evidence of disinformation and inconsistencies that undermine the account's credibility.
- The account makes implausible military claims and uses language that suggests unfamiliarity with American culture.
- Tweets contain false narratives and alarmist content meant to spread disinformation and stoke fears.
- The account's lack of credible sourcing and strange phrasing in tweets further indicate it is a bot.
Chapter 6
McBeth concludes the investigation, emphasizing the importance of vigilance against disinformation and offers resources for verification.
- McBeth concludes the account is 'very likely' inauthentic and encourages viewers to check the detailed data provided for verification.
- He emphasizes the role of knowledge in defending against manipulation and offers tools and resources to support his findings.
Chapter 7
McBeth promotes ways to support his channel and introduces branded merchandise.
- Viewers are invited to support McBeth's channel through donations on Substack or by purchasing merchandise from Bunker Branding.
- The merchandise includes t-shirts and hoodies with military themes.