Lesson Report:
**Title: Designing Covert Influence Campaigns and Source Laundering Strategies**
In this session, students continued a multi-step workshop on disinformation and “active measures,” focusing on how states can covertly introduce and launder deceptive narratives into mainstream discourse. Groups refined their earlier scenario work (country, target population, and chosen false or misleading narrative) and then built detailed multi-step dissemination plans, from initial obscure placement to eventual mainstream uptake. The instructor closed by connecting these Cold War–era techniques to contemporary tools such as bot/troll farms and AI, and previewed a shift next week toward internal information control using the Chinese model.
—
### Attendance
– Number of students explicitly mentioned as absent: **0**
– Notes:
– A couple of students (e.g., “L.A.”) briefly dropped from breakout rooms and were re-assigned, but no one was reported as absent from class.
—
### Topics Covered (Chronological, with Activity/Lecture Labels)
#### 1. Group Scenario Recap: Initial Influence Narratives (Breakout Room Reports, Part 1)
Students reported back on the *first* stage of their ongoing workshop: each group had to adopt the perspective of a state actor and define a specific deceptive or manipulative narrative they wanted to inject into public discourse, including the target audience.
**Room 2 – North Korea’s Nuclear Signaling via False Narrative**
– Clarification of scenario:
– Group initially had some confusion; instructor helped pin it down.
– Final version at this stage:
– **Actor:** North Korea.
– **Target population:** Initially phrased as South Koreans, but more broadly, global audiences.
– **Goal:** Not purely to frighten, but to *remind* the world that North Korea is an important nuclear-armed state and to reassert itself on the global stage.
– **Narrative:** North Korea is significantly building up its nuclear arsenal.
– They later clarified that it is **North Korea itself** spreading this narrative (rather than South Korea), but via indirect channels.
**Room 3 – U.S. Disinformation to Undermine Russia Among Its Allies**
– **Actor:** United States.
– **Target population:** Allies of Russia.
– **Narrative:** The Russian government covertly cooperated with the U.S. to help arrest Venezuelan President Nicolás Maduro.
– Intent:
– Damage Russia’s reputation among its allies.
– Undermine trust in Russia as an anti-U.S. partner.
– Instructor highlighted:
– The narrative’s strategic target: trust and perceived loyalty within Russia’s alliance network.
– Interest in seeing how they operationalize this in later “laundering” stages.
**Room 4 – China Blaming the U.S. for a Fictional Global Virus (“Norovirus” Scenario)**
– **Actor:** China.
– **Target population:** Global public, especially social-media-dependent news consumers.
– **Narrative:**
– A new virus “norovirus” (group’s placeholder name) actually originates in China, but the Chinese state spreads a narrative that:
– The **United States** created and spread this virus.
– Justifications invoked:
– U.S.’s advanced medical research infrastructure.
– Global network of laboratories.
– History of classified biomedical or military-biological projects.
– U.S.’s early and extensive travel connections.
– Strategy element:
– Use a **“neutral” third-country proxy** to spread the accusation, via anonymous “experts,” to make it appear independent and less politically motivated.
– Instructor’s note:
– Praised the use of a third-country proxy as a classic source-laundering move.
– Flagged interest in seeing how they further detail that third-country role.
**Room 5 – Russia and the “Return of Alaska” Narrative**
– **Actor:** Russia.
– **Target population:** Primarily U.S. domestic audiences (with a focus on Alaskans), and secondarily international observers.
– **Narrative:**
– Alaska (historically once Russian territory) supposedly wants to rejoin Russia.
– Elements of the story:
– Polls and deepfakes showing:
– Large numbers of “Russian compatriots” or recent Russian immigrants in Alaska.
– Americans with Russian ancestry calling to “sell” or “return” Alaska to Russia.
– Tie-in with:
– Trump’s real-world attempts to purchase Greenland as precedent.
– Framing the U.S. as internally divided (e.g., “wokeness” vs. MAGA) and socially exhausted, supposedly making Russia appear “more stable” and “traditional.”
– Instructor:
– Acknowledged the narrative’s plausibility and emotional hooks.
– Noted it would be interesting to see how they operationalize this in a step-by-step covert campaign.
**Clarification Question – Must Narratives Be Political? (Sen’s Question)**
– A student queried whether their scenario must be strictly *political*, or whether non-political deceptive campaigns (e.g., related to skincare products) are acceptable.
– Instructor’s response:
– The exercise is **primarily political**, but:
– A **non-political** narrative (e.g., South Korea covertly boosting its skincare reputation) can be made *defensible* if:
– It is framed as economically or strategically significant (e.g., protecting a major export sector).
– It involves state-level psychological operations and intelligence services.
– Emphasized:
– The key requirement is that the scenario plausibly motivates a state actor to use *covert* information operations rather than simple overt marketing.
—
#### 2. Lecture/Instructions: Source Laundering and the Next Stage of the Workshop
The instructor then framed the **next task** in the context of “active measures” and the well-known **Operation INFEKTION** case.
**Review of Operation INFEKTION (Contextual Reference)**
– Recalled previously discussed Soviet disinformation campaign:
– Claim: The U.S. government created and deliberately spread HIV/AIDS.
– Mechanism: Information did *not* move directly from the KGB to major Western media; instead, it was:
– First planted in obscure or ideologically friendly outlets.
– Gradually picked up by more prominent, “neutral”-looking sources.
– Eventually echoed by mainstream or international media.
**Key Concept: Stepwise Source Laundering**
– Instructor emphasized a *multistage* process:
– **1. Origin**: A government or intelligence service fabricates or distorts a claim.
– **2. Initial placement** (today’s focus):
– Choose an initial outlet that:
– Appears independent or neutral.
– Is obscure enough not to raise obvious suspicions of state orchestration.
– Provides a plausible reason *why* it would have access to or interest in this information.
– **3. Laundering/Amplification**:
– Move from obscure or partisan outlets to:
– Slightly larger or more “respectableâ€� venues.
– Influencers, NGOs, or academic-like sources.
– Eventually, mainstream or international media.
**Workshop Task (Part 2): Designing Placement and Laundering Plan**
– Students, in their groups, were instructed to:
1. **Decide the initial placement** of their narrative:
– Where will the information first appear so that it looks *organic*?
– What type of source (blog, minor regional outlet, “independent” think tank, local TV, etc.)?
– Why would that source plausibly produce or discover this story?
2. **Plan the laundering steps**:
– How will the story move from:
– Obscure or fringe venues → mid-level or regional media → major outlets or widely followed accounts?
– Which specific platforms, influencers, or organizations will be used at each stage?
– What is the *end goal* for mainstream penetration (e.g., a particular TV channel, international newspaper, widely trusted news website)?
– Time:
– Students were given ~9–10 minutes in breakout rooms to map out their “evil deeds” (instructor’s joking phrasing) in a multi-step chart.
—
#### 3. Group Reports: Placement and Laundering Strategies (Breakout Room Reports, Part 2)
Each group then presented their more detailed operational steps. The instructor consistently pushed them on two recurring analytic questions:
1. **Why covert instead of overt?**
2. **What makes the initial source and discovery story *believable* and minimally biased?**
**Room 1 – South Korean Skincare Psychological Operation (Non-Political Scenario)**
– Scenario (as clarified now):
– **Actor:** South Korea/state-linked actors.
– **Goal:** Covertly boost domestic (and possibly international) demand for South Korean skincare products, framed as an economic/strategic push.
– **Target population:** Primarily **younger**, **female**, social-media-active demographics.
– Initial placement:
– **Social media platforms**: Instagram, Facebook.
– **Product placement in K-dramas**:
– Skincare products subtly integrated into plotlines and character routines.
– Amplification & laundering:
– Use of **influencers and celebrities**:
– Initially, the group simply said “Korean influencers,” K-beauty/beauty vloggers, etc.
– Also mentioned:
– Semi-informal beauty product review blogs.
– Mid-sized “independent” cosmetics review pages with established audiences.
– Potential export of narrative beyond South Korea:
– Once established domestically, the same influencers/brands could target foreign audiences to grow global demand.
– Instructor’s critical guidance:
– **Specificity of influencer choice**:
– There are thousands of South Korean influencers; the intelligence operation should:
– Identify which influencers appear **least commercially biased** (e.g., doctors, dermatologists, lifestyle bloggers known for “honest” reviews).
– Avoid “obvious salespeople” whose recommendations are clearly sponsored.
– **Assumed domestic problem**:
– If South Korea is launching a *state-level* covert campaign, this implies:
– Domestic skincare sales or reputation are under threat.
– The narrative should address *why* consumers have become skeptical or disinterested.
– **Bias awareness**:
– Modern social media users already recognize influencers as paid shills.
– Operation must find messengers whose reputations soften that skepticism (e.g., non-beauty influencers, medical professionals, or “reluctant” reviewers).
**Room 2 – North Korea’s Nuclear Narrative via Indonesia**
– Scenario update:
– Earlier confusion was corrected:
– **Actor:** North Korea.
– **Goal:** Remind the world it remains a significant nuclear power, not merely to induce fear but to reaffirm its status.
– **Primary channel:** Covert rather than overt messaging.
– Initial placement:
– **Country chosen:** Indonesia.
– Rationale:
– Large population (~280 million).
– High social media penetration.
– Perceived as relatively neutral in global power politics.
– Plan:
– Use Indonesian media outlets and social media as the first venue for circulating:
– Highly realistic, AI-generated video showing Kim Jong-un at missile/arms factories, inspecting weapons, “demonstrating” production growth.
– Amplification:
– Viral **social media circulation** within Indonesia:
– Shared by ordinary users and influencers.
– Expected secondary uptake:
– Once viral in Indonesia/regionally, **international news agencies** may pick it up as “unverified but widely circulated footage” of North Korean nuclear expansion.
– Instructor’s critical questions:
– **Why covert?**
– Historically, North Korea has had no problem making *overt* nuclear threats and staging highly public missile tests.
– Why would it now *prefer* a covert psychological operation rather than:
– A public missile test.
– Direct official announcements?
– Group suggested:
– Limited information flow from North Korea gives any leak extra plausibility.
– Covert operation allows them to stage-manage imagery without taking overt responsibility.
– Instructor urged deeper development of this justification.
– **Choice of Indonesian media**:
– Needs **specific outlets**:
– Which Indonesian outlets or online communities have enough credibility and reach but are still “grey zone”?
– How does the **initial Indonesian source** plausibly obtain the video?
– Leaked by North Korean insider?
– Hacked from internal server?
– Provided by a shadowy “regional defense analyst”?
**Room 3 – U.S. Operation via Algerian Newspaper and Chinese Influencers**
– Scenario:
– **Actor:** United States.
– **Narrative:** Russia covertly aided the U.S. in arresting Maduro.
– **Target audience:** Russia’s allies and sympathetic publics.
– Initial placement:
– **Outlet chosen:** A fictional Algerian English-language news outlet called **“Desired Daily.”**
– Rationale:
– Algeria as a relatively independent, non-aligned country.
– English-language edition increases global readability.
– Private ownership implies “independent journalism.”
– Amplification:
– **Chinese influencers**:
– Selected because:
– China has both ties to Russia and a large online audience.
– Chinese social media personalities can present the story as “international news” rather than Western propaganda.
– Strategy:
– Chinese influencers echo or refer to the Desired Daily story, thereby:
– Making it look like a confirmed international report.
– Increasing its salience among Russia’s partners in Asia and the Global South.
– Instructor’s critical guidance:
– **Discovery problem**:
– Why would *this* Algerian paper be the *first* to break such a major global story?
– Students need to explain:
– What leak, whistleblower, FOIA-style document, or hacked communication gave Desired Daily access to the story?
– Why didn’t any Russian, Venezuelan, American, or larger international outlet get it first?
– Emphasized:
– The **plausibility of the origin story** is central to making the campaign work.
– Without a convincing “how we found this out” narrative, the story risks being dismissed as fringe conspiracy.
**Room 4 – “Norovirus” Blame Campaign via Health Blogs and Social Media**
– Scenario recap:
– **Actor:** China.
– **Narrative:** The U.S. created a new virus (“norovirus”) but is blaming China, echoing patterns seen in COVID-related conspiracies.
– **Target audience:** Global internet users, no specific linguistic limitation (focus on English but using automatic translation).
– Initial placement:
– **Neutral-looking health blogs and research websites**:
– Medical/health information sites that:
– Present themselves as scientific rather than political.
– Are not obviously tied to any government.
– **International news pages** with a science/health focus that appear independent.
– Amplification:
– Creation of **media outlets/accounts on Instagram and TikTok** with large follower counts.
– Repetition strategy:
– Reiterate the “U.S. created norovirus” claim across multiple platforms and formats.
– Rely on the psychological effect that repeated exposure makes information feel more like “common knowledge.”
– Use of **automatic translation tools**:
– Make content accessible in local languages while originating in English.
– Secondary laundering:
– Later articles and posts could phrase it as:
– “According to earlier reports, some experts have suggested that…”
– Without naming or linking to the original fringe sources.
– Instructor’s critical push:
– **Conspiracy-theory fatigue and credibility**:
– This is at least the *third* major disease-origin narrative blaming the U.S. in recent decades (HIV, COVID, etc.).
– Many audiences now have strong cognitive filters for “U.S.-created virus” narratives and quickly categorize them as conspiracy theories.
– Questions students must address:
– How do these health blogs claim to *know* the U.S. is responsible?
– Leaked lab documents?
– Insider testimony?
– Suspicious funding patterns for labs?
– How do they differentiate their story from existing, already-discredited conspiracy tropes?
– Emphasis:
– Repetition can normalize, but only if the **first wave** of sources avoids immediate dismissal as crank or fringe.
**Room 5 – Russia and the “Alaska Wants to Return” Narrative**
– Scenario recap:
– **Actor:** Russia.
– **Narrative:** Significant segments of Alaska’s population want to rejoin Russia.
– Strategy 1 – Diaspora Journalist Narrative:
– **Key protagonist:** A journalist or opinion writer:
– Background:
– Born in a Russian family.
– Possibly emigrated from Russia; now lives in or reports on Alaska.
– Grew up hearing from family that “Russia is great now; economic and political situation is normal.”
– Writes a personal article or blog post describing:
– Economic crisis and political polarization in Alaska/the U.S.
– Nostalgia for the time when Alaska was Russian.
– Claims that many locals share this sentiment.
– **Laundering path:**
1. Neutral or mid-sized media outlets:
– Pick up *selective* quotes and photos (possibly out of context).
– Frame them as indicative of a broader undercurrent: “Some Alaskans are longing for Russia.”
2. Social media:
– Discussions and debates amplify the impression that this is a real trend.
3. Major domestic and international outlets:
– Eventually large U.S. and Russian outlets “cover the coverage,” framing it as:
– “Debate over Alaska’s future status resurfaces online.”
– Even if it originates from a single journalist, the echo makes it look like a larger movement.
– Strategy 2 – European Media Speculation Linked to Greenland:
– European outlet scenario:
– A European (e.g., Danish) outlet runs an analysis piece:
– “If the U.S. can consider buying Greenland, could Russia claim Alaska back?”
– Russia then:
– Exploits this as a talking point:
– Repeatedly citing the article.
– Framing it as serious international discussion of Alaska’s possible return.
– Emphasis on **quantity of narratives**:
– “Quantity over quality”:
– Constant speculation about Alaska’s status.
– Varying storylines: purchase, referendum, cultural ties, etc.
– Repetition across platforms to create an illusion of inevitability or at least serious debate.
– Instructor’s extension: **“Firehose of Falsehood” Concept**
– Recommended term: *firehose of falsehood*.
– A disinformation strategy where the aim is not to sustain a single coherent lie, but to:
– **Flood the information space** with many conflicting or overlapping falsehoods.
– Overwhelm the public’s ability to distinguish truth from fiction.
– Effects:
– Erodes trust in *all* information sources.
– Makes audiences feel that it is impossible to know what is true.
– Suggested further reading:
– Students were encouraged to Google “firehose of falsehood” and related analyses, as it ties directly into the Alaska case and modern information operations.
—
#### 4. Closing Mini-Lecture: Key Analytical Issues and Modern Tools
**Two Recurrent Design Problems in All Groups’ Plans**
The instructor synthesized cross-group issues:
1. **Why use covert active measures rather than overt actions?**
– Each scenario should confront:
– If a state *could* achieve its goal by overt means (e.g., missile tests, official speeches, open marketing), why would it opt for:
– Plausible deniability?
– Covert psychological operations?
– Longer, more fragile laundering chains?
– Groups were urged to articulate:
– The *added strategic value* of covert operations in their specific cases (credibility, deniability, ability to reach otherwise skeptical audiences).
2. **Plausibility of initial source and “discovery story”**
– Every first-placement outlet must answer:
– How did this small blog, neutral newspaper, or influencer get such explosive information?
– Leak? Whistleblower? Hack? Accidental discovery?
– Without a convincing origin story:
– Audiences default to skepticism (“This is just propaganda/conspiracy”).
– Mainstream media are less likely to pick up the story as newsworthy.
**Transition to Contemporary Context: Bot Farms, Troll Farms, and AI**
– Instructor noted that the classic active-measures model was created in a **pre-internet era**:
– Physical pamphlets, marginal newspapers, and slow information flows.
– Major changes in the last decade, and especially the last 3–4 years:
1. **Bot farms (botnets) and troll farms**:
– Organized networks of automated or semi-automated accounts.
– Can:
– Mass-like, share, comment, and retweet content.
– Artificially inflate apparent popularity and consensus.
2. **AI-generated content**:
– Deepfakes (video and audio).
– Synthetic text, images, and “expert” personas.
– Vastly lower the cost of producing convincing-looking propaganda at scale.
– These tools:
– Compress the *time* and *cost* involved in source laundering.
– Enable the **firehose** strategy much more effectively:
– Massive output at low cost.
– Multiple conflicting narratives launched simultaneously.
**Preview of Next Week & Assigned Reading**
– Next week:
– Continue discussion of external information operations.
– Transition toward **internal information control mechanisms**, with focus on:
– The **Chinese model** of information regulation and censorship.
– Reading:
– An article by **Bauer** (approx. 15–20 pages) on China’s internal information control:
– To be uploaded to **eCourse** the same evening.
– Students were asked to **read it before Monday’s class**.
– Final Q&A:
– Student asked for clarification on the term the instructor used alongside AI:
– Answer: **bot farms** (also “troll farms”).
– Clarified spelling and concept in the chat.
—
### Actionable Items
#### High Priority (Before Next Class)
– **Read Bauer article on Chinese information control**
– Length: ~15–20 pages.
– Platform: eCourse (to be uploaded by instructor).
– Purpose: Prepare for Monday’s discussion on internal information control and the Chinese model.
– **Groups refine their disinformation campaign plans**
– Explicitly address:
– Why the state actor in your scenario opts for *covert* ops over more obvious, overt mechanisms.
– The *plausible origin story* for your initial outlet:
– What is the supposed leak/hack/whistleblower event?
– Why that specific outlet or influencer is first.
– Improve specificity:
– Name types (or examples) of outlets, influencers, or social media communities.
– Clarify audience segmentation (which demographics each step targets).
#### Medium Priority (For Ongoing Course Work)
– **Optional individual research on “firehose of falsehood”**
– Look up analyses of:
– Firehose of falsehood disinformation model.
– Its application in Russian and other contemporary information operations.
– This will deepen understanding of:
– The Alaska scenario.
– Broader strategic logic behind flooding the information space.
– **Consider integration of modern tools into group scenarios**
– Think through:
– Where bot/troll farms and AI-generated content could plug into your existing dissemination chain.
– How these tools might alter:
– Speed of spread.
– Perceived authenticity.
– Resistance to debunking.
#### Instructor-Facing Follow-Ups
– **Upload Bauer reading to eCourse**
– Ensure students can access the PDF/article well before Monday.
– **Next session planning**
– Prepare to:
– Revisit group scenarios briefly to see how students addressed:
– Covert-vs-overt rationale.
– Plausible origin stories.
– Transition into lecture/discussion on:
– Chinese internal information control.
– Integration of bot/troll farms and AI into both external and internal propaganda strategies.
Homework Instructions:
ASSIGNMENT #1: Reading on the Chinese Model of Information Control (Bauer Article)
You will read the Bauer article on the Chinese model of internal information control to prepare for next week’s class. We will continue our discussion of propaganda, psychological operations, and how states shape information ecosystems, moving from external “active measures” (like Operation INFEKTION) to internal control (China’s model).
Instructions:
1. **Locate the assigned article**
1. Log in to the course eCourse page.
2. Navigate to the section for this week/next week’s class.
3. Find the PDF or link labeled with Bauer’s article (the professor said: “There’s an article by Bauer that I’m going to be posting on eCourse tonight.”).
4. Download or open the article so you can annotate or take notes.
2. **Do an initial skim (5–10 minutes)**
1. Read the title, abstract/introduction, section headings, and conclusion.
2. As you skim, ask yourself:
– How is China’s approach to controlling information different from the Cold War “active measures” and source-laundering strategies we discussed (e.g., Operation INFEKTION, the workshop scenarios about North Korea, Russia, China, etc.)?
– Is the focus on **internal** audiences, **external** audiences, or both?
3. Identify key sections that look especially relevant to:
– State control of media and platforms.
– How information is made to look “organic” or credible.
– Any mention of digital tools, platforms, or surveillance mechanisms.
3. **Read the article closely (15–30 minutes)**
1. Read the full article carefully; it’s “relatively short” (about 15–20 pages).
2. As you read, annotate or take notes on:
– **Core mechanisms of control**: What tools or institutions does the Chinese state use to manage information flows (e.g., censorship systems, legal controls, platform regulation, content guidelines, etc.)?
– **Narrative control**: How does the state try to shape what stories people see and believe, similar to how in class we discussed governments trying to get particular narratives into the “mainstream” conversation?
– **Perceived legitimacy**: How are these controls framed to the domestic population (e.g., stability, security, anti–fake news, moral order)?
– **Comparisons**: Any explicit or implicit comparisons to other countries or to Cold War–era techniques.
3. Mark any passages that you find confusing or that you think are particularly important; these are good points to raise in discussion.
4. **Connect the reading to our recent workshop**
1. Revisit in your mind the in-class group scenarios described in the transcript, where:
– States chose a **piece of information** that was “not totally factually honest” but that they wanted to become part of the **mainstream narrative**.
– Groups had to decide on **placement** (obscure initial outlet, influencers, neutral third countries) and **laundering** (how to move from fringe to mainstream).
2. Compare that model of **external influence campaigns** to what Bauer describes about China’s **internal information architecture**:
– Who are the main **targets** in Bauer’s account? Domestic citizens? Foreign audiences? Both?
– Are there analogues to “source laundering” domestically—for example, using seemingly independent outlets, influencers, or “experts” to convey party-approved lines?
3. Note at least one way in which China’s model **goes beyond** or **differs from** the tactics you used in the class exercise (e.g., scale, centralization, integration with law and technology, surveillance component, etc.).
5. **Prepare 2–3 points to bring to class**
1. Write down **at least two** of the following (bullet points are fine):
– One key **mechanism** of information control Bauer describes, and why you think it is effective (or not).
– One **similarity** between China’s methods and the state-run influence campaigns we designed in groups.
– One **difference** between long-term internal control (China) and short-term external disinformation operations (like Operation INFEKTION or the Alaska/Russia, North Korea nuclear, norovirus scenarios).
2. Be ready to briefly explain these points in discussion on Monday; this will help link the reading to our ongoing work on propaganda, psychological operations, and concepts like bot farms, troll farms, and AI-supported influence.
6. **Optional but recommended: Link to bot farms and AI**
1. As you read, keep in mind the end-of-class note that “two major things” have changed in the last decade, especially the last few years: **bot farms / troll farms** and **AI**.
2. Ask yourself:
– Could the Chinese model Bauer describes be **augmented** by bot farms or AI-generated content?
– How might these tools change the **scale**, **speed**, or **subtlety** of internal information control?
3. Jot down any speculative thoughts; even if Bauer does not discuss these directly, this will help you engage with next week’s material.
By completing this reading thoughtfully, you will be better prepared to see how the techniques we practiced in class (source placement, laundering, narrative construction) operate not only abroad but also within domestic information systems, with China as a central case.