# Lesson Report
## Title: Social Media as a Propaganda Ecosystem — Gatekeepers, Algorithms, and the Attention Economy
**Synopsis (2–3 sentences):**
This class transitioned from *how states use propaganda* to *why propaganda spreads so effectively today*, focusing on the internet—especially social media—as the major structural shift. Students used the Arab Spring as an entry case, then analyzed “gatekeeping” and contrasted traditional editorial decision-making with algorithmic amplification, emphasizing how profit-driven engagement systems shape what information becomes visible and persuasive. The session ended by launching a mini “lab” activity to identify what kinds of content social platforms reward, setting up a follow-up discussion on democratic consequences.
---
## Attendance
– **Absent students mentioned:** 0
---
## Topics Covered (Chronological, with detailed activities and examples)
### 1) Course continuity + framing question: Why is propaganda more effective today?
– Instructor opened by explicitly connecting to last week’s theme: shifting from *propaganda used by states* to *propaganda’s popularity/effectiveness in the modern era*.
– Class revisited the “major global shift” from the last session: **the internet**, and more specifically **social media** as the key mechanism that expands reach and influence.
---
### 2) Real-world anchor case: The Arab Spring and social media as an organizing/information channel
**Prompting + student knowledge-building**
– Instructor asked whether students had heard of the **Arab Spring** (early 2010s), requesting short summaries and causes.
– Student contributions (consolidated):
– A regional wave of **protests** (2010–2011) across the Middle East and North Africa.
– In some cases included **coups/overthrows** of governments (protests succeeded in removing leaders).
– Drivers included **unemployment**, broader **economic grievances**, **anti-corruption** sentiment, **anti-dictatorship/anti-authoritarianism** demands.
– Origin point highlighted: **Tunisia**, including the triggering event of **self-immolation**.
**Key analytical pivot**
– Instructor linked the case to the day’s focus: *How did information about events (e.g., Tunisia’s triggering incident) spread across borders quickly, enabling coordination?*
– Students identified platforms like **Twitter and Facebook** as major vectors for dissemination and mobilization.
**Changing narrative over time**
– Instructor noted early-2010s “euphoria” about social media as a **“beacon of democracy”** (open discussion, political organizing, anti-oppression mobilization).
– Instructor then contrasted this with contemporary perceptions (10–15 years later), where many people view social media as **net negative** due to:
– relentless negative news exposure,
– manipulative/attention-seeking content,
– pervasive advertising.
– Student example: one student reported deleting social media because of the emotional/mental burden of constant negative news (including geopolitically and personally relevant content).
**Objective established**
– The central investigation for the lesson: *Why did platforms once framed as democratizing tools evolve into environments where propaganda thrives?*
---
### 3) Concept introduction: “Gatekeeping” and who controls information flows
**Term-building discussion**
– Instructor introduced **gatekeeper/gatekeeping** and elicited definitions.
– Student definition (accepted): a gatekeeper is the entity that **controls what information passes through** and what is **kept out**.
**Historical comparison activity (guided)**
– Instructor used a “time travel” method to clarify gatekeeping structures:
**1980 information distribution (pre-internet) — where to publish to reach mass audiences**
– Students suggested:
– **Newspapers** (e.g., *New York Times*)
– **TV and radio broadcasters**
– **Posters** (noted as limited reach)
**Today’s distribution (social media era) — mass reach with low barriers**
– Students suggested:
– **Instagram**
– **TikTok**
– **Facebook/X/Threads** (mentioned)
– Instructor emphasized the “rolling the dice” dynamic: any post could potentially be shown to **hundreds of thousands or millions** of users, depending on how the platform’s algorithm responds.
**Gatekeeper identity shift**
– Pre-internet: gatekeeping mostly performed by **humans** (editors/boards/producers).
– Social media: amplification decisions largely made by **algorithms** (with limited human review/censorship systems in some cases).
---
### 4) Key term: Algorithm (definition and role as gatekeeper)
– Instructor prompted a general definition.
– Student definition (accepted): an algorithm is a **step-by-step set of instructions/rules** to solve a problem (often implemented as code / a “big math problem”).
– Instructor clarified algorithmic gatekeeping:
– Not necessarily deciding whether content can be posted, but deciding whether content gets **amplified** widely or shown only to a small circle (friends/local users); a toy sketch of this decision appears below.
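To make the gatekeeping role concrete, here is a minimal illustrative sketch in Python. Everything in it (the `Post` fields, the hypothetical `distribution_size` rule, the threshold, and the reach numbers) is invented for teaching purposes and does not describe any real platform's system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_followers: int        # size of the author's existing circle
    predicted_engagement: float  # hypothetical model score in [0, 1]

def distribution_size(post: Post, amplify_threshold: float = 0.7) -> int:
    """Toy gatekeeping rule: the post always exists; the algorithm only
    decides how many feeds it is pushed into."""
    if post.predicted_engagement >= amplify_threshold:
        # High predicted engagement: amplify far beyond the author's circle.
        return max(100_000, post.author_followers * 50)
    # Otherwise: show it mostly to friends/local users.
    return post.author_followers

print(distribution_size(Post(author_followers=200, predicted_engagement=0.9)))  # 100000 (amplified)
print(distribution_size(Post(author_followers=200, predicted_engagement=0.2)))  # 200 (small circle)
```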
---
### 5) Structured comparison activity: Editors/Publishers vs Algorithms (chart task)
**Instructions (2–3 minutes + extension)**
Students were asked to create a chart comparing two gatekeepers:
1) **Editors/publishers** (newspapers/TV)
2) **Algorithms** (social media feed systems)
For each, students brainstormed:
– **Primary goals**
– **Decision-making** (who decides)
– **Key “currency” / source of value** (what is valued most)
**Class debrief highlights**
– Students proposed:
– Editors/publishers: **credibility**, **public trust**, **public interest**, **responsibility**, **newsworthiness**
– Algorithms: **attention**, **engagement**, **data**, **interaction metrics**
– Instructor “trick question” reveal:
– At the highest level, both systems are businesses driven by **money/profit**.
– But they pursue profit differently, which changes the content ecosystem.
---
### 6) How “free” platforms make money: advertising + data profiles (Google model origin story)
**Problem posed**
– If users aren’t paying, how do platforms like TikTok/Instagram make money?
– Students answered: **ads**, views/likes/comments.
– Instructor pushed for mechanism clarity: likes do not directly transfer money; revenue is mediated through advertising systems.
**Core principle introduced**
– “If you’re not paying for the product, **you are the product**.”
– The “product” is user attention plus **behavioral data** that builds profiles valuable to advertisers.
**Instructor explanation: targeted advertising logic**
– Instructor offered a simplified history of **Google’s advertising breakthrough**:
– Early banner-style advertising was inefficient.
– Google realized it could increase ad value by targeting ads to people **most likely to buy**.
– Data signals used: what you click, what you don’t click, time spent, browsing patterns, etc.
– Algorithms convert these signals into predictions (e.g., who is likely to want sneakers); a toy scoring sketch follows below.
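As a classroom-style illustration of that conversion, the sketch below uses a logistic score with entirely invented signals and weights (stand-ins for what a trained model would learn) to turn click and watch-time behavior into a hypothetical “likely sneaker buyer” prediction.

```python
import math

# Invented behavioral signals for one user (illustration only).
signals = {
    "clicked_sneaker_ads": 3,      # past clicks on sneaker-related ads
    "sneaker_video_seconds": 140,  # time spent watching sneaker content
    "skipped_sneaker_posts": 1,    # negative signal: scrolled past quickly
}

# Invented weights standing in for what a trained model would learn.
weights = {
    "clicked_sneaker_ads": 0.8,
    "sneaker_video_seconds": 0.01,
    "skipped_sneaker_posts": -0.5,
}
BIAS = -2.0

def purchase_propensity(signals: dict, weights: dict, bias: float) -> float:
    """Logistic score: map raw behavior to a 0-1 'likely buyer' prediction."""
    z = bias + sum(weights[k] * v for k, v in signals.items())
    return 1 / (1 + math.exp(-z))

score = purchase_propensity(signals, weights, BIAS)
print(f"Predicted sneaker-purchase propensity: {score:.2f}")  # higher score = pricier ad slot
```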
**Expansion on short-form platforms**
– TikTok/Reels/Shorts accelerate profiling because:
– they can rapidly test preferences (continuous scroll of videos),
– quickly learn what content keeps the user engaged,
– and translate that into consumer propensity predictions (the preference-testing sketch below illustrates the logic).
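One way to see why continuous scrolling learns preferences so quickly is a bandit-style sketch: each clip shown is a cheap experiment. The categories, hidden watch rates, and epsilon-greedy rule below are hypothetical teaching props, not a description of any real recommender.

```python
import random

random.seed(0)

# Hidden 'true' chance this user watches each kind of clip to the end
# (invented numbers; a real system never sees these directly).
true_watch_rate = {"comedy": 0.6, "news": 0.2, "sports": 0.4}

shown = {c: 0 for c in true_watch_rate}
watched = {c: 0 for c in true_watch_rate}

def estimated_rate(cat: str) -> float:
    return watched[cat] / shown[cat] if shown[cat] else 0.0

def pick_category(epsilon: float = 0.1) -> str:
    """Epsilon-greedy: mostly exploit the best-known category, sometimes explore."""
    if random.random() < epsilon:
        return random.choice(list(true_watch_rate))
    return max(true_watch_rate, key=estimated_rate)

# Every scroll is one experiment: show a clip, record whether it held attention.
for _ in range(500):
    cat = pick_category()
    shown[cat] += 1
    if random.random() < true_watch_rate[cat]:
        watched[cat] += 1

for cat in true_watch_rate:
    print(f"{cat}: shown {shown[cat]}x, estimated watch rate {estimated_rate(cat):.2f}")
```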
**Resulting ecosystem**
– Platforms optimize for **engagement** because engagement keeps users on-platform, generating more ad impressions and improving ad targeting effectiveness.
---
### 7) Connection to democratic theory: why this business model creates political risk
**Democracy values (elicited)**
– Students named key democratic requirements:
– **Freedom of speech/expression**
– **Equality/participation**
– **Transparency**
– **Fair elections**
– Instructor summarized functional needs: people must be able to speak and also **access what others are saying** (shared information environment).
**Key currency contrast (explicit)**
– Traditional journalism gatekeeping:
– “Currency” includes **newsworthiness**, relevance, social value, and trust/brand preservation.
– Algorithmic gatekeeping:
– “Currency” is **engagement** (likes, comments, shares, follows; emotional reaction).
– Truth is “nice” but **not required** for amplification.
– Instructor emphasized platform accountability rhetoric:
– Executives often claim “we’re not editors; we’re a platform,” distancing themselves from content responsibility.
– This structure incentivizes emotionally compelling content—an environment where propaganda can thrive.
---
### 8) Mini “lab” activity launch: What content gets rewarded vs penalized on short-form platforms?
**Goal**
– Prepare students to empirically observe platform incentives by comparing viral emotional content vs viral civic/informational content.
**Individual instructions (3 minutes)**
Students were asked to open TikTok/Reels/YouTube Shorts and find/save:
1) **Emotional content** (feel-good, funny, anger-inducing, celebrity clip, life hack, etc.)
– Must have **≥ 100,000 likes**
2) **Civic/informational content** (news, political/social issue, serious report)
– Must have **≥ 100,000 likes**
**Breakout room instructions (groups of ~3, ~5–6 minutes)**
Students were to share their two examples and analyze:
– Differences in **like counts vs comment counts** across emotional vs civic videos (one way to tabulate this comparison is sketched after this list).
– Infer platform logic:
– What content gets **rewarded** (amplified/viral)?
– What content gets **penalized** (hidden/low distribution/shadow-banned)?
– Clarification given: “penalized” = not necessarily removed, but **not shown broadly** (possible “shadow ban” dynamics).
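For groups that want to formalize the like/comment comparison, a few lines like the following could tabulate comments per 1,000 likes by content type; the counts are invented placeholders for whatever students actually collect.

```python
# Invented counts; substitute the numbers your group actually collected.
clips = [
    {"type": "emotional", "likes": 850_000,   "comments": 4_200},
    {"type": "emotional", "likes": 1_200_000, "comments": 9_800},
    {"type": "civic",     "likes": 150_000,   "comments": 21_000},
    {"type": "civic",     "likes": 110_000,   "comments": 17_500},
]

for kind in ("emotional", "civic"):
    subset = [c for c in clips if c["type"] == kind]
    likes = sum(c["likes"] for c in subset)
    comments = sum(c["comments"] for c in subset)
    # Comments per 1,000 likes: rough proxy for discussion vs. passive approval.
    print(f"{kind}: {1000 * comments / likes:.1f} comments per 1,000 likes")
```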
**Outcome**
– Breakouts ran, but the class **ran out of time** before group findings could be reported to the whole class.
– Instructor stated the next session (Wednesday) would begin with group findings and then connect algorithm incentives to democratic consequences and potential fixes.
---
### 9) Wrap-up: Readings + next steps preview
– Instructor previewed Wednesday’s direction:
– connect what algorithms want (engagement/data) to effects on political beliefs, democratic processes, and potential remedies.
– Assigned readings on eCourse:
– One by **Defection**
– One by **Bale**
– Guidance: read both if possible; if time-limited, **Bale is required/priority for Wednesday**.
---
## Actionable Items (Short bullets, organized by urgency)
### High Urgency (Before next class: Wednesday)
– **Students:** Complete readings on eCourse; **prioritize Bale** if unable to do both (Defection + Bale).
– **Instructor:** Start Wednesday’s class by debriefing **breakout room findings** from the short-form content “lab” (rewarded vs penalized content; likes/comments comparisons).
### Medium Urgency (Course coordination / student support)
– **Supervision meeting:** Schedule confirmed with Chinara for **Wednesday at 2:00 p.m.**; instructor to add it to the calendar.
### Ongoing / Reminders
– **Journal reflection #1 clarification provided:** Students may reflect on the whole course plan/conference theme or focus on **one or two concepts/readings**, emphasizing **personal connection**, prior experience, and whether opinions changed.
– Consider reiterating submission details (length, deadline) in writing on the course site if not already posted (not stated in transcript, but may reduce confusion).
## Homework Instructions
### ASSIGNMENT #1: Reading for Wednesday — Defection & Bale
You will prepare for Wednesday’s discussion on how social media algorithms shape what information spreads (and what gets hidden), and how that connects to propaganda and democracy.
Instructions:
1. Open the two readings that are posted on eCourse:
1) the reading by **Defection**
2) the reading by **Bale**
2. Read **both** readings if you have time.
3. If you are short on time, prioritize reading **Bale** in full before Wednesday’s class (this was identified in class as essential for Wednesday’s work).
4. As you read, keep the lesson’s main thread in mind:
– We discussed a shift from human “gatekeepers” (editors/publishers) to algorithmic gatekeepers.
– We distinguished “newsworthiness/trust” vs. “engagement” as what gets rewarded.
– We began investigating how these incentives affect democracy and make propaganda easier to spread.
5. Come to Wednesday’s class ready to use ideas from Bale (and Defection if you read it) to help answer the upcoming focus question mentioned in class: if algorithms are optimized for engagement and advertising value, what consequences does that have for democratic processes and political beliefs—and what might be done to address those consequences?
### ASSIGNMENT #2: Journal Reflection #1 — Personal connection to course concepts
You will write a short reflection that connects course ideas (concepts and/or readings) to your own experiences, prior knowledge, or changing opinions, so you can clarify what you think about the material and how it relates to real life.
Instructions:
1. Choose your focus for the reflection:
– You may reflect on the **whole course plan/conference theme** if you want, but you are **not required** to cover everything.
– If it’s easier, focus on **one or two specific ideas, concepts, or readings** that you found particularly interesting.
2. Make your reflection personal and analytical (as described in class). Address questions like:
1) What is the idea/concept/reading you’re responding to (briefly)?
2) What is your **personal connection** to it?
– Have you experienced something similar in your own life (online or offline)?
– Had you heard of this idea before this class? If so, how did you understand it then?
3) Did your **opinion change** in any way because of the class material? If yes, how and why?
3. Keep your writing centered on what you think about the idea(s) “in general,” using your own experiences and perspective as the main evidence.
4. Before submitting, quickly check that your reflection clearly includes:
– at least one course-related idea (a concept and/or a reading)
– your own reaction/connection (experience, prior familiarity, or changed view)
5. Submit the reflection according to the Journal Reflection #1 posting details provided for the assignment.