# Lesson Report
## Title: Short-Form Algorithms, Emotional Engagement, and the Polarization–Propaganda Link
**Synopsis (2–3 sentences):**
This session connected students' observations of short-form social media videos (TikTok/Instagram) to broader course themes about propaganda and democratic polarization. Students identified what platform algorithms tend to reward (hooks, simplicity, emotional engagement, controversy) and what they tend to penalize (complexity, low visual/attention capture), then used these insights to explain why social media environments are conducive to propaganda. The class also discussed polarization and echo chambers, and analyzed why Bail's "exposure to opposing views" intervention failed, using contact theory and identity-based explanations (Van Bavel).
---
## Attendance
**Students explicitly mentioned absent:** 2
- Benyak: "not here"
- Natalia: initially noted "not here" (later appeared and was added to a breakout room)
---
## Topics Covered (Chronological, Detailed)
### 1) Breakout Room Extension & Task Clarification: "What the Algorithm Rewards vs. Penalizes"
- Instructor checked whether groups completed the breakout task; most groups needed **2 more minutes**.
- **Task goal (reiterated):**
  - Build a **list of features** of short-form videos that **get amplified** (promoted/recommended) versus features that lead to **rejection/subversion/suppression** by the platform algorithm.
  - Students were expected to use videos found previously (end of last class) and synthesize patterns **across multiple group members' videos**, not just select one or two.
- **Clarification question (Ainula):** whether groups should choose only two videos (one emotional, one civic).
- Instructor clarified that it is a **synthetic activity**: extract common features **across all shared videos** to infer what drives amplification vs. underperformance/suppression.
### 2) Breakout Room Logistics (Zoom Reassignment)
- Instructor manually rebuilt breakout rooms because Zoom did not remember the previous assignments.
- Multiple students were missing from prior room lists or appeared late; instructor redistributed:
  - Added Dina, Helen, Niloufar, and Subhan to rooms.
  - Ruslan had connection/room-placement issues and was re-added to Room 2.
  - A student (Ainula) temporarily disconnected and was put back into Room 3.
- This segment was mostly administrative but ensured groups could complete the comparison task.
### 3) Whole-Class Share-Out: What Gets Amplified? (Emotional vs. Civic Content)
**Prompt:** "Bottom line up front: what features get a video amplified by the algorithm?"
- **Chinara's group findings (emotional vs. civic):**
  - "Civic" content resembled **news/political/serious events**; "emotional" content was **funny/lighthearted/memes**.
  - **Funny/lighthearted content spreads faster** because it produces quick, positive emotional reactions (joy/humor), prompting:
    - Rapid likes/comments/shares
    - Quick "send to others" behavior
  - **Civic/serious content spreads less** because:
    - People react more slowly and less often
    - It can be "sensitive/controversial," potentially reducing reach or being "hidden"
  - Practical observation: finding a civic post with **100,000+ likes** was **harder** than finding an emotional/funny one (which appeared immediately in the feed).
- **Imad's step-by-step "algorithm testing" model (platform mechanics and engagement signals):**
  - When a short/reel is posted, it is shown first to a **small batch** of viewers (e.g., ~100 to a few thousand).
  - The platform then measures engagement signals, such as:
    - Do viewers **stop scrolling**?
    - **Watch time** / completion
    - Quick **swipe-aways**
    - **Re-watches**
  - Effective "hook" elements noted:
    - Strong title/opening seconds
    - Immediate emotional framing (a "true story" feel)
  - Identified the value of **universal emotions** (cross-cultural relatability): heartbreak, loss, missed opportunity, etc.
- Instructor reinforced additional engagement indicators:
  - Pausing/holding on frames
  - Re-watching
  - Likes/comments/shares
- **Instructor-led generalization: formatting similarities across content types**
  - Even civic content often adopts the **same short-form format** as entertainment:
    - A strong **hook** in the first 1–2 seconds
    - Attention-grabbing/shocking openings
  - Instructor example: sensational "dashcam near-crash" openings are used even in advertisements to capture attention before pivoting to unrelated messaging (illustrating hook mechanics).
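For instructors who want a concrete artifact for the next session, the batch-testing model Imad described can be sketched as a toy simulation. All function names, weights, and thresholds below are invented for illustration; real ranking systems are proprietary and far more complex than this.

```python
# Toy sketch of the engagement-gated distribution model discussed in class:
# a new video is shown to a small batch, engagement is measured, and the
# batch is widened only while the engagement score clears a threshold.
# Weights and thresholds are arbitrary illustrative choices.

def engagement_score(views, completions, rewatches, likes, comments, shares):
    """Combine the engagement signals students identified into one score."""
    if views == 0:
        return 0.0
    # Re-watches, comments, and shares are weighted more heavily, echoing
    # the class point that "stopping the scroll" matters most.
    return (completions + 2 * rewatches + likes + 3 * comments + 2 * shares) / views

def distribute(video_signals, batch_size=1000, threshold=0.15, max_rounds=5):
    """Show the video to successively larger batches while engagement holds."""
    audience = batch_size
    for _ in range(max_rounds):
        completions, rewatches, likes, comments, shares = video_signals(audience)
        score = engagement_score(audience, completions, rewatches,
                                 likes, comments, shares)
        if score < threshold:
            return audience, "suppressed"   # weak hook: stop amplifying
        audience *= 10                      # strong hook: widen the batch
    return audience, "amplified"

# Quick-hook "funny" video: high completion/re-watch/share rates every round.
funny = lambda n: (int(0.4 * n), int(0.1 * n), int(0.2 * n), int(0.05 * n), int(0.1 * n))
# Slower "civic" video: most viewers scroll past; few comments or shares.
civic = lambda n: (int(0.05 * n), 0, int(0.02 * n), int(0.01 * n), 0)

print(distribute(funny))  # passes every round and keeps widening
print(distribute(civic))  # fails the first small-batch test
```

The sketch also illustrates the later class point that the score is sentiment-blind: a comment counts the same whether it expresses outrage or agreement.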
### 4) Discussion: Civic Content, Sensationalism, and Comment-Driven Engagement
- **Helen's observation (TikTok examples):**
  - Civic/controversial content can generate **high comment volume** (debate), sometimes exceeding likes in perceived prominence.
  - Civic content is more "debatable," prompting strong opinions, so people **comment rather than share**.
  - Sensational topics and celebrity framing increase visibility:
    - Bold, attention-forward titling
    - Celebrity names displayed prominently
- Instructor synthesis:
  - Civically oriented content that "wins" often does so by being **sensational** and **provoking conflict/debate** in the comments.
### 5) What Gets Penalized or Suppressed? (Failure Conditions & Platform Moderation)
**Prompt:** "What kind of content might be punished/suppressed by the algorithm?"
- **Sen's and instructor's synthesis on "failure features":**
  - Content that lacks:
    - Strong visuals/spectacle
    - A fast emotional pull early on
    - A simple, quickly digestible premise
  - Instructor emphasized: **complexity is penalized.**
    - If an idea requires time and effort to understand, it often won't survive the competition in the "ocean" of daily uploads.
    - Successful content typically offers an immediately intuitive emotional reaction (outrage, humor, strong agreement).
- **Chat-driven moderation discussion (Khadija and instructor):**
  - Hate speech and graphic violence can be **suppressed**, depending on platform policy and enforcement.
  - Instructor's complicating point: these categories can also **spread quickly** if platforms do not intervene, because:
    - Outrage and agreement both generate comments/engagement.
    - The algorithm does not distinguish "positive" from "negative" comment sentiment; **engagement is engagement**.
  - Note: hate speech can be **embedded in memes** (an "irony layer"), making moderation harder.
### 6) Linking Algorithms to Course Theme: Why Social Media is Fertile Ground for Propaganda
- Instructor guided a "full circle" connection:
  - Successful influencer content relies on **rapid emotional engagement**.
  - Propaganda exploits the **same psychology** (quick emotional capture), but with different goals:
    - Social media platforms: **profit** (engagement → revenue)
    - Propaganda: **disrupt** or **control** (selling ideas/narratives)
- Transition to earlier course content:
  - Reminder of the Arab Spring discussion: early optimism that social media would strengthen democracy by enabling organization and a "public square."
  - Present-day reversal: social media can harm democracy through **polarization** driven by its engagement-based business model.
### 7) Polarization, Echo Chambers, and Democratic Breakdown
- Instructor elicited definitions:
  - Polarization: not merely "two sides," but a condition where groups **no longer share the common reality, facts, or values** needed for debate.
  - Echo chamber: repeated exposure to the same ideas, with little meaningful contact with outside perspectives; can drive views toward extremes over time.
- Instructor noted a democratic prerequisite:
  - Debate requires some shared baseline reality; polarization undermines this, making debate futile.
### 8) Propaganda + Polarization: Does Polarization Help Disruption and Control?
- Instructor posed application questions tied to propaganda's goals:
  - **Disruption goal:** polarization makes disruption easier (adding fuel to existing distrust and chaos).
  - **Control goal:** polarization makes control easier ("divide and conquer").
- Examples referenced:
  - Stalin-era forced migrations as a control strategy (breaking up groups, increasing distrust, making central authority the mediator).
  - A student mentioned British policy in Afghanistan/India as an example of divide-and-conquer logic (the student declined to expand due to speaking difficulty).
### 9) Attempted Solutions to Polarization: "Change the Algorithm?" and Bail's Study
- Brainstorming how to reduce polarization:
  - Highlight shared values and similarities across groups.
  - Encourage seeing/hearing different viewpoints.
  - "Change the algorithm" to diversify feeds.
  - One student suggestion: reduce economic incentives (e.g., influencer revenue caps) to weaken polarization drivers.
- Instructor introduced Bail's study as a test of the common intuition:
  - If echo chambers create polarization, then "popping the bubble" by exposing users to opposing content should reduce polarization.
### 10) Bail's Findings: Exposure Backfires (and Why)
- Class answered: Bail's intervention did **not** depolarize participants.
  - Democrats: little to no change when shown Republican content.
  - Republicans: exposure to liberal content sometimes made them **more** polarized (they moved further right).
- Instructor framed this as a major problem for simplistic "just show both sides" solutions.
### 11) Second Breakout: Why Bail Thought It Would Work (Theoretical Rationale)
**Breakout prompt (5 minutes):**
- Open the Bail reading and explain, using course language, why Bail's team expected depolarization to occur.
**Whole-class share-out:**
- **Samira:** expected the effect because opposite views reduce stereotypes and increase open-mindedness, like meeting the outgroup in real life.
- **Ruslan:** explicitly named **contact theory / intergroup contact theory**: contact reduces stereotypes and increases mutual understanding.
- **Ainula:** exposure was intended to challenge stereotypes and lead to understanding across groups.
### 12) Why It Failed: Identity Response (Van Bavel Connection)
- Instructor asked students to connect to Van Bavel:
  - Key mechanism: **identity** (group/tribal identity) drives political alignment and conflict more than pure "ideas."
- Student explanation (Nazbikia): exposure to opposing "opinion leaders" triggered distrust; people did not want to listen to outgroup sources.
- Instructor synthesis:
  - Opposing content is perceived as an **identity threat/attack**, not as neutral information.
  - Thus, exposure does not create deliberation; it provokes defensive reactions and counter-arguing.
### 13) Course Transition & Next Steps: From Theory to Resilience/Verification
- Instructor concluded that this marks the end of the first course segment:
  - Propaganda: definition + goals (disrupt/control)
  - Why it works psychologically (the emotional hook)
  - How it spreads today (platform logic + polarization dynamics)
- Preview of next unit:
  - Focus shifts to **solutions**: "immunizing" people against propaganda and improving resilience.
  - Next week: **verification techniques & open-source intelligence (OSINT)** workshops.
  - **No reading** assigned for next week.
### 14) Logistics: Office Hours
- A student (Helen) asked about office hours.
- Instructor confirmed:
  - **Wednesdays, 12:00–14:00 Bishkek time**
  - Students should message to schedule; alternative times are possible if time zones conflict.
- Noted briefly that the office hours may not have been easy to find; instructor confirmed (after checking) that they are listed in the syllabus.
---
## Actionable Items (Short Bullet Points, Organized by Urgency)
### High Urgency (Before Next Class)
- **Prepare for the in-class OSINT/verification workshops** (no reading, but tools/exercises should be ready).
- Ensure students know: **no reading next week**, but active workshop participation is expected.
### Medium Urgency (Course Administration / Follow-up)
- **Office-hours clarity:** confirm office hours are clearly visible in the syllabus/LMS; if confusion persists, post an announcement: Wednesdays, 12:00–14:00 Bishkek time (by appointment/message).
- **Breakout-room stability:** consider pre-assigning breakout rooms in Zoom (if possible) or keeping a saved roster to reduce time lost to reassignment.
### Lower Urgency (Instructional Continuity)
- Carry forward a "running list" for students next session:
  - Algorithm reward signals (watch time, re-watches, comments, hook, simplicity, emotional resonance, controversy)
  - Penalty factors (complexity, weak visuals, slow start; plus moderated categories, depending on platform policy)
- Plan to explicitly bridge into the next unit: how verification/OSINT can function as a practical "immunity" toolkit against emotionally manipulative content.
## Homework Instructions
**NO HOMEWORK.**
The professor explicitly says "there's no reading for next week" and only mentions in-class "workshops" ("Next week… there's no reading for next week. We're going to be doing a bunch of workshops in class"), without assigning any out-of-class task to complete or submit.