Lesson Report
Title: Reverse-Engineering the Algorithmic Self: Debrief of YouTube Polarization Experiment + Digital Twin Profiling
In this session, students analyzed how platform algorithms shape what they see and inferred what those systems “think” about them. We first debriefed the mini YouTube experiment on 15-minute cities (self- vs. community-generated exposure), then ran a structured activity to reverse-engineer each student’s social media feed and write a personality profile of a peer’s “digital twin” based on saved Reels/Shorts/TikToks.

Attendance
– Students mentioned absent: 0
– Note: Class ended ~15 minutes early due to an instructor meeting.

Topics Covered (chronological)
1) Quick start and logistics
– Audio check; reminder that class would end ~15 minutes early due to an instructor meeting.
– Today’s plan: brief debrief of last class’s YouTube experiment, then a new activity analyzing social media feeds and constructing “digital twin” profiles.

2) Debrief: YouTube Mini-Experiment on 15-Minute Cities (self-generated vs. community-generated exposure)
– Context reminder:
– Groups previously ran a “mini science experiment” on YouTube to test how recommendations shift based on two conditions:
– Self-generated: students searched and selected content themselves.
– Community-generated: students followed pre-shared links provided by the instructor (simulating being sent videos from friends/communities).
– Groups were also assigned stance contexts (e.g., for/against walkable cities/15-minute cities) to probe whether platforms amplify a given direction.
– Student presentations and findings:
– Elijah (self-generated; against walkable cities): Initially received pro-walkable content despite anti-walkable queries; after interacting with a suggested video aligned with the “against” stance, the feed shifted to more anti-walkable content.
– Group example (self-generated; “problems with public transportation” queries):
– First results: “Three problems with public transportation,” “Why the US gave up on public transit,” “Why NYC subways are such a mess.”
– Recommendation rail thereafter tilted toward negative takes on public transit; some mixed “anti-carâ€� items also appeared (e.g., privacy issues with automakers, car loan delinquencies), showing the feed can intermingle adjacent topics while still circling a theme.
– Community-generated example (following instructor link to “15-minute city conspiracy theory explained”):
– After refreshes, the recommendation rail produced mostly unrelated videos (e.g., Tesla engineers quitting, farmer seed laws, a chemistry demo: “this liquid explodes when shaken”) rather than more 15-minute city/urbanism content.
– Additional note from another student (community-generated condition):
– Initial searches like “cancel cars” surfaced some related items up top but quickly mixed in unrelated Shorts further down the page.
– Instructor synthesis:
– Results align with Cho’s model discussed earlier: self-initiated search and selection tend to trigger stronger recommendation cascades (“rabbit holes”) toward related and potentially more extreme content; passively receiving links from outside YouTube (e.g., Telegram/WhatsApp/Facebook shares) has a weaker, less consistent effect on one’s recommendation graph.
– Why this matters: Platforms are more likely to intensify interests (and polarization) when users search and click within the platform than when they merely watch externally shared links that don’t deeply alter the recommendation system’s picture of the user (a toy sketch of this contrast follows below).
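To make the contrast concrete, here is a minimal toy sketch in Python. It is not any real platform’s algorithm; the signal names and weights are invented for illustration. It only shows the idea that in-platform searches and clicks shift a user’s inferred interest profile far more than passively watching an externally shared link.

    # Toy model only: signal names and weights are assumptions for teaching,
    # not taken from any actual recommender system.
    from collections import defaultdict

    SIGNAL_WEIGHTS = {
        "in_platform_search": 1.0,   # user typed a query inside the platform
        "in_platform_click": 0.8,    # user clicked a recommended video
        "external_link_view": 0.2,   # user watched a link shared from outside
    }

    def update_profile(profile, topic, signal):
        """Nudge the inferred interest in `topic` by the weight of `signal`."""
        profile[topic] += SIGNAL_WEIGHTS[signal]

    profile = defaultdict(float)

    # Self-generated condition: search + click on anti-walkable-city content.
    update_profile(profile, "anti_walkable_cities", "in_platform_search")
    update_profile(profile, "anti_walkable_cities", "in_platform_click")

    # Community-generated condition: only watching a shared link.
    update_profile(profile, "15_minute_city_conspiracy", "external_link_view")

    print(dict(profile))
    # {'anti_walkable_cities': 1.8, '15_minute_city_conspiracy': 0.2}
    # The self-generated condition moves the inferred profile much further,
    # which is why it is more likely to trigger a recommendation cascade
    # ("rabbit hole").

The only claim being illustrated is the relative size of the updates: actively generated signals dominate the platform’s picture of the user.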

3) Activity Part I: Collecting personal feed samples (Reels/Shorts/TikTok)
– Quick poll: who has TikTok/Instagram? Three students reported having neither; they were paired with classmates who do.
– Pairings arranged (examples noted): Barfia with Mohamed Omar; Ermahan with Nurani; Freshta confirmed Facebook Reels access.
– Instructions:
– Platform options: Instagram Reels, TikTok, YouTube Shorts, Facebook Reels (LinkedIn Reels also exists, though not required).
– Task: Scroll through 10 items; save every other one (5 saves total).
– Timebox: ~2–3 minutes to collect 5 saved videos.
– Goal: Build a small, recent, platform-selected sample to analyze what the algorithm currently thinks will hold attention.

4) Activity Part II: Per-video micro-analysis and documentation
– For each of the 5 saved items, students noted:
– Topic: What is the content about (e.g., cooking, politics, fitness, urbanism)?
– Vibe: Emotional/tonal quality (e.g., funny, angry, aesthetic, educational, motivational).
– Inferred interest: What specific personal interest did the algorithm target (i.e., why this to you)?
– Logistics:
– Students without accounts worked with partners (breakout rooms enabled; screen sharing allowed as needed).
– Each student compiled their notes into a single Google Doc and included shareable links/URLs to each video (one possible note structure is sketched after this list).
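For reference, below is a minimal sketch of one way to structure the per-video notes in Python. The field names mirror the in-class prompts (topic, vibe, inferred interest); the structure itself and the placeholder link are assumptions, not a required format.

    # One possible way to record the five saved videos; the structure and the
    # placeholder URL are illustrative, not a required format.
    from dataclasses import dataclass

    @dataclass
    class VideoNote:
        url: str                # shareable link to the Reel/Short/TikTok
        topic: str              # what the content is about
        vibe: str               # emotional/tonal quality
        inferred_interest: str  # what the algorithm seems to be targeting

    notes = [
        VideoNote(
            url="https://example.com/reel/1",  # placeholder link
            topic="cooking",
            vibe="aesthetic",
            inferred_interest="quick home recipes",
        ),
        # ...four more entries, one per saved video
    ]

    for i, note in enumerate(notes, start=1):
        print(f"{i}. {note.topic} ({note.vibe}) -> {note.inferred_interest}")

The same three fields can just as easily live in a table inside the Google Doc; the point is to keep topic, vibe, and inferred interest separate so they can be compared across the class later.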

5) Activity Part III: Digital twin profiling (group work)
– Framing:
– Distinguish between the “real you” and your “algorithmic self” (digital twin): a simplified profile inferred from limited behavioral data.
– Aim: Describe who the platform thinks the user is and what it thinks they care most about.
– Instructions:
– In 3-person breakout rooms, share Google Docs.
– Each student selects a peer’s document and writes a concise one-paragraph “personality profile” of that person’s digital twin based solely on the 5 items (topics, vibes, inferred interests).
– Sample share-outs (no names attached unless volunteered):
– “Romantic traveler/aesthetic seeker”: feed suggests a love for atmospheric, cinematic cityscapes (Paris/Switzerland), prioritizing beauty and vibe.
– “Beauty/fashion + adulthood stress”: content on makeup, hair, and style, plus life-stage stress/pressure themes, suggesting the algorithm links an interest in self-presentation with adulting concerns.
– “Humor-forward coping”: heavy on memes (including possible dark humor), interpreted as stress relief (e.g., academic pressure); another profile mixed memes with travel hacks.
– “Career-driven aspiring software engineer”: videos on Microsoft coding interviews, productivity, time management, and “how successful students work,” signaling focus, discipline, and career ambition.
– Instructor note:
– Save these docs and profiles; we’ll reuse them to unpack not only what the algorithm infers but how/why it arrived there (signals like watch time, click-through, topic adjacency, account history, etc.).

6) Wrap-up and preview
– Next week’s focus: Disinformation and synthetic media (including deepfakes).
– A reading on disinformation tactics will be posted tonight for Tuesday’s class.
– Reminder to keep all Google Docs and digital twin profiles for an upcoming synthesis.

Actionable Items
Urgent (before next class)
– Post the disinformation/synthetic media reading for Tuesday.
– Create and share a single submission location (folder or LMS assignment) for:
– Each student’s Google Doc with 5 videos analyzed (topics, vibes, inferred interests, links).
– The one-paragraph digital twin profiles students wrote for peers.
– Confirm that students without social media accounts have their analyses included via partners’ submissions.

Upcoming / Plan-ahead
– Aggregate & review: Compile a class-wide snapshot of topics/vibes/inferred interests to identify common clusters and outliers; prepare a debrief on how algorithms may have inferred these interests (a minimal aggregation sketch follows this list).
– Revisit YouTube experiment: Collect any missing diagrams/screenshots; prepare a brief comparative analysis of self- vs. community-generated conditions to connect to literature (e.g., “rabbit holes,” personalization strength, polarization dynamics).
– Design next session’s structure:
– Short formative check on the disinformation reading (quiz or brief prompts).
– Curate examples of synthetic media/deepfakes with clear ethics guidance and discussion questions.
– Connect digital twin findings to susceptibility/resilience: how a platform’s portrait of us can shape exposure to mis/disinformation.
– Accessibility/Privacy follow-up:
– Offer an alternative pathway for students with limited platform access or privacy concerns (e.g., using YouTube Shorts via a clean browser profile or using a provided anonymized feed sample).
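As a planning aid for the aggregate-and-review item above, here is a minimal sketch assuming the class notes are exported to a CSV named class_feed_notes.csv with student, topic, vibe, and inferred_interest columns (the file name and layout are assumptions) of how common clusters and outliers could be tallied:

    # Sketch of a class-wide snapshot. The CSV name and columns
    # (student, topic, vibe, inferred_interest) are assumed for illustration.
    import csv
    from collections import Counter

    topic_counts = Counter()
    vibe_counts = Counter()

    with open("class_feed_notes.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            topic_counts[row["topic"].strip().lower()] += 1
            vibe_counts[row["vibe"].strip().lower()] += 1

    print("Most common topics:", topic_counts.most_common(5))
    print("Most common vibes:", vibe_counts.most_common(5))

    # Outliers: topics that appear only once across the whole class.
    print("Outlier topics:", [t for t, n in topic_counts.items() if n == 1])

A shared spreadsheet with a pivot table would work just as well; the sketch only shows how little structure is needed to get the class-wide view.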

Homework Instructions:
ASSIGNMENT #1: Readings on Disinformation and Synthetic Media (Deepfakes)

You will read a short piece on disinformation tactics and synthetic media (including deepfakes) to prepare for next class’s discussion and to connect today’s “digital twin” activity to how platforms can be used to spread misleading content.

Instructions:
1) Locate the reading your instructor will post later tonight on disinformation and synthetic media (deepfakes).
2) Read it carefully before Tuesday’s class.
3) As you read, take brief notes that address:
– Key terms: define deepfake, synthetic media, disinformation vs. misinformation, and any new vocabulary introduced.
– Tactics: list at least three techniques or strategies used to spread disinformation.
– Detection: note at least three cues, tools, or strategies for identifying manipulated or synthetic content.
– Algorithmic angle: jot down 2–3 sentences linking the reading to today’s work on recommendation systems and your “digital twin” (for example, how a platform’s profile of you could make you more likely to see or believe certain deceptive content).
4) Bring your notes to class (digital or paper) and be prepared to discuss one concrete insight or example from the reading.

ASSIGNMENT #2: Preserve Your “Digital Twin” Materials for Upcoming Use

You generated and shared a Google Doc today that profiles your feed (topics, vibes, inferred interests) and wrote a one-paragraph personality profile for a classmate’s digital twin. You will use these again soon, so make sure everything is saved and accessible.

Instructions:
1) Open the Google Doc you created today that contains:
– Your five saved videos (Reels/Shorts/TikToks) with working links.
– For each video: topic, vibe, and the inferred interest the platform may be targeting.
2) Confirm that all video links work. If any link is broken, replace it by copying a fresh shareable URL.
3) Add your one-paragraph “digital twin” profile that you wrote about a classmate (and, if you received one about you, paste that into your Doc as well or save it in the same folder).
4) Check sharing settings so you can quickly share or display the Doc next time (e.g., set to “anyone with the link can view” if appropriate).
5) Organize: name the file “YourName_DigitalTwin_Profile” and store it where you can easily retrieve it during class.
6) No submission is required now, but do not lose this file—you will need it for an upcoming activity.
