Lesson Report
Title: Alexa, Terms of Service, and the Data Pipeline to Political Microtargeting
In this session, students interrogated the gap between the friendly, “helpful assistant” branding of Amazon’s Alexa and the legal/technical realities embedded in its Terms of Service. The class then pivoted from commercial data extraction to political data refining, mapping how everyday digital traces are transformed into sellable voter profiles used for microtargeted persuasion.
Attendance
– Absent: 1 (Safi; instructor requested an “excuse sheet” to excuse a prior absence)
– Notes: 4 new students joined (awaiting LMS/eCourse access)
Topics Covered
1) Kickoff and Continuity
– Quick recap of previous session: How market-driven logic and permissive legal frameworks enabled targeted advertising; review of the Alexa ad and the exercise of reading Amazon’s Terms of Service (ToS).
– Objectives for today:
– Analyze the conflict between Alexa’s advertised “helpful, private assistant” persona and the ToS reality.
– Connect commercial data extraction to political microtargeting.
– Build a process flow from raw personal data to sellable political profiles.
2) What Alexa Is (Hardware and Pipeline)
– Device anatomy (student recall and confirmation):
– High-quality Bluetooth speaker, far-field microphones, small internet-connected computer.
– Always-on listening for wake words; audio snippets sent to Amazon servers.
– Server-side functions and data uses:
– Storage of user voice interactions “in the cloud.”
– Multi-purpose model training: voice recognition/replication and LLMs (e.g., text prediction).
– Explicit possibility in ToS to model individual users’ voices (not just generic voice features).
3) Group Activity 1: Ad Promise vs. ToS Reality (Breakout rooms; groups of ~5)
Prompt:
– Identify 2–3 specific conflicts between what the Alexa ad promises (feelings, imagery, assurances) and what the ToS permits (data practices, rights reserved by Amazon).
Key findings shared (synthesized across groups):
– Persona vs. Purpose
– Ad imagery: Alexa as a friendly, trustworthy, quasi-family member that simplifies home life.
– ToS reality: A sophisticated data collection system optimized for Amazon’s interests—continuous recording/storage of interactions; used to train AI models, improve services, enable frictionless commerce.
– Privacy vs. Persistent Surveillance
– Ad implication: Private, safe, at-home convenience.
– ToS reality: Voice data stored; contacts and location may be imported/retained; cross-service data sharing/ingestion (e.g., Spotify usage) with third parties; long retention windows.
– Responsibility vs. Liability Limits
– Ad implication: Dependable, competent helper in the heart of the home.
– ToS reality: Amazon limits liability for malfunctions, misrecognitions, and potential harms; the legal burden shifts to the user despite a “trust me” marketing frame.
– Safety vs. Exposure
– Ad implication: Safe home technology.
– Identified risks: Misrecognition/false triggers, noisy-environment errors, potential attack surface for hacking; broader concern about possible backdoor access by state actors (not claimed in ToS, but a foreseeable risk).
– Care vs. Commodification (voice model concern)
– Ad implication: Personalized help.
– ToS reality: Individual voice modeling could later be licensed/sold, potentially enabling deepfake markets or repurposing likenesses; users’ voices become corporate assets.
Instructor synthesis/challenge:
– Why the unease if it’s legal and “you get what you paid for”?
– Mismatch between intimate, in-home setting and opaque, ongoing data extraction.
– Asymmetric information: most users never read or understand the ToS, so consent to the “deal” is not meaningfully informed.
– Data is quietly repurposed beyond helping the user—across ad ecosystems and model training—without the human-legible disclosures the ad suggests.
4) From Commercial Extraction to Political Targeting
– Bridge concept: Data as oil
– Raw digital traces (crude oil) have low inherent value; value emerges after refining, categorizing, and packaging into actionable profiles (gasoline/products).
– Revisited the “Anna” profile (from last class):
– Inferred attributes: Financial stress; active job/career seeking; interest in housing; high vulnerability to economic messaging.
– Important note: Algorithms search for many “Annas”—they work at scale to assemble cohorts with similar vulnerabilities, not one-off humans.
5) Group Activity 2: Build the Data-to-Profiling Flowchart (Extraction → Refining → Packaging → Sale)
Instructions given:
– Map how Anna’s scattered digital actions become a product a political actor can buy.
– Four stages to include:
1) Extraction: What is collected and from where?
2) Refining: How is it organized/analyzed into signals?
3) Packaging: How are insights labeled and made marketable?
4) Sale: How is it delivered/transacted to the buyer?
Exemplars from share-out:
– Group 1 (skeleton pipeline):
– Data collection → Profile building → Political values/targets → Sell to political orgs.
– Instructor feedback: Good structure; add concrete “crude oil” details (exact signals, sources) and clearer refining steps that lead to those insights.
– Group 2 (data specificity emphasis):
– Extraction examples: Demographics (age, location, education, income), social content engagement (likes, dwell time, comments), app usage (budgeting tools; LinkedIn), group memberships (alumni, homebuyer support).
– Inference: These traces reveal current needs/challenges (career focus, financial pressures) that guide downstream targeting.
– Instructor highlight: Include fine-grained engagement metrics (watch time, replays, pauses, comment behavior) that platforms actively track.
– Group 3 (algorithmic refinement and categorization):
– Refining: Correlation analysis across features; categorization into audience segments; probability scoring for “persuadability” or “hypothetical voter.”
– Packaging: Segment labels and propensity scores for downstream ad systems; clear, comparable categories that media buyers can filter.
– Sale: Hand-off to political advertisers via platform dashboards or data brokers; activation as microtargeted ad buys.
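To make the four stages concrete for students, the pipeline Group 3 described can be sketched as a toy program. Everything here is a made-up classroom illustration — the signals, thresholds, segment label, and scoring rule are invented assumptions, not how any real platform or data broker works:

```python
# Toy sketch of the extraction -> refining -> packaging -> sale pipeline.
# All names ("Anna"-style signals, labels, thresholds) are illustrative only.

# Extraction: raw digital traces collected about one hypothetical user.
raw_traces = {
    "liked_posts": ["budgeting tips", "resume advice", "first-home loans"],
    "app_usage": {"budgeting_app_minutes": 240, "linkedin_minutes": 310},
    "groups": ["alumni network", "homebuyer support"],
}

def refine(traces):
    """Refining: turn scattered traces into named signals (features)."""
    return {
        "career_seeking": traces["app_usage"]["linkedin_minutes"] > 120,
        "financial_stress": traces["app_usage"]["budgeting_app_minutes"] > 60,
        "housing_interest": "homebuyer support" in traces["groups"],
    }

def package(signals):
    """Packaging: attach a crude 'persuadability' score and a segment label."""
    score = sum(signals.values()) / len(signals)  # fraction of signals present
    label = ("economically anxious, high persuadability"
             if score > 0.5 else "low priority")
    return {"segment": label, "persuadability": score}

def sell(profile):
    """Sale: expose the packaged segment the way an ad dashboard might."""
    return (f"Segment '{profile['segment']}' available for targeting "
            f"(score={profile['persuadability']:.2f})")

profile = package(refine(raw_traces))
print(sell(profile))
```

The point of the sketch is the one-way flow: each function only sees the previous stage’s output, so by the “sale” step the buyer never touches (or needs) the raw traces — only the packaged label and score.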
– Note: Groups 4 and 5 ran out of time; they will present their flowcharts next session.
6) Closing, Next Steps, and Assignments
– Next week: Polarization—how algorithmic feeds (Instagram/TikTok) reinforce divergence; in-class activity comparing feeds and discussing mechanisms of polarizing exposure.
– Reading: To be posted tonight (instructor flagged a previously missing reading on legal failures/ToS regimes).
– Assignment: Video Reflection Journal #2 due Sunday, 11:59 pm (Bishkek time). Submit link (Drive/YouTube) via eCourse; if no LMS access yet, email the link directly to the instructor.
– Feedback: Instructor is actively reviewing Video Journal #1; feedback expected today/tomorrow.
Actionable Items
Urgent (before next class)
– Post all readings tonight, including the previously omitted reading on legal failure/ToS and data consent.
– Email next-week materials to the 4 new students lacking LMS access; confirm when they obtain access.
– Collect Safi’s excuse documentation to finalize absence excusal.
– Ensure Groups 4 and 5 retain their flowcharts for the first item next session (have them saved/ready to screen-share).
High priority (this week)
– Video Reflection Journal #2
– Due Sunday, 11:59 pm (Bishkek). Submit link on eCourse; students without LMS should email the link.
– Send a brief reminder 24 hours prior to the deadline.
– Journal feedback
– Continue returning feedback for Journal #1; aim to complete within 24–48 hours.
Preparation for next session
– Polarization activity logistics
– Ask students to bring access to Instagram or TikTok; pair students without accounts/devices.
– Prepare prompt and comparison framework (e.g., identify 5–10 items from feeds; code for topic, tone, stance, and engagement hooks).
Administrative follow-ups
– Syllabus distribution
– Re-send the syllabus to any student who didn’t receive it (e.g., Nirani) and post the link on LMS once accessible.
– Resource links from today
– Repost Alexa ToS worksheet and Alexa ad link on LMS (and email for those without access) for students to review.
Homework Instructions:
ASSIGNMENT #1: Video Reflection Journal #2
You will record a short video reflecting on this week’s work with the Alexa advertisement vs. Amazon’s Terms of Service, the “data-as-oil” pipeline (extraction → refining → packaging → sale), and the political microtargeting case centered on “Anna.” This helps you connect course concepts to your own experiences and prepare for next week’s discussion.
Instructions:
1) Record yourself speaking to camera for about 5 minutes. No slides are needed; keep it informal and reflective.
2) Address at least two of the following:
– Where do you see the biggest conflict between the Alexa ad’s promise (friendly, private helper) and the reality described in the Terms of Service (continuous data collection, model training, corporate liability limits)?
– Explain the extraction → refining → packaging → sale pipeline in your own words. Where in your daily app/device use do you think data about you is “extracted,” “refined,” and turned into a product?
– Reflect on “Anna”: which signals made her “vulnerable” to political persuasion, and have you noticed similar targeting in your feeds?
– Optional: Your reaction to voice modeling (building a model of your voice), hacking/“hot mic” risks, or why an arrangement can feel unsettling even if it is technically legal.
3) Upload the video to Google Drive or YouTube (set to Unlisted). Confirm that sharing permissions allow the link to open.
4) Submit only the link on the course page under “Video Journal #2.”
5) Due: Sunday, 11:59 PM Bishkek time.
6) If you still do not have access to the course site, email your link directly to the professor by the same deadline; if an access problem prevents you from submitting on time, notify the professor by email.
7) File naming: LastName_FirstName_VJ2. Aim for clear audio; captions or a brief description are appreciated but optional.
ASSIGNMENT #2: Preparatory Readings on Polarization
You will read the materials posted for next class on political polarization and algorithmic curation. This prepares you for an in-class activity using your Instagram/TikTok feeds to examine how data collection and microtargeting contribute to polarized information environments.
Instructions:
1) Later tonight, locate the assigned readings posted by the professor. If you do not yet have platform access, watch your email for the files.
2) Skim first for the main claims, then close-read. Take notes on:
– How polarization is defined and measured.
– Mechanisms that drive it (algorithmic curation, engagement optimization, microtargeted messaging).
– Legal/policy angles that allow large-scale data capture (e.g., Terms of Service) and how that data is later used politically.
3) Write down 2–3 discussion points or questions that link the readings to this week’s topics (Alexa TOS vs. ad, the data pipeline, Anna’s profile).
4) Bring one concrete example from your Instagram or TikTok feed that could illustrate polarization or targeted messaging (you can screenshot or describe it).
5) Be prepared to reference these notes in class.
6) Due: Complete before our next class.
ASSIGNMENT #3: Save and Prepare Your Group Flowchart for Presentation
We will begin next class by sharing the flowcharts that map how Anna’s online activity becomes a product sold to political organizations (extraction → refining → packaging → sale). Groups 4 and 5 will present first; others should be ready if called upon.
Instructions:
1) Ensure your group’s flowchart is saved in a shareable format (image, PDF, slide). Verify that it opens clearly.
2) Check that it includes:
– Extraction: concrete examples of what is collected (likes, watch time, comments, app usage, location, contacts, demographics).
– Refining: sorting, correlating, and analyzing signals; building features; generating insights (e.g., career stress, homebuying interest).
– Packaging: assigning audience labels/segments and predictive scores (e.g., “economically anxious suburban homeowner—high persuadability”).
– Sale/Delivery: how segments are made available to campaigns/advertisers (custom audiences, lookalike audiences, API/ad platform delivery).
3) Decide who will share the screen and who will explain. Aim for a 1–2 minute walkthrough.
4) Do not discard your in-class work; minor clean-up is fine, but keep the original logic intact.
5) Be ready to present at the start of the next class.