Lesson Report
Title: Algorithmic Bias Beyond Code: Structural Layers of Facial Recognition + Policy Memo “Shark Tank” Prep
Synopsis: The session combined assignment logistics with a structured, multi-layer analysis of algorithmic bias using facial recognition as the core case. Students examined how technical choices, industry incentives, cultural defaults, and lived experience interact to produce persistent inequities, then synthesized those layers through collaborative diagrams to prepare for deeper discussion next class.

Attendance
– Absent (explicitly mentioned): 0
– Students who spoke or were named: Elijah, Niloufar, Vic, Gavin, Ermahan, Varfia (Barfia), Anoush, Samira, Lucia, Sapi, Imad, Freshta, Bekaim

Topics Covered (chronological with activity/topic names)

1) Check-in and Admin Setup
– Brief scheduling housekeeping; instructor noted availability after class and plans to check meeting times during activities.

2) Policy Memo Briefing and Post-Submission Plan
– Deadline: Saturday, November 1, 23:59 Bishkek time.
– Topic feedback: Students invited to email topics for guidance; instructor will respond with advice.
– After submission: Ungraded, “Shark Tank”-style mini-presentations.
– Format: 3-minute pitch per student.
– Content: 1) Clearly frame the AI-and-democracy problem; 2) Propose a concise, persuasive policy solution.
– Audience role: Classmates act as “sharks,” offering buy/no-buy feedback; light, playful “investment” via emojis in chat.
– Purpose: Share learning, practice persuasive framing, not a high-stakes presentation.

3) Assignment Q&A: Timing, Extensions, Sources, and Access
– Time zone clarification (US-based student): Due 23:59 Saturday, Bishkek time; the instructor explained this is Saturday evening in Bishkek and Saturday morning in New York.
– Time converter: Instructor will share a personal time-converter link (to be posted).
– Extension request (Freshta): Declined due to heavy November grading workload.
– eCourse access: Elijah (and possibly Imad) still lack access; instructor will email again to resolve; advised email submission if access not restored by Friday night.
– Sources for memo (later Q&A):
– Requirement affirmed: 2 course sources + 2 external sources.
– External sources can include government websites.
– Scope advice: If covering multiple related issues, unify them under a single policy package/throughline suitable for a memo.
– No video journal this week.

4) Framing the Day’s Inquiry: From “Democratic Algorithms” to “Just Algorithms”
– Recap: Prior debate asked whether a hypothetically perfect, always-correct algorithm would be democratic; today extends to whether such systems would be just, even if “democratic.”
– Core question: Why do algorithmic inequities persist even when developers are aware of them?

5) Case Study Launch: Facial Recognition Bias (Joy Buolamwini/Coded Bias)
– Image prompt: Joy Buolamwini wearing a white mask to trigger facial recognition—without the mask, the system fails to detect her face.
– Prompt to class: Identify why the system fails here; think beyond purely technical reasons.

6) Individual Freewrite (approx. 3 minutes)
– Task: Write initial thoughts on why facial recognition failed to detect Joy’s face without a mask. No deep technical detail required—surface the most plausible causes.

7) Debrief of Freewrite: Initial Causes Identified
– Student insights:
– Training data imbalance (overrepresentation of white faces; underrepresentation of Black and brown faces).
– Algorithmic bias manifests as misidentification/non-detection of darker skin tones.
– Environmental and hardware factors (lighting, camera/infrared differences) can magnify bias.
– Team composition and perspectives: If builders do not represent affected groups, blind spots persist.
– Instructor pivot: If even non-experts can anticipate these issues and researchers acknowledge them, why do they persist? Signals deeper structural drivers.
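
Supplementary illustration (not from the class discussion): the data-imbalance and testing points above can be made concrete with a disaggregated evaluation, i.e., reporting detection rates per demographic group rather than as a single aggregate. The minimal Python sketch below uses invented group labels and records purely for illustration; it is not the evaluation method discussed in class.

```python
from collections import defaultdict

# Hypothetical evaluation records: (demographic group, was a face detected?).
# In practice these would come from running a face-detection system over a
# labeled benchmark; the values here are invented for illustration only.
results = [
    ("lighter-skinned women", True),
    ("lighter-skinned men", True),
    ("darker-skinned women", False),
    ("darker-skinned men", True),
    ("darker-skinned women", False),
    ("lighter-skinned men", True),
]

def detection_rate_by_group(records):
    """Share of faces detected, reported separately for each group."""
    tallies = defaultdict(lambda: [0, 0])  # group -> [detected count, total count]
    for group, detected in records:
        tallies[group][0] += int(detected)
        tallies[group][1] += 1
    return {group: detected / total for group, (detected, total) in tallies.items()}

overall = sum(detected for _, detected in results) / len(results)
print(f"overall detection rate: {overall:.2f}")
for group, rate in detection_rate_by_group(results).items():
    print(f"{group}: {rate:.2f}")

# A single aggregate number can look acceptable while one group's rate is far
# lower, which is why per-group reporting surfaces failures that averages hide.
```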

8) Multi-Layer Framework Introduced: Four Interlocking Layers of Causation
– Layer 1: Code and Data
– Focus: What’s wrong in the models/datasets? Collection, labeling, representativeness, testing regimes.
– Layer 2: Industry
– Focus: People and money—who works in top labs (e.g., DeepMind, OpenAI), who funds them, and what commercial/governmental priorities steer what gets built and shipped.
– Layer 3: Culture
– Focus: Societal defaults about whose faces count as “standard” or “neutral,” and how media/advertising reinforce those defaults.
– Layer 4: Lived Experience
– Focus: Who bears the burden of demonstrating harms, enduring misclassification, and advocating for fixes? What costs are imposed on them?

9) Breakout Room Round 1: Single-Layer Analysis (5 minutes)
– Structure: Four rooms; each assigned one layer.
– Task: Brainstorm specific mechanisms within that layer that produce the Joy Buolamwini failure case (and similar outcomes).

10) Breakout Room Round 2: Cross-Teaching (5 minutes)
– New mixed groups with at least one representative from each layer.
– Task: Each representative explains their layer’s findings to peers; group gains a multi-layer view.

11) Breakout Room Round 3: Venn-Diagram Synthesis (10 minutes)
– Instructor demoed a 4-quadrant whiteboard setup (Code/Data, Industry, Culture, Lived Experience).
– Task:
– In-group: List key issues in each quadrant.
– Draw connections: How industry incentives shape code/data choices; how cultural defaults shape industry priorities and evaluation criteria; how lived experience both reveals harms and is shaped by failures; how feedback loops sustain bias.
– Deliverable: A coherent diagram showing component problems and inter-layer linkages that sustain the observed outcomes (e.g., facial recognition failing for non-white faces).

12) Whole-Class Share-Out and Synthesis
– One group summary (Elijah) highlighted:
– Code/Data: Limited, biased training data reflecting narrow perspectives; pipeline decisions that overlook robustness for darker skin tones.
– Industry: US tech-company ecosystems (e.g., partnerships with government, tech for surveillance/defense) prioritize applications and timelines that underweight equity; leadership demographics and power structures (white, male-dominated) shape priorities.
– Culture: Media and news patterns normalize associating Black people with criminality/danger, creating unconscious associations that seep into product assumptions and risk models.
– Lived Experience: People like Joy are forced into “scholar-activist” roles to prove harms and fix problems they did not create—additional labor and cost borne by impacted communities.
– Instructor preview: Next class will deepen the structural analysis and continue with group diagram presentations.

13) Closing Admin and Next Steps
– Reading: A short (~11-page) reading will be posted on eCourse; the instructor will try to assign the Coded Bias documentary if a free link is available; otherwise, the class will proceed with the reading.
– Office hours/meetings:
– Sapi: Tomorrow at 3:00 p.m. (confirm details).
– Imad: Tomorrow at 1:00 p.m. (confirm details).
– eCourse access: Instructor will follow up with IT/admin; Elijah to email assignment if access not restored by Friday night.
– Time-converter link to be shared in chat/LMS.

Actionable Items

Time-sensitive (before next class)
– Post materials:
– Upload the ~11-page reading to eCourse.
– Check availability of Coded Bias (documentary) and, if accessible, assign with viewing instructions; otherwise confirm the reading assignment.
– Clarify memo deadline/time:
– Re-state exact due time and date in Bishkek time and provide conversions for major student locations; resolve any phrasing confusion (“one minute before midnight Sunday” vs. 23:59 Saturday).
– Share the promised time-converter link in LMS/email.
– eCourse access:
– Email IT/admin again about Elijah’s (and any others’) access issues; confirm resolution with students.
– Communicate fallback submission by email if access is not restored by Friday night.
– Thursday lesson plan:
– Allocate time for remaining groups to present their diagrams.
– Prepare prompts to deepen the structural linkages across the four layers and connect to additional cases beyond facial recognition.

Short-term (this week, pre-memo)
– Shark Tank logistics:
– Post one-pager with format, time limit, audience role, and “emoji investment” instructions; specify presentation dates.
– Memo guidance:
– Reiterate requirement of 2 course sources + 2 outside sources (including acceptable types like government websites).
– Encourage students synthesizing multiple issues to frame a coherent policy package with a single throughline.
– If available, share a rubric or checklist (problem statement, stakeholders, policy options, recommendation, feasibility, risks, implementation).

Individual follow-ups
– Sapi: Confirm 3:00 p.m. meeting details (location/Zoom link).
– Imad: Confirm 1:00 p.m. meeting details (location/Zoom link).
– Elijah: If eCourse access remains blocked by Friday night, confirm acceptance of memo via email and provide submission instructions (subject line format, file naming, etc.).

Nice-to-have (process improvements)
– Provide a shared template or whiteboard link for the four-layer diagram so groups can save and submit their visuals.
– Post a curated resource note on Joy Buolamwini and the Algorithmic Justice League, with the corrected title “Coded Bias,” to replace the auto-transcription error (“Poland Bias”).

Homework Instructions:
ASSIGNMENT #1: Policy Memo Essay

You will write a policy memo that identifies a specific AI-and-democracy problem and proposes a clear, feasible policy solution. The goal is to apply the structural lenses we discussed in class (code/data, industry, culture, lived experience) to diagnose the problem and then persuade a policymaker audience that your solution addresses root causes, not just symptoms. You will later share your memo informally in a brief Shark Tank–style, 3-minute pitch after submission (the pitch is ungraded).

Instructions:
1) Choose your problem and audience
– Select a focused AI-and-democracy issue (for example, algorithmic bias in facial recognition like the example with Joy’s mask, or another topic you’ve cleared with the professor).
– Identify a realistic policymaker audience (e.g., a ministry, regulator, or institutional decision-maker) your memo is written to persuade.

2) Define the problem clearly
– In the opening, state the concrete problem, why it matters for democratic practice/values, and who is affected.
– Use the class’s structural lenses to frame the causes:
• Code/data: What data or model-design issues are at play?
• Industry: Whose incentives, funding, or governance priorities shape what gets built?
• Culture: What “default face” or similar assumptions get reproduced through media and norms?
• Lived experience: Who bears the burden of pointing out failures and the costs of those failures?

3) Propose your policy solution
– Present 1–3 specific, implementable actions (e.g., procurement standards, auditing requirements, data-collection reforms, oversight mechanisms).
– If you have multiple ideas, join them under one coherent policy package. The professor emphasized that in a memo, you should “join them together under one common idea” so the pieces work as a unified solution.

4) Justify why your solution will work
– Explain how your recommendation addresses the structural causes you identified (not just a quick technical fix).
– Anticipate likely concerns (costs, feasibility, trade-offs) and respond persuasively.

5) Support with evidence and sources
– Use at least 4 sources:
• 2 course texts
• 2 outside sources (outside sources may include credible government websites)
– Integrate evidence to substantiate both the problem and the effectiveness of your proposal. Use a consistent citation style.

6) Write concisely in memo style
– Aim for clear headings, succinct paragraphs, and direct language appropriate for decision-makers.
– Make your argument “stick” as if you had only three minutes to convince skeptical “sharks.” This will also help you prepare for the informal, ungraded class pitch after submission.

7) Final checks
– Confirm your memo aligns with the deadline and requirements discussed in class.
– If you have questions about your topic, you are welcome to email the professor for feedback on fit and scope (as noted in class).

8) Submit by the deadline
– Due: Saturday, November 1 at 23:59 Bishkek time.
• Note on time zones: This corresponds to Saturday morning in New York (as discussed in class). Double-check your local time conversion.
– Submit via the assignment link. If you still do not have access to the course site by Friday night, email your memo to the professor as a fallback.

9) Prepare for the post-submission class activity (ungraded)
– After you submit, you will give a very short, Shark Tank–style pitch (3 minutes) introducing:
• The AI-and-democracy problem
• Your policy solution and why it works
– Keep this in mind as you write so your key points are clear and compelling.
