# Lesson Report
## Title
**Transition to the Final Project: Digital Hygiene Toolkits and Audience Personas**
This session shifted the class from the midterm policy memo toward the final project, a **digital hygiene toolkit** designed for a specific community facing propaganda. The lesson focused on two main objectives: first, clarifying what “digital hygiene” means in a grassroots, community-centered sense; and second, introducing the logic of **audience personas** so students can design interventions that respond to how people actually think, feel, and trust information.
---
## Attendance
– **Absent students explicitly mentioned:** **0**
– **Names mentioned absent:** None
– A formal attendance check was not captured in the transcript, but multiple students participated in chat, breakout rooms, and whole-class discussion.
---
## Topics Covered
### 1. Opening decision: move on from the midterm or spend more time on it
– The instructor opened by noting that, under the original schedule, the class would already have finished the midterm assignment, but two extra days in the schedule created room either to:
– continue reviewing policy memo material, or
– transition immediately into the final assignment.
– A quick class vote was conducted in chat.
– **Timur Musaev** was jokingly singled out for a “50-50” response.
– The instructor interpreted the class preference as a decision to **move on from the midterm** and begin final-project preparation.
---
### 2. Final project introduction: from top-down policy responses to grassroots digital hygiene
– The instructor introduced the final project as the creation of a **digital hygiene toolkit**.
– Core assignment frame:
– Students will identify a **community somewhere in the world** that is being threatened by propaganda.
– Unlike the midterm, which asked what an **NGO or institution** could do, the final asks what the **community itself** can do to better defend against propaganda.
– The instructor explicitly contrasted the two assignments:
– **Midterm:** top-down approach; institutional or NGO response.
– **Final:** bottom-up / grassroots approach; community self-defense.
– Students were told they **do not need to use the same community** from the midterm.
– If their current case has “run its course,” they may select a new one.
– The instructor said that Wednesday’s class would provide time to make that choice if needed.
– The instructor defined the day’s two major objectives:
1. understand the concept of **digital hygiene**;
2. begin work on the idea of an **audience persona**.
---
### 3. Warm-up writing activity: what propaganda targeting looks like and how ordinary people resist it
– Students were given two prompt questions and about three minutes to answer in chat:
1. What does it actually look like for a community to be targeted by propaganda?
2. What can ordinary people do to resist propaganda?
– The instructor specifically asked students **not** to stop at generic answers like “critical thinking” or “media literacy,” but to give **specific practices or actions**.
#### Student responses highlighted in discussion
– **Gulobov Ruslan Sodikovich** wrote that a targeted community often sees the **same message repeated everywhere**, with efforts to change opinion or generate **fear, anger, or support**. He also suggested resisting propaganda by **checking facts from different sources, not sharing suspicious posts, asking questions, and discussing information with others**.
– **Samatbekova Elaiym Samatbekovna** suggested that communities should **engage specialists and experts** who can provide deeper analysis of contested information.
– **Zulumbekov Alikhan Dastanbekovich** answered briefly with **“looking for evidence,”** which the instructor treated as a useful starting point and then pressed further by asking who would actually do that work in practice.
– **Yousufzai Khadija** gave a concrete example of propaganda: narratives on social media claiming that **immigrants are taking people’s jobs and futures**. The instructor flagged this as a common and powerful trope seen across many political contexts.
– **Musaev Timur Arsenovich** suggested that communities may feel a **flow of disinformation** trying to change their beliefs, and that people can resist by **stepping back from media that provides fake news**.
– **Joro Danek** *(transcribed uncertainly as “Danieka”)* appeared to share a **fact-checking / debunking example**, likely involving COVID misinformation, and then identified **older people** as a particularly frequent propaganda target. The instructor highlighted this as especially important, noting that older adults often appear across many contexts as a vulnerable audience due to lower technological or media literacy.
– **Ismailova Kamilla Renatovna** observed that propaganda creates or normalizes emotions such as **fear and anger**, and that people can become accustomed to it.
– **Harzu Natalia** commented that propaganda can create a **biased or constructed reality**, encouraging people to build and inhabit an alternative interpretation of events.
#### Instructor synthesis
– The instructor noted that class answers kept returning to familiar anti-disinformation tropes:
– media literacy,
– critical thinking,
– fact-checking.
– He then made an important transition point: these ideas are **not new**, and yet propaganda remains highly effective.
– This became the bridge to the day’s next question: **why generic anti-disinformation advice so often fails**.
---
### 4. Mini-lecture: why generic “be media literate” advice is often ineffective
– The instructor emphasized that students were not inventing anti-propaganda strategies for the first time; these have circulated for years or decades.
– Key claim:
– simply telling people to “be media literate” does **not automatically make them media literate**.
– He stressed that many anti-disinformation programs have **mixed or weak efficacy**, especially when they rely on vague, overly rationalistic messaging.
– This set up the breakout exercise analyzing a flawed public-information poster.
---
### 5. Breakout room workshop: analyzing a flawed government anti-disinformation poster
– Students were given a hypothetical government poster that read:
**“Stop fake news. Always check the source. Don’t let emotions control you. Trust only official government health sites.”**
– The instructor asked each breakout group to make a Google Doc with **two columns**:
– **Left column:** identify the actual commands/actions the poster is giving.
– **Right column:** use one of the course concepts/authors—**Haidt, Van Bavel, or Bale**—to explain **why the campaign would fail**, or possibly **make the problem worse**.
– The instructor framed the relevant concepts as:
– **the elephant and the rider**,
– **identity**,
– **the backfire effect**.
– The class used a dialectical notebook-style format similar to a previous week’s work.
– **Aynura** *(uncertain roster match)* asked a clarifying question about the task, and the instructor explained that students did not need to define a specific propaganda campaign; they only needed to analyze why the anti-disinformation poster itself was inadequate.
---
### 6. Breakout debrief, part I: Room 2 analysis
**Group members named:** **Beishenova Akylai Samatovna** *(likely transcript “Akari/Akrai/Akalai”)*, **Zulumbekov Alikhan Dastanbekovich**, **Harzu Natalia**, **Musaev Timur Arsenovich**
– The instructor first reviewed their left-column breakdown of the poster’s commands.
– His main criticism was that the group, like the poster itself, listed phrases such as **“do not let emotions control you”** without translating them into **concrete actions**.
– On the theoretical side, the group argued:
– The campaign would fail because **telling people to ignore emotions is unrealistic**.
– If people **distrust the government** or identify with groups that distrust it, commands from authorities can produce **resistance**.
– People may believe and share disinformation because it reinforces **group identity**, not simply because they have carefully reasoned through the claims.
– **Beishenova Akylai Samatovna** briefly clarified part of the group’s thinking about **trust as group-based** and the role of identity, though the instructor still pushed for fuller elaboration.
– The instructor tied their analysis back to the course:
– **Haidt:** emotions cannot simply be switched off.
– **Identity-based reasoning:** people are not blank slates waiting for official correction.
– **Backfire dynamics:** direct challenges from distrusted authorities often intensify commitment rather than reduce it.
---
### 7. Breakout debrief, part II: Room 1 analysis
**Group members named:** **Suslov Ivan** *(likely transcript “Yvonne/Ivan,” uncertain)*, **Turgunalieva Nazbike Baktybekovna**, **Imomdodova Samira Khairullaevna**, **Mar Lar Seinn**
– This group’s left column more clearly unpacked the government commands:
– “Stop fake news” as an attention-grabbing slogan,
– “Trust only official government health sites” as a demand to follow government-approved information,
– “Always check the source” as a fact-checking instruction,
– “Don’t let emotions control you” as a warning not to click/share based on anger or excitement alone.
– Their right-column analysis was praised for connecting the poster’s weaknesses to theory:
– Using **Bale/backfire effect**, they argued that when people’s views or identities are directly challenged, they may react with **more doubt and defensiveness**, not more openness.
– Using **identity / Van Bavel**, they noted that audiences whose identities are built on **skepticism toward authority** will interpret the message as an attempt to **control the narrative**, reinforcing an **us-vs.-them division**.
– They also argued that the poster’s framing is **too vague and nonspecific** to guide any real behavior.
– The instructor expanded on this by pointing out that the people most vulnerable to disinformation are often precisely the ones who **already do not trust official sources**.
---
### 8. Breakout debrief, part III: Room 3 analysis
**Group members named:** **Akylbekova Amina Batyrbekovna**, **Kasymova Chynara Iusubzhonovna**, **Kendirbaeva Kanykei Oskonovna**, **Yousufzai Khadija**
– Their left column listed commands such as:
– stop fake news,
– check the source,
– trust only government websites,
– think before you share.
– The instructor again pushed the group to define **what these commands actually mean in practice**, noting that such slogans are too abstract on their own.
– On the right column, the group connected the poster to theory:
– With **Haidt**, they argued that people rely on emotional processing, so messages that ignore emotion fail to persuade.
– With **Van Bavel**, they argued that people who do not trust the government will turn instead to the groups they identify with and trust.
– With **Bale**, they argued that confronting people’s prior beliefs directly can produce a **backfire effect**, increasing commitment to those beliefs.
– **Yousufzai Khadija** then verbally elaborated on the left column:
– “Stop fake news” could mean not liking, commenting on, or forwarding misleading content.
– “Always check the source” means examining where information comes from and whether it is linked to a reliable institution or agenda.
– “Don’t let emotions control you” could mean not reacting immediately in comment sections when content provokes anger.
– “Think before you share” means recognizing that reposting contested material can help spread a group’s agenda.
– The instructor used her explanation to show that, even when the underlying intention is sensible, the government poster still fails because **real people are unlikely to change behavior based on vague moral commands alone**.
---
### 9. Breakout debrief, part IV: Room 5 analysis
**Group members named:** **Tabibzada Dina**, **Samatbekova Elaiym Samatbekovna**, **Furmoly Floran**
– This group organized its analysis by tying each command to a specific reason it would fail.
– The instructor highlighted several strong points:
– People are given commands **without emotional connection**, making the campaign weak from a persuasion standpoint.
– Individuals often define truth through their own identity, prior beliefs, and trusted networks.
– People may not “check the source” when a message already aligns with what they want to believe.
– “Don’t let emotions control you” asks people to suppress a basic human response, which is not realistic.
– “Trust only government websites” may trigger even more suspicion where there is **structural distrust** of institutions.
– The instructor praised the group for demonstrating, again, that this is a **terrible digital hygiene toolkit** because it offers slogans rather than usable protocols.
---
### 10. Synthesis: what a good digital hygiene toolkit must do differently
After reviewing the rooms, the instructor paused to explicitly draw out the lesson for the final assignment.
#### Main takeaways
– A strong digital hygiene toolkit **cannot** rely on vague slogans like:
– “check the source,”
– “don’t be emotional,”
– “trust the government.”
– Students will need to create a **protocol** or system that a real community could actually use.
– The toolkit must answer questions such as:
– What exactly should community members check?
– What should they do after they check?
– How can they respond if they **do not trust official institutions**?
– This became the transition into audience design: to build a useful toolkit, students first need to understand **who the audience is and why certain messages feel persuasive to them**.
---
### 11. Introduction to audience personas via a Threads/Telegram example
– The instructor then introduced a second scenario, this time focused on audience psychology.
– Students were asked to imagine scrolling through **Threads** and seeing a post such as:
**“The medical industry is hiding the truth about what is in your family’s food. Join our private group to learn the holistic secrets they don’t want you to know.”**
– A Telegram link supposedly accompanied the post.
– The instructor noted that students have likely seen similar messaging before, even when the topic is not food or medicine:
– claims that **“they”** are hiding the truth,
– promises of access to secret knowledge,
– invitations into private chat groups.
#### Writing instructions for students
Students were asked to write for a few minutes in the **first person**—not as outsiders judging others, but as the type of person who would click the link. They were told **not** to simply say “a stupid person.”
They were asked to respond to:
1. **What am I afraid of?**
2. **What is happening in my life that makes me feel out of control?**
3. **Why do I trust this random post more than my doctor?**
– During this setup, **Harzu Natalia** added a side comment that in Russia, **Max** had become a prominent platform backed by the government; the instructor briefly asked whether it functioned like Telegram with channels.
---
### 12. Persona-building discussion: emotional vulnerability, distrust, and the appeal of “secret” knowledge
This became the most explicit lead-in to Wednesday’s persona work.
#### Student responses highlighted
– **Uncertain student (“Shadona” in transcript; roster match uncertain)** gave a detailed example modeled on her grandmother:
– an elderly woman, nearly 70, a doctor by profession,
– strongly protective of children and grandchildren,
– likely to click and share the link out of fear for family health,
– shaped by Soviet/post-Soviet experiences that encouraged skepticism toward institutions,
– attracted by the feeling of receiving **insider information** and regaining **control**.
– **Musaev Timur Arsenovich** wrote that food quality seems to have changed over time and that it is hard to know what chemicals are in food now; concern for children’s future health could make such a message compelling.
– **Samatbekova Elaiym Samatbekovna** wrote from the perspective of an older woman with illnesses who needs healthy food but struggles to keep up with changing information; she suggested such a person might distrust doctors because more illness can mean more medical profit.
– **Imomdodova Samira Khairullaevna** emphasized **curiosity** and the possibility that “maybe there’s something there we don’t know,” which can make unofficial sources feel more candid than official ones.
– **Suslov Ivan** *(uncertain transcript match)* wrote from the perspective of someone who cannot afford decent medical care and therefore already sees medicine as market-driven; such a person may click in hopes of finding legitimate insider information.
– **Kendirbaeva Kanykei Oskonovna** described fear of disease, concern about poor-quality products in markets, and greater trust in such posts than in doctors or hospitals.
– **Ismailova Kamilla Renatovna** wrote from the perspective of someone afraid that something is wrong with them and no one is telling them the truth, someone who fears missing important information and regretting it later.
#### Instructor synthesis
– The instructor stressed that propaganda works not because targets are simply “stupid,” but because it exploits:
– fear,
– uncertainty,
– loss of control,
– care for loved ones,
– prior bad experiences with institutions,
– and the emotional pull of secret or alternative knowledge.
– He gave his own example of a **new mother with a crying baby**:
– the doctor has little time,
– medical advice feels dismissive,
– the parent feels failed by the system,
– fears about chemicals in food become a persuasive explanation,
– and a random online community suddenly seems more emotionally responsive than official expertise.
– This was presented as the exact mindset students need to understand in order to build **personas** and then design meaningful community-facing interventions.
---
### 13. Closing logistics and next steps
– The instructor explained that **Wednesday’s class** would focus heavily on **persona generation**.
– There is **no additional reading** assigned for Wednesday.
– Students were reminded to:
– finish the **midterm**,
– be ready for the next phase of the final project,
– expect **Video Reflection Journal #2** to be assigned on Wednesday.
– A student asked whether the final would be **group work**, and the instructor confirmed that it would.
– He said the groups would be assigned/formed as the class moved into persona selection and project setup.
– **Aynura** *(uncertain roster match)* asked for the policy memo examples used in class (including examples from Afghanistan and Syria).
– The instructor promised to upload those resources to **eCourse immediately after class**.
– The instructor clarified the midterm deadline as **Wednesday at 23:59**.
---
## Student Tracker
– **Beishenova Akylai Samatovna** *(likely transcript “Akalai/Akari/Akrai”)* — Participated in Room 2’s poster analysis and briefly clarified the group’s point about trust and identity.
– **Akylbekova Amina Batyrbekovna** — Participated in Khadija’s breakout group on the failed government poster.
– **Furmoly Floran** — Participated in Elaiym’s breakout group analyzing why the government poster would fail.
– **Gulobov Ruslan Sodikovich** — Defined how repeated messaging targets communities and suggested fact-checking, withholding shares, and discussion as resistance strategies.
– **Harzu Natalia** — Commented that propaganda builds its own biased reality, worked in Room 2, and later mentioned the Russia-based Max platform.
– **Imomdodova Samira Khairullaevna** — Participated in Room 1 and later described curiosity and the desire to compare information independently as reasons someone would click a suspicious link.
– **Ismailova Kamilla Renatovna** — Noted propaganda’s emotional normalization and later wrote from the perspective of someone fearing hidden illness and missed information.
– **Joro Danek** *(uncertain transcription as “Danieka”)* — Contributed a debunking/fact-checking example and identified older adults as a frequent propaganda target.
– **Kasymova Chynara Iusubzhonovna** — Participated in Khadija’s breakout group on the government poster.
– **Kendirbaeva Kanykei Oskonovna** — Participated in Room 3 and later described fear of disease, poor product quality, and distrust in medical institutions.
– **Mar Lar Seinn** — Participated in Room 1 and submitted a persona reflection in chat.
– **Musaev Timur Arsenovich** — Weighed in during the opening vote, described disinformation as an attempt to alter beliefs, suggested distancing from fake-news sources, and later discussed food-quality anxieties.
– **Samatbekova Elaiym Samatbekovna** — Suggested consulting experts, participated in Room 5’s analysis, and later wrote from the perspective of an older ill person suspicious of medical profit motives.
– **Shoguniev Imat Imatovich** — Assisted with an issue early in class; the instructor thanked him directly.
– **Suslov Ivan** *(likely transcript “Yvonne/Ivan,” uncertain)* — Participated in Room 1 and later described how unaffordable, market-driven healthcare can push someone toward unofficial “insider” sources.
– **Tabibzada Dina** — Participated in Elaiym’s breakout group on the anti-disinformation poster.
– **Turgunalieva Nazbike Baktybekovna** — Participated in Room 1’s collaborative poster analysis.
– **Yousufzai Khadija** — Gave an example about anti-immigrant propaganda, then explained her group’s interpretation of the government poster commands in detail.
– **Zulumbekov Alikhan Dastanbekovich** — Suggested “looking for evidence” in the opening activity and participated in Room 2’s analysis.
– **Uncertain student (“Aynura” in transcript; roster match uncertain)** — Asked a clarifying question about the government-poster breakout task and later requested policy memo examples be reposted.
– **Uncertain student (“Shadona” in transcript; roster match uncertain)** — Gave a substantial persona example about an older grandmother/doctor, family protection, Soviet-era distrust, and the appeal of insider knowledge.
---
## Actionable Items
### High urgency
– **Upload policy memo examples to eCourse** as promised, including the examples referenced from Afghanistan and Syria.
– **Prepare Wednesday’s persona workshop** and final-project grouping process, since students were told that persona selection and group formation would begin next class.
### Medium urgency
– **Post or clarify final project group-work expectations** so students understand how teams will be formed and what the group deliverable will involve.
– **Prepare Video Reflection Journal #2 prompt/details** for release on Wednesday, as announced.
### Lower urgency
– Consider reinforcing the distinction between **vague anti-disinformation slogans** and **concrete digital hygiene protocols**, since this was the central conceptual hurdle throughout the lesson.
– Consider providing a short model of a **strong audience persona**, since students will build on this next class and several were already thinking in productive first-person terms.
## Homework Instructions
### Assignment #1: Complete the Midterm Policy Memo
You will finish and submit your midterm policy memo, which asks you to analyze a propaganda problem affecting a specific community and, taking the top-down perspective discussed in class, explain what an NGO can do to address it. Completing this assignment closes out the policy memo unit before the class fully transitions into the final project’s more grassroots focus on digital hygiene toolkits.
Instructions:
1. Return to your midterm topic and review the community and propaganda campaign you selected.
2. Re-read the original midterm prompt, the syllabus, and any notes you took so you can confirm that your memo still matches the assignment requirements.
3. Make sure your memo is using the midterm’s top-down approach. In class, this was contrasted with the final project: for the midterm, you should focus on the question, “What can an NGO do to solve a propaganda problem?”
4. Review your draft and confirm that you clearly identify:
   1. the community you are studying,
   2. the propaganda or disinformation campaign affecting that community, and
   3. why this problem matters.
5. Strengthen your analysis of the propaganda campaign by making sure you explain how the community is being targeted and what effects the campaign is trying to produce.
6. Revise your recommendations so they are specific, practical, and appropriate for an NGO. Avoid vague statements, and make sure each recommendation clearly shows what the NGO should do.
7. Check that your memo is supported by research and course material where appropriate. Make sure your evidence is relevant to the community and campaign you chose.
8. Review the organization of your memo. Make sure each section is clear, focused, and easy to follow, and that your argument moves logically from the problem to the proposed response.
9. Edit the memo carefully for clarity, grammar, and concision. Since this is a policy memo, make sure your writing is professional and direct.
10. Use the memo examples discussed in class, including the examples from Afghanistan and Syria, to help you revise your format and approach. The instructor said these resources would be posted after class, so check eCourse for them before you finalize your submission.
11. Proofread your final version one more time to make sure it is complete and ready to submit.
12. Submit your finished midterm by Wednesday at 23:59.