Lesson Report:
# Trust-Based Counter-Misinformation Toolkits and De-radicalization Simulation
This session focused on student presentations of final counter-propaganda/digital hygiene toolkits, followed by an applied de-radicalization exercise. Students presented and critiqued interventions aimed at misinformation in Nigeria and among Central Asian migrants in Russia, then practiced strategies for reducing radical belief without triggering identity threat or the backfire effect.
—
# Attendance
– **Students mentioned absent/late:**
– **Samatbekova Elaiym Samatbekovna** — initially reported as missing/absent by **Gulobov Ruslan Sodikovich**, but later joined class and participated.
– **Number of students mentioned absent:**
– **1 initially mentioned absent/late**
– **0 confirmed absent by the end of the session**
—
# Topics Covered
## 1. Opening Logistics: Presentation Scheduling and Group Communication Check
– The instructor opened by noting that students had not received the expected presentation signup sheet over the weekend.
– Instead of assigning groups to present, the instructor proposed handling presentations through **volunteering** and a quick class vote.
– Students were first asked to confirm whether each group had a way to communicate outside Zoom, such as a group chat.
– No one indicated that they lacked group communication, so the instructor assumed all groups had an external method of coordination.
– The instructor explained the intended structure for the week:
– Some presentations would take place during the current session.
– The remaining presentations would take place on Wednesday.
– Any remaining class time would be used for activities/games connected to the course themes.
– Students were given approximately two minutes to consult with their groupmates and decide whether they wanted to present that day or on Wednesday.
### Group presentation schedule decisions
– **Group 1:** Chose to present during the current class.
– **Group 2:** Chose to present on Wednesday.
– A representative identified the group as including **Yousufzai Khadija** and **Ismailova Kamilla**; the third name was unclear in the transcript.
– **Group 3:** Chose to present on Wednesday.
– **Akylbekova Amina** responded for the group.
– **Group 4:** Tentatively scheduled for the current class; **Samatbekova Elaiym** was initially missing but arrived later, allowing the group to present.
– **Group 5:** Chose to present on Wednesday.
– The instructor identified the group as including **Zulumbekov Alikhan**, **Ahmadi Nahida**, and **Harzu Natalia**.
– **Group 6:** Chose to present on Wednesday.
– **Group 7:** Chose to present on Wednesday.
– The instructor identified the group as including **Ezgo Helen**, **Suslov Ivan**, and **Silmonova Nilufar**.
– **Group 8:** Chose to present on Wednesday.
### Final structure for the day
– The instructor confirmed that **Group 1** and **Group 4** would present during the current class.
– The remaining groups would present on Wednesday.
– After the two presentations, the class would move into an interactive activity connected to radicalization/de-radicalization.
—
## 2. Group 1 Presentation: Interactive Toolkit Against Biolab Misinformation in Nigeria
### Presentation focus
– **Group 1** presented a digital toolkit designed to counter misinformation among **digitally active youth in Nigeria**.
– The specific misinformation threat addressed was the narrative about alleged **U.S. biolabs in Ukraine**, which the group described as still circulating despite having been debunked.
– The group framed the project around a central behavioral insight:
– Many people do not spread misinformation because they deeply believe it.
– Instead, they often react quickly and emotionally, sharing content without thinking.
### Target audience and vulnerability
– The target audience was described as **young, digitally active Nigerians** who receive and share information through platforms such as:
– WhatsApp
– Facebook
– Telegram
– The group argued that misinformation spreads effectively in this context because users often trust messages based on social proximity rather than verification.
– Messages from family, friends, or trusted group chats may be accepted more easily.
– The group also connected the biolab misinformation narrative to broader skepticism toward Western actors in the region.
– This preexisting skepticism makes the claim about U.S. biolabs more believable to some audiences.
### Toolkit design: Interactive website / GitHub-based resource
– The group created an **interactive website**, hosted or built through GitHub.
– **Ibraimov Suban** was credited by the group and instructor as the main technical contributor/web developer.
– The instructor specifically praised Suban’s work on the website’s appearance, functionality, and usability.
– The website was designed to resemble or draw inspiration from a “stop and don’t forward” model.
– Rather than directly telling users, “Do not believe this,” the toolkit begins with a familiar scenario:
– A user receives a shocking message that says something like, “Share this before it gets deleted.”
– The group described this as the toolkit’s “hook.”
– It feels familiar and realistic.
– It avoids sounding judgmental.
– It encourages engagement instead of defensiveness.
### Techniques used in the toolkit
The group explained several persuasive and educational strategies built into the project:
– **Prebunking approach**
– The toolkit teaches users to recognize misinformation patterns before encountering them in the wild.
– Examples of suspicious patterns included:
– Emotional language
– Urgency
– Lack of sources
– Manipulative framing
– “Forward quickly” messaging
– **Mental immunity**
– The group described the goal as building “mental immunity” against misinformation.
– By learning to recognize patterns, users become less likely to be influenced by future false information.
– **Humor**
– Humor was included to lower defensiveness.
– The group argued that if a correction feels funny or light rather than accusatory, users may be more willing to reject misinformation.
– **Everyday communication style**
– The toolkit avoids overly formal or confrontational language.
– It encourages responses that feel natural, relatable, and socially acceptable.
– **Pause before sharing**
– The central behavioral habit promoted was to pause before forwarding content.
– The group emphasized a simple idea: wait a few seconds before sharing.
– **Quick verification process**
– Users are encouraged to check at least one source.
– If they are uncertain, the toolkit advises them not to share the message.
– **Behavior over belief**
– The group emphasized that the project aims first to change behavior rather than immediately change beliefs.
– The immediate goal is to reduce the spread of misinformation by slowing down forwarding behavior.
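The prebunking checklist above could be prototyped as a simple pattern scan over incoming messages. The sketch below is a minimal illustration; the specific keyword patterns are assumptions for demonstration, not the group's actual rule set.

```python
import re

# Illustrative red-flag patterns drawn from the group's checklist
# (urgency, emotional language, "forward quickly" messaging).
# The exact keywords are assumptions for this sketch.
RED_FLAGS = {
    "urgency": re.compile(r"\b(share (this )?now|before it gets deleted|act fast)\b", re.I),
    "emotional language": re.compile(r"\b(shocking|outrage(ous)?|terrifying)\b", re.I),
    "forward-quickly messaging": re.compile(r"\b(forward (this )?to everyone|send to all)\b", re.I),
}

def scan_message(text: str) -> list[str]:
    """Return the names of red-flag patterns found in a message."""
    return [name for name, pattern in RED_FLAGS.items() if pattern.search(text)]

def should_pause(text: str) -> bool:
    """The toolkit's core habit: pause before sharing if any flag fires."""
    return bool(scan_message(text))

msg = "SHOCKING! Share this before it gets deleted!"
print(scan_message(msg))   # ['urgency', 'emotional language']
print(should_pause(msg))   # True
```

A real deployment would need far richer detection, but even this toy version captures the behavioral goal: interrupt the forward reflex, not argue about the claim itself.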
### Instructor feedback and question: How will Nigerian users find the toolkit?
The instructor praised the website as visually strong, functional, attractive, and clearly purposeful. The instructor then asked a major implementation question:
– How will the target audience in Nigeria actually find the website?
– Since there are many websites online, what is the plan for making sure people exposed to this specific propaganda encounter this toolkit?
### Group 1 response: Distribution strategy
The group proposed several methods for reaching the target audience:
– **Integrating a pause/reminder into forwarding contexts**
– A group member suggested that a small message or prompt could be attached to shared links, reminding users to pause before forwarding.
– The group wanted to avoid directly labeling the content as misinformation because that could increase curiosity or defensiveness.
– **Humor-based integration**
– The group suggested embedding humorous warnings or prompts into links/messages to make users more receptive.
– **Platform targeting**
– The group proposed using paid advertisements and organic posts on high-traffic platforms such as:
– WhatsApp
– Facebook
– TikTok
– **Search engine optimization**
– The group identified SEO as a strategy.
– They proposed optimizing keywords, metadata, and content so the site would appear in searches related to trending misinformation.
– **Analytics and campaign tracking**
– They mentioned using Google Analytics to monitor:
– Traffic sources
– User behavior
– Campaign effectiveness
– **Seeding content inside networks**
– The group proposed sharing short-form content inside relevant online communities.
– They also suggested using influencers or celebrities to amplify the toolkit.
### Instructor follow-up: Specific hashtags, pages, and communities
The instructor asked whether the group had identified specific Instagram groups, social media accounts, or hashtags that would be especially useful for reaching the target audience.
The group responded that hashtag mapping would be part of their plan and gave several possible categories/examples:
– General Nigeria-related hashtags:
– Nigeria
– Niger News / Nigeria News
– Public health-related hashtags:
– Public health Nigeria
– Health NG
– Stay safe Nigeria
– Misinformation-prone phrases:
– Hidden truth
– They don’t want you to know
– Youth/student-focused tags:
– Niger youth
– Nigerian student
The group emphasized that the goal would not be to create an entirely new audience, but to enter existing conversations where misinformation is already spreading.
### Instructor closing feedback
– The instructor accepted the answer as broadly solid, while noting that he did not personally have enough experience with those specific hashtags to verify them.
– The instructor invited classmates to ask questions, but no one did.
– The group was thanked for presenting first and for taking the risk of opening the presentation sequence.
—
## 3. Group 4 Presentation: Telegram-Based Digital Hygiene Toolkit for Central Asian Migrants in Russia
### Presentation focus
– **Group 4** presented a toolkit titled around **digital hygiene to counter disinformation among Central Asian migrants in Russia**.
– The project focused on migrant workers from:
– Kyrgyzstan
– Tajikistan
– Uzbekistan
– The group described these migrants as often working in low-income sectors such as:
– Construction
– Delivery
– Cleaning
### Target audience and vulnerability
The group emphasized that Central Asian migrants in Russia are vulnerable to misinformation because:
– They often strongly depend on employment in Russia to support families back home.
– Remittances may be a major or main source of family income.
– They face:
– Job insecurity
– Fear of deportation
– Limited legal protection
– Even small economic or legal changes can have serious consequences for their lives.
### Information environment
The group explained that migrant workers commonly receive information through:
– Telegram channels
– WhatsApp groups
– Coworkers
– Friends
– Family members
The group described these as **closed and trusted networks**, where information can spread quickly but is often not verified.
### Why this audience is targeted
– The group argued that misinformation aimed at migrants often connects directly to their strongest anxieties:
– Losing employment
– Losing income
– Being deported
– Facing hostility from local citizens
– Because migrants depend on the Russian economy, narratives about instability feel realistic and urgent.
– This makes disinformation easier to spread through their existing social networks.
### Toolkit design: Telegram-based information system
The proposed toolkit was a **Telegram-based information system** embedded in the platforms migrants already use.
The toolkit would include:
– **Voice messages**
– Useful for migrants who may prefer audio communication or may not have time to read long posts.
– **Real updates for migrant jobs**
– Information about employment conditions, opportunities, or risks.
– **Myth vs. fact corrections**
– Directly addressing rumors and misinformation.
– **Practical guidance**
– Jobs
– Police checks
– Legal help
– Potentially housing or documentation issues
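A myth-vs-fact correction like the one described above could be delivered through the Telegram Bot API's `sendMessage` method. The sketch below only builds the request payload; the bot token, chat id, and example texts are placeholders invented for illustration, and no HTTP request is sent.

```python
# Minimal sketch of a myth-vs-fact post for the Telegram Bot API's
# sendMessage method. Token, chat id, and texts are placeholders;
# the payload is built here but never actually sent.
BOT_TOKEN = "<bot-token>"  # placeholder; real tokens come from @BotFather
API_URL = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"

def myth_vs_fact(chat_id: str, myth: str, fact: str, source: str) -> dict:
    """Build a sendMessage payload pairing a rumor with a sourced correction."""
    text = (
        f"\u274c Myth: {myth}\n"
        f"\u2705 Fact: {fact}\n"
        f"Source: {source}"
    )
    return {"chat_id": chat_id, "text": text, "disable_web_page_preview": True}

payload = myth_vs_fact(
    "@example_migrant_chat",  # hypothetical chat name
    "All migrant work permits will be cancelled next month.",
    "No such regulation has been announced; existing permits remain valid.",
    "official migration service website",
)
print(payload["text"].splitlines()[0])
```

In practice the message would be posted by admins of existing migrant channels, consistent with the group's insider-distribution strategy, rather than by a standalone bot.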
### How the toolkit would work
The group stated that the toolkit would be:
– Shared through existing Telegram groups.
– Managed or supported by trusted people from migrant communities.
– Designed for environments migrants already use instead of asking them to adopt an unfamiliar platform.
The group argued that the approach would work because:
– Most migrants already use Telegram.
– Migrants are more likely to trust information from other migrants than from institutions.
– Information can spread quickly through existing trusted networks.
### Key strategic justification
The group stressed that their toolkit would not simply create a separate new channel and wait for migrants to join. Instead, it would enter the “battlefield” where misinformation is already spreading:
– Existing Telegram groups
– Active migrant chats
– Community networks
– Diaspora communication spaces
The group argued that this would allow the toolkit to respond quickly when misinformation emerges, rather than waiting until rumors have already spread widely.
### Communication style
The group emphasized that the toolkit would not force migrants to change their beliefs. Instead, it would use:
– Questions
– Voice messages
– Images
– Practical guidance
– Corrections with sources
– Non-confrontational framing
The goal would be to help migrants engage voluntarily with more accurate information.
### Trust and insider strategy
The group noted that people are more likely to trust members of their insider circle than official institutions. This is especially important in Russia because, according to the group, government institutions and official news are often not perceived as migrant-friendly.
The toolkit would therefore rely on:
– Migrant community insiders
– Migrants who support the project
– Community leaders
– Popular Telegram admins
– Bloggers from Central Asian countries
– Relatives and friends already connected to migrant communities
### Instructor question: What misinformation would the toolkit debunk?
The instructor asked for examples of specific misinformation themes targeting migrants in Russia.
A Group 4 speaker, likely **Samatbekova Elaiym**, described asking an uncle who currently works in Russia as a migrant. According to this account:
– Some migrants have heard rumors that they should prepare to return to Kyrgyzstan before economic instability in Russia gets worse.
– Some fear losing their jobs or being deported if conditions deteriorate.
– There are also narratives that local citizens are hostile toward migrants, accusing them of:
– Taking jobs
– Creating security threats
– Causing problems in the country
The speaker explained that such rumors can create panic among migrants.
### Instructor question: Who are the trusted intermediaries?
The instructor asked how the group would find trusted people in migrant communities and convince them to partner with the project.
The group answered that they would look for:
– Migrants already arguing against misinformation inside Telegram debates.
– People motivated to calm down their colleagues and maintain stability in their communities.
– Migrant community leaders.
– Popular Telegram admins.
– Bloggers who speak to Central Asian migrant audiences.
– Family members and friends connected to migrant communities.
The group argued that such insiders would benefit from the project because they would also want to reduce panic and misinformation among their communities.
### Instructor question: Have specific Telegram channels been identified?
The instructor asked whether the group had identified actual Telegram channels or social media pages used by migrant communities.
The group responded that they did not want to create a completely new Telegram channel because that might seem unfamiliar or unconvincing. Instead, they wanted to work inside already active Telegram chats.
One speaker said they had found a Telegram channel/group that sounded like **“MigrantEG”** or similar, apparently aimed mainly at migrants from Kyrgyzstan.
– The channel reportedly had around **8,000 subscribers**.
– It included information about:
– Apartments
– Job opportunities
– Taxes
– News
– The speaker had difficulty joining the group but identified it as a possible existing network for outreach.
### Instructor closing feedback
– The instructor agreed that working with existing groups is more effective than creating something entirely new.
– He suggested that the group would likely need to contact admins of those channels to distribute or link the toolkit.
– The instructor invited classmates to ask questions, but no one did.
– Group 4 was thanked for presenting.
—
## 4. Transition: Confirmation That Remaining Groups Will Present Wednesday
– After Group 4 finished, the instructor gave the remaining groups one more chance to volunteer to present during the current session.
– No additional groups volunteered.
– The instructor confirmed that the remaining presentations would take place on Wednesday.
– The class then transitioned into the planned activity for the rest of the session.
—
## 5. Activity Setup: De-radicalization Simulation
### Purpose of the activity
The instructor introduced a class activity described as a **de-radicalization simulation**.
The goal was for students to practice strategies for:
– Understanding radicalization narratives.
– Identifying underlying fear/anxiety beneath a conspiratorial belief.
– Reducing radical belief without direct confrontation.
– Avoiding identity threat.
– Avoiding the backfire effect.
### Game premise
The instructor explained that he would play the role of the radicalized individual. Students would have to:
1. First decide what topic he had been radicalized about.
2. Then work in small groups to develop a strategy for de-radicalizing him.
The instructor framed himself jokingly as an “open book” who was willing to be radicalized by almost anything for the purpose of the exercise.
### Brainstorming instructions
Students were asked to individually brainstorm one radicalization scenario and post it in the chat.
The instructor specified that students should not repeat the exact audience/fear from their own final project. They could choose any other scenario, such as:
– Biolabs in Ukraine
– Migrant fears
– Anti-vaccine conspiracies
– Anti-AI beliefs
– Anti-school beliefs
– Economic grievance narratives
Students were initially given three minutes, but when relatively few ideas appeared, the instructor extended the time by approximately two more minutes.
—
## 6. Student Brainstorming: Radicalization Scenario Ideas
The following student contributions were discussed or referenced:
– **Samatbekova Elaiym**
– Submitted multiple ideas, one of which was an **anti-vaccine fear** scenario.
– This was ultimately selected for the activity by random number generation.
– **Ismailova Kamilla**
– Submitted a radicalization scenario in the chat.
– The exact content was not clearly preserved in the transcript.
– **Azimshoev Ofarid**
– Suggested that an **economic crisis** can lead to radicalization.
– The instructor asked for more specificity: what propaganda narratives exploit economic crisis to influence people?
– **Suslov Ivan**
– Suggested a scenario involving distrust of social media because it is controlled by big tech companies seeking profit and user data.
– The instructor noted that this may not be an extremely radical belief on its own, but it could still be used for the exercise if framed as radicalized distrust.
– **Musaev Timur**
– Suggested a scenario in which people believe **AI will take over humanity by the end of the 21st century**.
– The instructor labeled this as an “anti-AI radicals” scenario.
– **Sangmamadova Zamira**
– Suggested that school does not build creativity but quietly trains students to abandon it.
– The instructor connected this to anti-public-school or homeschooling radical narratives, noting that similar beliefs exist in the United States.
– **Yousufzai Khadija**
– Proposed a scenario involving a young Native American worker who loses his job, becomes angry, and follows people who blame others.
– She also included the backfire effect, noting that family disagreement could make him more defensive.
– The instructor said this was a good start but asked what specific propaganda narrative such a person might be given.
– **Amery Ainullah**
– Submitted a definition of radicalization as a process.
– The instructor clarified that the activity required a specific radicalization theme or narrative, not just a general definition.
– **Turgunalieva Nazbike**
– Submitted an idea about social media algorithms not directly telling people to become extreme but nevertheless pushing them toward more extreme content.
– The instructor said this explains why radicalizing narratives work, but again asked for a specific narrative example.
### Scenario selection
– The instructor counted the usable scenarios and used a random number generator.
– The selected scenario was **anti-vaccine fear**, submitted by **Samatbekova Elaiym**.
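The instructor's random pick can be reproduced with Python's standard library. The scenario list below is paraphrased from the brainstorm summary above and is illustrative only.

```python
import random

# Usable scenarios, paraphrased from the class brainstorm.
scenarios = [
    "anti-vaccine fear",
    "radicalized distrust of big-tech social media",
    "AI will take over humanity by the end of the 21st century",
    "schools quietly train students to abandon creativity",
    "laid-off worker follows people who blame others",
]

# The instructor's approach: count the usable entries, then draw one at random.
choice = random.choice(scenarios)
print(choice)
```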
—
## 7. De-radicalization Scenario: Anti-Vaccine Microchip Conspiracy
### Scenario details
The instructor then adopted the role of a partially radicalized family member at a dinner table.
His assigned belief was:
– Vaccines are not just harmful or poisonous.
– Vaccines may contain tracking devices or microchips.
– Shadowy pharmaceutical companies or powerful figures may use vaccines to track people.
– The instructor referenced a common version of the conspiracy involving the **Bill and Melinda Gates Foundation**, though he noted that students could imagine any shadowy actor.
The instructor clarified that this was not invented for the activity; it reflects a real conspiracy narrative that has circulated online.
### Student task
Students were placed into breakout rooms in groups of approximately three. Their task was to create a strategy to de-radicalize the instructor-character.
They were specifically instructed to avoid:
– **Identity threat**
– They should not make the person feel attacked for who they are or what group they identify with.
– **Backfire effect**
– They should not present evidence in a way that makes the person reject it more strongly.
The instructor recommended that groups begin by asking:
– What is the underlying fear beneath the belief that vaccines are tracking devices?
– Is the fear about control, bodily autonomy, surveillance, government power, pharmaceutical profit, or betrayal?
– How can that fear be addressed without dismissing the person or insulting them?
### Breakout room activity
– Students were sent to breakout rooms for approximately five minutes.
– They were asked to prepare a conversational strategy for de-radicalizing the instructor-character.
—
## 8. De-radicalization Practice: Group 5 Strategy and Class Analysis
### Group selected to respond
After students returned from breakout rooms, the instructor asked for volunteers.
No group volunteered, so he used a random number generator.
– The selected breakout group, labeled **Group 5** for the activity, consisted of:
– **Ezgo Helen**
– **Suslov Ivan**
– **Kendirbaeva Kanykei**
### Strategy presented by Suslov Ivan
**Suslov Ivan** presented a hypothetical dinner-table story. He said, in role, that:
– He works at the United Nations.
– He originally entered the pharmacy/vaccine field because he had always been anti-vaccine.
– His original goal was to reduce government and corporate access to vaccines.
– He still considers himself strongly anti-vaccine in many ways.
– He believes the COVID vaccine rollout was a “total mess.”
– However, while working in Africa, he saw an Ebola vaccination campaign save an entire tribe.
The instructor paused to ask whether this was Ivan’s real personal story or a hypothetical scenario. Ivan clarified that it was hypothetical.
### Instructor analysis: Why this strategy worked
The instructor praised the story as unusually strong and believable. He then broke down why it could help establish trust with a radicalized person.
Key elements identified:
#### 1. Establishing shared identity
– Ivan’s opening move was to say that he was also anti-vaccine.
– This immediately positioned him as an insider rather than an outsider attacking the belief.
– The instructor emphasized that this was the most important trust-building element.
– A radicalized person is more likely to listen if the speaker appears to share the same concerns.
#### 2. Becoming even more credible than the listener
– Ivan did not just say he was anti-vaccine.
– He claimed to have changed his life and entered the UN/pharmacy field in order to influence vaccine distribution from inside the system.
– This made him sound like someone deeply committed to the same concern, possibly even more committed than the radicalized person at the dinner table.
#### 3. Moderating the claim rather than directly reversing it
– Ivan did not immediately say “vaccines are safe” or “you are wrong.”
– Instead, he conceded something the anti-vaccine person might agree with: the COVID vaccine rollout was a mess.
– The instructor highlighted that this was a moderated version of the radical claim.
– It does not claim vaccines were a weapon against humanity.
– It simply says the rollout was flawed.
– This allows the speaker to move the listener a few steps away from the most extreme version without direct confrontation.
#### 4. Introducing evidence only after trust is established
– Ivan’s final evidence was the Ebola example in Africa, where vaccines saved lives.
– The instructor explained that if someone opened with this evidence immediately, a radicalized person might reject it.
– But after Ivan established insider credibility and partial agreement, the evidence became easier to accept.
### Class discussion contributions
– **Samatbekova Elaiym** commented that the UN is a familiar and visible organization associated with humanitarian action.
– She argued that people may trust someone connected to the UN because it is known for helping people and providing humanitarian or medical aid.
– The instructor agreed this was part of the credibility but pushed the class to identify the even more important point: Ivan’s claim that he was also anti-vaccine.
– **Kendirbaeva Kanykei** identified the key mechanism in the chat:
– Ivan built trust by saying, “I’m also anti-vax.”
– The instructor confirmed that this was exactly the core trust-building move.
– Another student, not clearly identifiable from the transcript, noted that Ivan’s story included evidence that a tribe in Africa survived because of vaccines.
– The instructor agreed that this was important evidence, but emphasized that evidence alone usually does not persuade strongly radicalized people unless trust is established first.
### Instructor’s concluding lesson from the activity
The instructor summarized the core takeaway:
– De-radicalization requires trust before evidence.
– If a radicalized person does not trust the speaker, facts may be rejected.
– Effective intervention often begins by reducing threat, establishing shared identity, and then slowly moderating the belief.
– Only after this foundation is laid should corrective evidence be introduced.
—
## 9. Closing and Preview of Wednesday
– The instructor apologized for keeping students approximately five minutes over time.
– He confirmed that the remaining presentations would take place during the final Wednesday session.
– The rest of Wednesday’s class would include one more activity similar to the de-radicalization simulation.
– The instructor described Wednesday as the end of the current course/book/module.
– Students were dismissed after the instructor checked for questions.
—
# Student Tracker
– **Gulobov Ruslan Sodikovich**
– Notified the instructor that **Samatbekova Elaiym** was missing/late at the beginning of class.
– **Samatbekova Elaiym Samatbekovna**
– Initially reported missing but later joined; contributed to Group 4’s migrant misinformation discussion, submitted the anti-vaccine radicalization scenario that was selected for the activity, and later explained how the UN’s humanitarian reputation could support trust.
– **Ibraimov Suban Kubanychevich**
– Credited as the technical/web development lead for Group 1’s GitHub-based interactive misinformation toolkit and shared the project link.
– **Yousufzai Khadija**
– Represented or helped identify Group 2’s Wednesday presentation plan and proposed a radicalization scenario involving a Native American worker losing his job and becoming defensive when challenged.
– **Ismailova Kamilla Renatovna**
– Named as part of Group 2 and submitted a radicalization scenario in the chat, though the exact content was not clear in the transcript.
– **Akylbekova Amina Batyrbekovna**
– Responded for Group 3 and confirmed that the group would present on Wednesday.
– **Azimshoev Ofarid Asalbekovich**
– Submitted an economic-crisis-related radicalization idea; the instructor prompted for a more specific propaganda narrative.
– **Suslov Ivan**
– Submitted a radicalization scenario about distrust of social media/big tech and later presented the strongest de-radicalization strategy, using a hypothetical UN/anti-vaccine insider story to build trust before introducing evidence.
– **Musaev Timur Arsenovich**
– Proposed an anti-AI radicalization scenario in which people believe AI will take over humanity by the end of the 21st century.
– **Sangmamadova Zamira Marodbekovna**
– Proposed an anti-school/public education radicalization scenario, arguing that schools train students to abandon creativity.
– **Amery Ainullah**
– Submitted a general definition of radicalization; the instructor redirected him toward identifying a concrete narrative or radicalizing theme.
– **Turgunalieva Nazbike Baktybekovna**
– Submitted an observation about social media algorithms pushing people toward extremism; the instructor asked for a more specific narrative example.
– **Kendirbaeva Kanykei Oskonovna**
– Participated in the selected de-radicalization breakout group and identified in chat that Ivan’s key trust-building move was presenting himself as “also anti-vax.”
– **Ezgo Helen**
– Was part of the selected breakout group for the de-radicalization response with Ivan and Kanykei.
– **Uncertain Group 1 presenter(s)**
– Presented the Nigeria-focused biolab misinformation toolkit, explained the target audience, prebunking strategy, humor-based design, behavioral focus, and outreach/distribution plan.
– **Uncertain Group 4 presenter(s)**
– Presented the Central Asian migrant toolkit, explaining the target population, vulnerability factors, Telegram-based intervention, trusted insider strategy, and possible existing migrant channels.
—
# Actionable Items
## Urgent: Before Wednesday’s Class
– **Remaining presentations scheduled for Wednesday:**
– Group 2
– Group 3
– Group 5
– Group 6
– Group 7
– Group 8
– **Groups presenting Wednesday should be ready with:**
– A clear target audience.
– A specific propaganda/misinformation threat.
– A concrete toolkit/intervention.
– A realistic dissemination plan.
– Specific platforms, groups, hashtags, influencers, admins, or channels where possible.
## Follow-Up for Group 1
– Identify more concrete Nigerian social media accounts, WhatsApp/Telegram communities, influencers, or hashtag clusters for distribution.
– Clarify how users move from encountering misinformation to encountering the toolkit link or prompt.
– Consider how the toolkit would be localized linguistically/culturally for Nigerian youth audiences.
## Follow-Up for Group 4
– Verify the exact name and accessibility of the migrant Telegram group identified as “MigrantEG” or similar.
– Identify additional Telegram/WhatsApp groups for Tajik and Uzbek migrant communities, not only Kyrgyz migrants.
– Clarify how the toolkit will gain admin cooperation inside existing migrant groups.
– Develop sample “myth vs. fact” messages for concrete migrant misinformation narratives.
## Pedagogical Notes for Instructor
– Students understood the importance of trust-building in de-radicalization once the Ivan example was analyzed.
– Some students still need practice distinguishing between:
– A general cause of radicalization, such as economic crisis or algorithms.
– A specific radicalizing narrative, such as “migrants are stealing jobs” or “vaccines contain tracking chips.”
– The de-radicalization activity worked well and could be continued Wednesday with another selected narrative.
Homework Instructions:
ASSIGNMENT #1: Prepare and Deliver Your Final Toolkit Presentation on Wednesday
You will present your group’s anti-propaganda/disinformation toolkit during our next class session on Wednesday. The purpose of this presentation is to show how your group identified a specific propagandistic threat, designed a practical toolkit to respond to that threat, and connected your strategy to the course concepts we have discussed, such as pre-bunking, de-radicalization, avoiding the backfire effect, understanding target audiences, and distributing interventions through trusted networks. Groups 2, 3, 5, 6, 7, and 8 are scheduled to present on Wednesday.
Instructions:
1. Confirm your group’s presentation plan with your group members before Wednesday.
– Make sure everyone in your group knows that your group is presenting on Wednesday.
– Use your group chat or another communication method outside of Zoom to coordinate.
– Decide who will speak during each part of the presentation.
2. Prepare a presentation that is approximately 7 minutes long.
– The 7-minute time limit was stated in class, so plan your content around it.
– Practice your presentation so that you can cover the most important information clearly within that time.
3. Begin your presentation by identifying your toolkit’s target audience.
– Explain who your toolkit is designed for.
– Be specific about the audience’s location, social context, media habits, vulnerabilities, and relationship to the propaganda or disinformation campaign.
– For example, the groups that presented in class identified audiences such as digitally active youth in Nigeria or Central Asian migrant workers in Russia.
4. Clearly explain the propagandistic threat or disinformation narrative your toolkit addresses.
– Describe the specific false, misleading, manipulative, or radicalizing message your audience is exposed to.
– Explain why this narrative is believable or emotionally powerful for the target audience.
– Identify the fear, anxiety, grievance, or vulnerability that the narrative exploits.
– Avoid speaking only in general terms such as “misinformation is spreading”; instead, name the specific narrative, theme, rumor, conspiracy, or campaign your toolkit is responding to.
5. Explain why your target audience is vulnerable to this narrative.
– Discuss the audience’s social, political, economic, cultural, or emotional context.
– Consider factors such as fear of job loss, distrust of institutions, dependence on family or community networks, exposure through social media, closed messaging groups, or previous experiences with authority.
– Connect your explanation to course concepts where appropriate.
6. Present the toolkit itself.
– Show what your group created or designed.
– This may include a website, Telegram bot, social media campaign, guide, poster, message template, intervention strategy, educational resource, game, fact-checking tool, or another practical product.
– If your toolkit is digital, be ready to show the interface, link, screenshots, prototype, or demonstration.
– If your toolkit is not fully built, clearly explain how it would function in practice.
7. Explain how the toolkit works step by step.
– Describe what a user would see, do, or experience when encountering your toolkit.
– Explain how the toolkit responds to the propaganda or disinformation narrative.
– If relevant, describe features such as:
1. Pre-bunking or “mental immunity” techniques.
2. Myth-versus-fact corrections.
3. Humor or non-confrontational messaging.
4. Source verification prompts.
5. Voice messages, images, short videos, or simple posts.
6. “Pause before sharing” reminders.
7. Community-based sharing through trusted people.
8. Practical advice for users facing real-world uncertainty.
8. Explain why your toolkit would be effective for this specific audience.
– Do not simply say that the toolkit provides correct information.
– Explain why the audience would actually notice it, trust it, and use it.
– Consider whether the toolkit uses platforms the audience already uses, such as WhatsApp, Telegram, Facebook, TikTok, Instagram, or other relevant channels.
– Explain how your design avoids sounding judgmental, elitist, hostile, or like an outside attack on the audience’s identity.
9. Address how your audience will find or receive the toolkit.
– Be prepared to explain your distribution strategy in detail.
– You should answer the question: How will the people most exposed to this propaganda actually encounter your toolkit?
– Consider strategies such as:
1. Posting in existing social media groups or messaging channels.
2. Working with trusted community leaders, influencers, admins, migrants, students, activists, journalists, or local organizations.
3. Using hashtags already connected to the issue.
4. Using paid advertisements or organic posts.
5. Optimizing search terms so the toolkit appears when people search for related misinformation.
6. Sharing content through existing trusted networks rather than asking users to adopt an unfamiliar platform.
10. Identify specific online spaces, communities, hashtags, influencers, or channels if possible.
– Based on the feedback given to the groups that presented in class, you should be ready for questions about exactly where your toolkit would be promoted.
– If your project depends on Telegram groups, Instagram pages, WhatsApp networks, TikTok creators, Facebook communities, hashtags, or search engine results, try to identify concrete examples.
– If you cannot identify exact names, explain what kind of channels or accounts you would look for and how you would evaluate whether they are useful.
11. Explain who the trusted messengers would be.
– If your strategy depends on trusted people sharing the toolkit, explain who those people are.
– These could include community leaders, diaspora figures, local influencers, social media admins, students, family members, religious leaders, migrant workers, health workers, journalists, or people already respected within the audience’s network.
– Explain why the audience would trust these messengers more than official institutions or outside experts.
12. Connect your toolkit to relevant course concepts.
– You should explicitly show how your project reflects ideas from the course.
– Possible concepts include:
1. Pre-bunking.
2. De-bunking.
3. De-radicalization.
4. Avoiding the backfire effect.
5. Avoiding identity threat.
6. Building trust before correcting misinformation.
7. Using insiders rather than outsiders.
8. Addressing underlying fears rather than simply attacking beliefs.
9. Changing behavior before trying to change belief.
10. Slowing down sharing or forwarding behavior.
13. Anticipate questions from the professor and classmates.
– The groups that presented in class were asked questions about how their toolkit would reach the target audience, which exact channels or hashtags they would use, what misinformation narratives they were countering, and why the audience would trust the toolkit.
– Prepare answers to similar questions for your own project.
– You should especially be ready to answer:
1. What exact propaganda or disinformation narrative are you addressing?
2. Why is your audience vulnerable to it?
3. How will your audience find your toolkit?
4. Why would they trust it?
5. Who will help distribute it?
6. What existing online communities or platforms will you use?
7. How does your toolkit avoid making users defensive?
14. Make sure your presentation is organized and easy to follow.
– A recommended structure is:
1. Target audience.
2. Propaganda/disinformation threat.
3. Why the audience is vulnerable.
4. Toolkit overview.
5. How the toolkit works.
6. Why it will work.
7. Distribution strategy.
8. Course concept connection.
9. Brief conclusion.
15. Prepare any materials you need before class.
– If you are using slides, make sure they are complete and ready to share.
– If you are showing a website, bot, social media page, or other digital prototype, test the link in advance.
– If you need to share your screen, make sure the correct group member is ready to do so.
– If your toolkit has a link, be prepared to post it in the chat.
16. Rehearse as a group.
– Practice the presentation at least once before Wednesday.
– Make sure each speaker knows when to begin and end.
– Check that the presentation does not go significantly over 7 minutes.
– Make sure your explanation is clear even for classmates who have not worked on your topic.
17. Be ready to present during class on Wednesday.
– The majority of Wednesday’s class time will be used for the remaining group presentations.
– Groups 2, 3, 5, 6, 7, and 8 should be prepared to present when called.