Lesson Report:
**Title:**
Demand-Side Propaganda and Moral Intuition: From “What Is Propaganda?” to “Why Does It Work on Us?”
**Synopsis:**
This lesson reviewed and refined students’ working definition of propaganda, emphasizing it as a neutral *strategy* rather than inherently false or immoral content. The class then shifted to the “demand side” of propaganda—why humans are so susceptible to it—using Jonathan Haidt’s work on moral intuition vs. reasoning and student-provided media examples to explore how emotions precede and shape our “rational” political judgments. The session concluded by connecting this psychological foundation to the upcoming focus on partisan political communication, and by assigning students state-funded media artifacts for future analysis.
—
## Attendance
– **Number of students explicitly mentioned as absent:** 0
– Several reminders were given about the **camera-on requirement** for being marked present.
– At least one student (Ali) was prompted to turn the camera on and complied; no one was explicitly recorded as absent in the transcript.
—
## Topics Covered (Chronological, with Detail)
### 1. Review and Refinement of “Propaganda” (Warm-Up & Conceptual Grounding)
**Activity:**
– Quick individual recall (1–2 minutes) where students wrote down what *is* and *is not* propaganda, using prior class concepts.
– Followed by a whole-class share and instructor synthesis.
**Student contributions:**
– **Ruslan:**
– Propaganda as “information shared to influence someone’s opinion or action,” often not truthful.
– This reflected the *intuitive* pre-course conception: propaganda = lying/manipulation by governments.
– **Elayim:**
– Propaganda as a **strategy** to get people to believe or act in a certain way.
– Distinguished between:
– **Information** – “pure facts,” goal is to convey factual content.
– **Knowledge/Education** – goal is to help people learn and change opinions by broadening understanding; “tells you *how* to think.”
– **Propaganda** – goal is to change opinions/behavior about specific things; “tells you *what* to think.”
– **Iman:**
– Example: **Islamic Republic of Iran** and how state information control shapes public understanding of protests and government.
– Emphasized controlled flow of information to influence perceptions — a contemporary example of propaganda use.
– **Ivan:**
– Reiterated:
– Propaganda **tells you what to think**.
– Education/information aim to **broaden horizons** and “tell you how to think.”
– Gave examples:
– Propaganda: wartime posters.
– Information: a syllabus or factual document.
– Emphasized that propaganda **narrows perspective**, unlike education/information.
**Instructor consolidation:**
– **Core definition for the course:**
– Propaganda = **a strategy** using information (which may be true or false) to make an audience **believe something specific and/or do something specific**.
– It is **not inherently false** and **not inherently immoral**.
– **Key distinctions:**
– Propaganda vs. **Information**:
– Information: simply relays facts.
– Propaganda: deploys facts (or non-facts) within a **deliberate persuasive strategy**.
– Propaganda vs. **Education**:
– Education ideally equips people to evaluate and think for themselves.
– Propaganda bypasses or constrains that evaluation by steering people directly to specific conclusions.
– **Normative neutrality:**
– Propaganda can be:
– **True or false**
– **Beneficial or harmful**
– Being “good” or “true” does **not** make something “not propaganda.”
**Example used to illustrate “good propaganda”:**
– **Graphic images on cigarette packs** (diseased lungs, decayed teeth, severe illness).
– Factual, truthful representation of health consequences.
– Clear strategic aim:
– Belief: “Smoking is dangerous and disgusting.”
– Action: “Don’t start smoking” or “Quit if you smoke.”
– This is **propaganda** even if many consider it **socially beneficial**.
**Transition to new focus:**
– Last week: mostly **“supply side”** – What is propaganda? What counts as propaganda vs information or education?
– This week: **“demand side”** – Why are *we* (the audience) so vulnerable to it?
—
### 2. Framing the Week: Demand-Side of Propaganda & Readings Overview
**Objective:**
To introduce the psychological and political-science basis for human susceptibility to propaganda.
**Conceptual frame: Supply vs. Demand (Economics analogy):**
– **Supply side:**
– Actors who produce propaganda (e.g., governments, parties, interest groups, media outlets).
– They have agendas and create persuasive messages accordingly.
– **Demand side:**
– The **audience** and their psychological traits.
– Humans are:
– Easily influenced *en masse*.
– Frequently **overconfident** about their resistance (“I’m not easily manipulated”).
– Empirical research shows: people are **more vulnerable** to manipulation than they believe.
**Key questions posed:**
– Why are people psychologically and politically so vulnerable to propaganda?
– How do **emotion** and **rationality** interact in decision-making?
– How do these processes translate into **partisan attitudes** and receptivity to political communication?
**Assigned readings:**
1. **Haidt (pop-psychology, from a popular book):**
– Focus: When making decisions we consider “rational,” do we:
– First reason and then feel?
– Or first feel and then rationalize?
– Introduces the intuition vs. reasoning question.
2. **Van Bavel (political science / political psychology):**
– Focus: Why people have strong **partisan identities** and diverging political beliefs.
– How political communications influence us so easily.
– Explores group identity and motivated reasoning in political contexts.
—
### 3. Administrative Note: Camera-On Policy
**Reminders for new and existing students:**
– To be marked **present**, students **must keep cameras on** for the duration of the class.
– If a student cannot keep the camera on (technical or “societal” reasons):
– They must **email the instructor and CC the department chair** explaining the situation.
– This policy had been sent in an earlier email and was re-emphasized for clarity.
—
### 4. Introduction to Haidt: Rationality vs. Emotion in Moral Judgment
**Objective:**
Prepare students to understand Haidt’s experimental work by clarifying the concept of “rational thinking” and the intuition/reasoning distinction.
**Class discussion: What does it mean to be rational?**
– Instructor prompted: “What does ‘rational’ mean in everyday and academic usage?”
– **Student responses:**
– Rational thinking = using **logic, reason, and facts rather than emotion** to make decisions.
– Linked to **rational choice theory** in political science:
– Decisions based on cost–benefit analysis, maximizing utility.
– Another student: a rational decision “makes sense based on the information/data we have.”
– Instructor:
– Connected this to the **dual-process** idea:
– “Cold” cognitive system: evaluates pros/cons, facts, probabilities.
– “Hot” emotional system: feelings, moral intuitions, desires, values.
– Framed Haidt’s core question:
– When people face a moral/political question:
– Is the *rational* system primary, with emotions following?
– Or do **emotions/intuitions react first**, and reasoning follows as a justification?
—
### 5. Haidt-Inspired Moral Intuition Exercise: The “Pet Dog” Scenario
**Disclaimer & setup:**
– Instructor warned that the upcoming example would be **gross and possibly disturbing**, used intentionally to surface strong **moral emotions**.
– Students read a short scenario (not fully transcribed, but clearly recognizable) in chat:
– A family’s **pet dog** is killed by a car.
– The dog is already dead; the family does not cause its death.
– They decide to **cook and eat** the dog.
– This act “brings the family closer together.”
– It happens in **private**; no one outside the family knows.
**Initial reactions:**
– Students described the story as:
– “Disgusting,” “unsettling,” “disturbing,” “silly/goofy in a dark way,” “sorrowful.”
– One student noted cultural variation (“for Chinese it’s okay, for us not”), hinting at **cultural relativism**.
– Instructor asked for a quick judgment:
– Poll: Was what the family did **morally okay** or **morally wrong**?
– Almost unanimous choice: **morally wrong**.
**Debate structure:**
– Class split into **4 breakout rooms**, labeled by position:
– Rooms 1 & 3: Argue the family’s action was **morally wrong** (“Wrong” rooms).
– Rooms 2 & 4: Argue it was **morally acceptable** (“Right” rooms).
– Critical constraint:
– Students were **not allowed** to use purely emotional or vague terms (e.g., “gross,” “disgusting,” “it just feels wrong”).
– They had to build arguments based on **harm, rights, justice, law, cultural norms, measurable social effects**.
#### 5.1. Arguments Asserting the Family Was “Morally Wrong”
**From Room 1 / 3 reports:**
1. **Violation of cultural norms & values**
– Many cultures treat dogs (and pets generally) as **companions**, not food.
– Eating a pet dog **breaks deeply held social and moral norms**.
– Student argument:
– People’s behavior is shaped by cultural conditioning; when culture normalizes something like eating pets, it may **distort natural moral intuitions**.
– This was analogized to **propaganda** and social conditioning: culture as a propagandistic force that can override moral intuitions.
2. **Violation of law and social order**
– In many jurisdictions, eating pets (or handling animal remains this way) may be **illegal** or tightly regulated.
– Even if done privately, it can be seen as part of **erosion of legal/ethical standards**:
– Tolerating serious norm violations in private could normalize them socially over time.
– Thus, the act is wrong because it **violates or undermines law and public norms**.
3. **Animal rights / justice to animals**
– Even after death, animals may be thought to have certain **moral standing** and dignity.
– Eating a pet that loved and trusted the family violates implicit **duties of care and respect**.
– One student emphasized: “If you love a dog, you shouldn’t then eat it after it dies” – framed as a justice/fairness issue toward a sentient being.
4. **Health and public safety (disease risk)**
– Dogs can carry diseases; eating dog meat could be **unhealthy** or dangerous.
– Therefore, the act could be wrong on public health grounds: risking the family’s health (and potentially broader community health depending on practices).
**Instructor synthesis for the “wrong” side:**
– Four primary rationalizable claims:
1. **Breaking cultural values** is inherently wrong.
2. **Breaking the law** (where applicable) is morally wrong.
3. **Violating animal rights/dignity** is wrong.
4. Consuming dog meat is **unhealthy/unsafe**.
#### 5.2. Arguments Asserting the Family’s Action Was “Morally Acceptable”
**From Rooms 2 & 4:**
1. **No additional harm to the animal**
– The dog was **already dead** due to a car accident.
– The family did not cause pain, suffering, or death.
– Therefore, **no new harm** was done to the dog by the decision to eat it.
2. **Cultural relativism & variation in food taboos**
– In some cultures, **eating dog** is accepted and integrated into local cuisine.
– A student drew a parallel with **horse meat**:
– They shared an experience of eating an old working horse in a mountain community.
– In that context, it was perceived as entirely normal and ethically acceptable.
– However, in other cultures (e.g., US), eating horse evokes disgust similar to eating dog.
– Thus, whether this act is wrong **depends on cultural norms**, and from the family’s cultural vantage point, it might be “normal.”
3. **Cleanliness and care of the animal**
– As a **family pet**, the dog likely received good veterinary care and clean living conditions.
– Therefore, from a **health perspective**, the meat may have been safer than that of stray animals or poorly farmed livestock.
4. **Privacy and lack of broader social harm**
– The act was done **in private**; no one outside the family witnessed it.
– It did not directly cause trauma or offense to others, nor did it set a visible public precedent.
– Therefore, arguments about “social harm” via bad example are weaker.
5. **Psychological outcomes for the family**
– The text explicitly states that eating the dog **brought the family closer together** rather than traumatizing them.
– From a utilitarian standpoint, this created **positive emotional outcomes** (bonding), not harm.
– One student explicitly referenced the text’s claim to show that assumptions about trauma were contradicted by the scenario.
**Instructor synthesis for the “right” side:**
– Counterpoints to the “wrongâ€� arguments:
– **Animal’s rights**: what specific *right* is violated post-mortem?
– **Harm**: no additional suffering inflicted; animal is already dead; family derives positive bonding.
– **Law**: the scenario does not specify illegality; even if some jurisdictions forbid it, others do not.
– **Cultural norms**: what seems abhorrent in one culture is ordinary in another (dog vs. horse vs. cow, etc.).
#### 5.3. Thought Experiment: Changing Details of the Scenario
**Instructor-led variations:**
– Replace **“dog” with “cow”**:
– Pet cow killed by a car, family eats it.
– Most students found this **less disturbing**, though still somewhat odd if the cow was a *pet*.
– Remove the “pet” aspect:
– A family with a farm cow (not a pet) that dies; they butcher and eat it.
– Most emotional charge **dissipates**; scenario becomes quite ordinary.
**Insight surfaced by students:**
– **Attachment level** is critical:
– As one student (Chinara) noted:
– If it’s a **pet** with emotional bonds, eating it feels like a betrayal.
– If it’s “just livestock,” cultural expectations differ.
– Thus, two key emotional triggers:
1. The species category (**dog** vs. typical food animals).
2. The relational status (**pet** vs. livestock).
—
### 6. Connecting Back to Haidt’s Theory and the Demand Side of Propaganda
**Instructor’s analytic takeaway:**
– The exercise is not about endorsing dog-eating; the instructor explicitly noted personal discomfort and disapproval.
– **Core learning objective:**
– To illustrate Haidt’s finding that:
– When confronted with a moral scenario, people:
– Experience a **strong intuitive/emotional reaction first** (e.g., disgust).
– Then use their **rational faculties to justify** this initial reaction.
– This often produces **“moral dumbfounding”**:
– People insist something is wrong, but struggle to produce **consistent, harm-based, or rights-based arguments** that withstand counterexamples.
– **Implications for propaganda analysis:**
– Propaganda **targets our emotions and group-based intuitions** first.
– We typically **overestimate** how rational and independent our political views are.
– As analysts, students must:
1. Acknowledge that **emotion comes first, reasoning second** in many real-world judgments.
2. Learn to **separate feelings from facts**:
– Identify: What exactly am I feeling?
– Identify: What are the actual empirical claims being made?
– Ask: Are my conclusions supported by evidence or primarily by emotional resonance?
—
### 7. Application to Student-Chosen Media: Emotion → Belief/Action
**Objective for this segment:**
To connect Haidt’s model and the moral intuition exercise back to **real-world media and political communication**, including propaganda.
**Task reminder:**
– From the previous class, students had selected a **social media post** that made them feel a **strong emotion**.
– Now they were asked to:
1. Briefly recall what the media artifact was.
2. Describe the **emotions** it elicited.
3. Analyze **what the poster/organization wanted from them**.
4. Explain how those emotions were being used to **guide beliefs or actions**.
Students had ~3–5 minutes to revisit their chosen posts, then shared.
#### 7.1. Example 1: Epstein Files & Religious Framing (Instagram)
**Student description:**
– A **reposted Instagram thread** about the **Jeffrey Epstein files**.
– Included an image of a man with a little girl (implication: pedophilia, abuse).
– Textual framing:
– “If this man were Muslim, everyone would be discussing it, but he’s Jewish.”
– The post shifted emphasis from:
– The *specific crimes* (sexual exploitation/trafficking) to
– A **religious/ethnic double standard** in public and media reactions.
**Emotional impact:**
– Induced:
– **Anger** and frustration (especially at perceived anti-Muslim bias).
– Additional shock layered on top of existing outrage about Epstein’s crimes.
– Student noted that responses in comments were largely **emotionally driven**, with little verifiable information.
**Instructor probing & student analysis:**
– Emotional “hook”:
– Leverages **religious identity** and perceptions of **unfair treatment** of Muslims vs. Jews.
– Religion is deeply tied to personal and group identity; references here can powerfully **amplify outrage**.
– What did the poster likely want?
– Upon checking the original poster (a woman), the student inferred:
– She wanted to **keep the Epstein issue salient**—preventing it from being forgotten or buried.
– The explicit message included something like “don’t stop talking about the Epstein files.”
– Strategic use of emotion:
– By framing the scandal as part of **systemic religious bias**, the post potentially recruits **new audiences** who may be less focused on Epstein per se but more sensitive to Islamophobia or double standards.
– Link to propaganda logic:
– This post blurs information and propaganda:
– It mentions a real scandal (Epstein).
– But leverages **identity-based anger** to maintain or increase engagement.
– It redirects attention from crime alone to **macro-level injustice claims** about religions/ethnicities.
#### 7.2. Example 2: Bishkek Patrol Officer Pushing an Elderly Man (Local Case)
**Media artifact:**
– A video posted by **Cactus Media** (Kyrgyzstani outlet), circulated on their site and Instagram.
– Content:
– A **patrol officer** managing a street as a **motorcade** passes.
– An **elderly man** attempts to cross the road.
– The officer pushes him aside forcefully; the man falls, appearing vulnerable and hurt.
– The video went viral and was heavily reposted across local media and social accounts.
**Emotional response:**
– The student reported:
– **Shock**, fear, **deep sympathy**, and **anger**.
– Emphasis on the man’s **vulnerability** (elderly, frail) and the absence of respect or protection.
– Culturally, treating elderly people this way is seen as highly unacceptable, making the video more emotionally charged.
**Information vs. propaganda discussion:**
– Student initially classified this as **information**, because:
– There is **concrete evidence** (clear video footage).
– The coverage used relatively **neutral language** describing what happened.
– But under discussion with the instructor:
– They explored how even “information” becomes **propaganda-like** when:
– It disproportionately features **emotionally provocative cases** (e.g., vulnerable victims, injustice).
– It is repeatedly reposted because it **generates engagement** (clicks, comments, shares).
– Instructor prompted a what-if:
– If the victim had been a **young, able-bodied man** instead of an elderly person:
– Would the post have gone as viral?
– Would reactions have been as outraged?
– Likely not—perhaps more humor or indifference, less moral condemnation.
– Thus, the **choice of clip** and **framing** is not neutral:
– It intentionally or unintentionally capitalizes on emotion (empathy for elders, anger at abuse of power).
**Media incentives:**
– Instructor emphasized:
– Even if news outlets are not cynically thinking solely in “dollar signs,” there is a **real economic and attention incentive** to feature **outrage-inducing content**.
– Emotional content:
– Drives higher **engagement**.
– Strengthens outlet visibility and relevance.
– This dynamic is central to understanding how propaganda and commercial media models intersect.
—
### 8. Closing Synthesis and Forward Link
**Main conceptual takeaway for this lesson:**
– To analyze propaganda effectively, students must:
– Recognize that **emotions are not optional add-ons**; they are **primary drivers** of moral and political judgment (per Haidt).
– Learn to **disentangle emotional reactions from empirical claims** in any media artifact.
– Identify emotional appeals: fear, anger, disgust, pride, empathy, etc.
– Identify factual content: what claims can be checked, verified, or falsified?
– Ask: How is the emotional content being used to **steer my reasoning** toward a particular belief or action?
**Next conceptual step (for upcoming classes):**
– Move from Haidt’s individual-level moral psychology to **Van Bavel’s** work on:
– Partisan identity.
– Group-based motivated reasoning.
– Why political messages (including state propaganda) resonate so strongly with in-groups and polarize societies.
—
## Actionable Items
### High Priority – Before the Next Class
– **State-funded media assignment:**
– Each student must:
– Identify **one state-funded or state-sponsored news outlet**
– Examples mentioned: RT (Russia Today), CGTN (China Global Television Network), NPR (US), or international outlets funded by the U.S. such as Azattyk/Radio Liberty.
– From that outlet, select **one 1–2 minute video** covering a **current global event** (e.g., Gaza, war in Ukraine, other significant international issues).
– Bring/link this specific video to class for analysis.
– **Complete required readings:**
– Finish the assigned chapter/passage from **Jonathan Haidt**.
– Read the **short, one-page infographic/summary of Van Bavel**.
– Optional (for deeper understanding): read the **full 25-page Van Bavel article** posted on the course platform.
### Medium Priority – For Upcoming Weeks (Not necessarily due next class)
– **Deepen analysis of your previously chosen social media artifact (from last week):**
– You’ve already hypothesized *why* the creator posted it (what they wanted from you).
– Next step (for future class):
– Think about **what evidence would be needed** to *support* your hypothesis about their motives.
– E.g., patterns in their other posts, funding sources, organizational mission, cross-posting networks, etc.
– You do **not** need to collect this evidence yet; just identify what would count as convincing proof.
– **Retain your chosen artifacts:**
– Keep:
– The **social media post** you analyzed this week.
– The **state-funded media video** you will select.
– These will be used in future exercises on:
– Distinguishing information vs. propaganda.
– Tracing emotional appeals.
– Inferring and then evidencing the strategic goals of communicators.
### Ongoing / Administrative
– **Camera policy compliance:**
– Keep cameras **on** for the full duration of each class to be marked present.
– If a student cannot comply:
– Email the instructor and **CC the department chair** explaining the reason (technical, societal, etc.).
– **Email communication:**
– Ensure you have read the earlier **administrative email** containing:
– Camera policy details.
– Contact information for the department chair.
– Zoom and e-course links as applicable.
### Individual Follow-Ups
– **Student attending from another class (Public Policy Analysis overlap):**
– One student (Felicia) confirmed she now has permission from Prof. Gorkin to attend this course **online** due to schedule conflicts.
– Agreed procedure:
– She will use **this same Zoom link** for future online attendance.
– No additional textbooks or readings are required for the **upcoming week** in the *other* (e-course) class she referenced (per instructor’s confirmation).
– Instructor action: No further action needed unless scheduling or access issues arise.
—
This report should allow you to reconstruct the lesson: from the conceptual refinement of propaganda, through Haidt-based moral intuition exercises, to the concrete application of those ideas to real-world media and the setup for analyzing state-funded propaganda in upcoming sessions.
Homework Instructions:
ASSIGNMENT #1: Evidence Planning for Your Propaganda Artifact
**Objective:** Deepen your understanding of how emotions and facts interact by revisiting the emotionally charged social media artifact you previously selected and thinking systematically about what *evidence* you would need in order to support your claims about the poster’s intentions and strategy.
Instructions:
1. Reopen the social media post or online artifact you chose for last week’s activity (the one that made you feel a particularly strong emotion and that you analyzed for “what they wanted from you”).
2. In your notes, briefly restate:
1. What the artifact is (platform, type of content, topic).
2. What emotions it evoked in you (e.g., anger, fear, disgust, pride).
3. What you concluded the poster/organization wanted from you as a viewer (e.g., to support a cause, hate a group, feel outrage, donate, share, vote a certain way).
4. How you thought they were using emotion to guide your beliefs or behavior (tying back to the Haidt discussion about emotions coming *before* rationalization).
3. Now, keep your existing hypothesis about their motives, but switch into an “evidence planner” role:
1. Ask yourself: *If I wanted to prove that my hypothesis about their intentions is actually correct, what concrete evidence would I need?*
2. Make a list of **specific types of evidence** that would support (or refute) your hypothesis. For example, depending on your artifact, this might include:
– Funding information for the account or organization (donors, government ties, party affiliations).
– Internal documents, interviews, or mission statements that state their goals.
– A consistent pattern in their posts (e.g., always targeting one group, always framing a topic the same way).
– Coordination with other accounts or campaigns on the same message.
– Metrics they use or celebrate (e.g., “we increased people’s fear/engagement on topic X”).
4. For each type of evidence you list, write 1–2 sentences explaining:
1. **Why** this evidence would be relevant to judging their motives.
2. How, if you found it, it would either **strengthen** or **weaken** your original claim about what they want from you.
5. You do **not** need to collect this evidence yet. For now, focus only on:
– Clarifying what would *count* as good evidence.
– Making sure your list is as concrete and realistic as possible (things a researcher could, in principle, actually look for).
6. Keep these notes somewhere you can easily access later. You will return to this artifact and your evidence plan in a future class, so be ready to quickly recall:
– The artifact,
– Your original interpretation,
– And your list of potential evidence.
ASSIGNMENT #2: Select a State-Funded News Outlet and a Short Video
**Objective:** Prepare for our upcoming analysis of the “demand side” of propaganda by choosing one state-funded news outlet and a short video about a current global event, which you will later dissect in terms of how it uses emotion to guide viewers’ beliefs—just as we discussed with Haidt’s findings and our in-class examples.
Instructions:
1. **Choose a state-funded media outlet.**
1. Identify a news organization that receives a **significant amount of its funding from a government**.
2. Acceptable examples (mentioned in class) include:
– RT (Russia Today – funded by the Russian government),
– CGTN / China Global Television Network (funded by the Chinese government),
– NPR (receives U.S. public/government funding),
– Azattyk/Radio Free Europe–Radio Liberty (funded by the U.S. Congress),
– Or another clearly state-funded outlet of your choice.
3. If you are unsure whether an outlet is state-funded, quickly check its “About” or “Funding” page, or look it up to confirm that it receives government support.
2. **Pick one short news video from that outlet.**
1. Within that outlet’s website, YouTube channel, or official social media, find a video that:
– Is approximately **1–2 minutes** long (shorter is fine; do not choose a 30–60 minute documentary).
– Covers a **current global or international event** (e.g., the war in Gaza, the war in Ukraine, international protests, elections, climate-related disasters, etc.).
2. Prefer a straightforward news or “explainer” segment rather than a purely entertainment clip.
3. **Watch the video carefully at least twice.**
1. On the **first viewing**, just note:
– What event or issue is being covered.
– The basic storyline and key claims.
2. On the **second viewing**, pay closer attention to:
– The language used (is it neutral, emotional, combative, sympathetic?).
– The choice of images and footage (which scenes are shown, which are *not* shown).
– Music, voice tone, pacing, and any captions or on-screen text.
– Any clear “good guys” and “bad guys” implied in the narrative.
4. In brief notes (bullet points are fine), write down:
1. The **name of the outlet** and **country** it is funded by.
2. The **title of the video**, its **date**, and a **working URL/link**.
3. A 2–3 sentence summary of:
– What the video says happened.
– Who is portrayed positively and who negatively (if anyone).
– What emotion(s) it seems designed to evoke in the viewer (e.g., fear, sympathy, anger, pride, indignation).
5. Bring this video (and your notes) ready for the next class:
1. Make sure you can quickly access the link during class (bookmark it or have it in your notes).
2. Be prepared to discuss:
– Which emotional reactions the video is trying to trigger.
– How those emotions might guide viewers’ “rationalâ€� judgments—connecting directly to Haidt’s argument that emotions often come first and reasoning follows.
ASSIGNMENT #3: Reading – Haidt Chapter and Short van Bavel Text
**Objective:** Strengthen the theoretical foundation for our work on propaganda by reading the assigned chapter by Haidt and the short van Bavel piece, so that you can connect our “gross debate” and the dog/cow/pet examples to empirical research on how moral judgment and partisan thinking actually work.
Instructions:
1. Locate the assigned readings in the course materials:
1. Find the **Jonathan Haidt** chapter that has been assigned for this course (the same one we referred to in class when discussing moral disgust and whether emotion or reason comes first).
2. Find the **short version of the van Bavel reading** (the one-page infographic/summary that has been uploaded).
3. Optional: locate the **full van Bavel article** (about 25 pages) that has also been posted, if you wish to go deeper.
2. **Read the Haidt chapter carefully.**
1. As you read, focus on:
– The central research question: *When people make moral or political judgments, do they lead with rational analysis or with emotion?*
– The experimental designs Haidt describes (e.g., scenarios similar to the disturbing story we discussed in class).
– What “moral dumbfounding” means and why it matters.
2. In your notes, briefly answer for yourself:
– What does Haidt argue about the relationship between **emotion** and **reason**?
– How does this help explain our strong reactions to certain scenarios (like the family eating the dog) even when no clear harm is present?
3. **Read the short van Bavel text (the one-page infographic/summary).**
1. As you read, concentrate on:
– How van Bavel connects psychology to **political behavior** (e.g., partisan identity, group loyalty, susceptibility to political messaging).
– Any claims about why people hold strong partisan views and why they can be easily influenced by political communication.
2. In your notes, jot down:
– 2–3 key ideas about **why people are so vulnerable to political manipulation**.
– At least one way these ideas relate to your own experiences with political or news media.
4. (Optional but recommended if you have time) **Read the full van Bavel article.**
1. If you choose to read it, focus on:
– The main arguments about identity, group psychology, and political persuasion.
– Any models or frameworks that seem especially helpful for analyzing propaganda.
5. As you read both Haidt and van Bavel, explicitly connect them to our class discussion:
1. Think about how Haidt’s findings showed up in your own reaction to the “family and the dog” scenario—where did you feel emotions first, then scramble to find reasons?
2. Think about how van Bavel’s ideas on partisanship and group identity might apply to:
– Your earlier social media artifact (from Assignment #1),
– And the state-funded news video you are selecting (Assignment #2).
6. Come to the next class prepared to:
1. Explain Haidt’s basic claim about emotion vs. reason in moral judgment in your own words.
2. Summarize at least one concrete takeaway from van Bavel about why people are so vulnerable to political and partisan propaganda.
3. Begin applying these ideas to your chosen state-funded news video and your earlier social media example.