Lesson Report:
## Title: Completing OSINT Media Reports: Temporal/Source Verification and Transition to “Politics of Truth”
This session wrapped up the OSINT portion of the group media reports: students finalized the **time assessment** (when the media first appeared) and the **source assessment** (where and in what context it first appeared). The class then began transitioning to the next stage of the course, **political interpretation**: how media advances narratives, with what intent, and how effectively subsequent debunking works.
—
## Attendance
**Absent students explicitly noted as not present / not yet in class during roll/group setup: 6**
– Natalia (initially not present)
– Subhan (not present)
– Ainula (not present)
– Adidja/Khadija (not present)
– Kanake/Kaneki (not present)
– Alihan (not present)
*(Note: some students may have joined later; in the transcript they are marked as “not here yet / not here.”)*
—
## Topics Covered (chronological; detailed)
### 1) Session roadmap and objectives (wrap up OSINT + connect to politics)
– Instructor framed the plan for the evening:
– **Finish OSINT reports** on each group’s chosen media item (complete the shared report form).
– **Share out** across groups to consolidate findings.
– **Bridge to “politics of truth”**: identify political intentionality behind media/narratives and why they are powerful (selling an idea, dividing, chaos/instability).
– Students redirected to the **shared Google Doc** containing report templates and group work.
### 2) Group re-formation + logistical troubleshooting (Zoom breakout rooms)
– Instructor checked whether anyone had missed prior sessions and needed to be re-added to groups (notably **Elayim** and **Amina**, who had missed a prior class/week).
– Considerable time spent rebuilding breakout rooms due to Zoom limitation (rooms cannot be saved; renumbering issues).
– Clarification provided:
– **Room number mismatch didn’t matter**; what mattered was being with original groupmates.
– This section ensured groups were aligned before continuing OSINT tasks.
### 3) OSINT task instructions: Temporal assessment + Source attempt (core method)
Instructor gave explicit step-by-step instructions for completing the remaining OSINT fields in the report (an illustrative URL-building sketch follows the timing note below):
**A. Temporal assessment (time verification)**
– Use **reverse image search** on:
– **Google Images**
– **Yandex Images**
– Identify:
– The **earliest online appearance** (first posting date/time).
– The **context** in that earliest appearance.
– Compare contexts:
– Does the earliest context match the narrative where the group originally found it?
– Or is the media being “reframed” with a different story?
**B. Source attempt (origin tracing)**
– While reverse searching, document:
– **First location/platform** where it appears (site/account/platform).
– Any clues to **original uploader** vs. reposters/aggregators.
**C. Final intelligence assessment (if time)**
– If groups finished early, they should begin the concluding judgment:
– **Reliable information**
– **Misinformation** (real media attached to wrong context)
– **Disinformation/propaganda** (intentional narrative weaponization)
**Timing**
– Work period: ~10 minutes.
– Return time set (around the “:23” mark).
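For groups that prefer to script this step, the sketch below simply builds reverse-image-search links for Google Lens and Yandex Images from a media URL. It is an illustrative aside, not part of the class instructions: the endpoint patterns are assumptions that search engines may change over time, and `MEDIA_URL` is a hypothetical placeholder for a publicly reachable copy of the group's image or video frame.

```python
# Minimal sketch: build reverse-image-search links for one media URL so the
# whole group can open identical searches in Google and Yandex.
# The endpoint patterns are assumptions (search engines change them over time);
# MEDIA_URL is a hypothetical placeholder, not a real case file.
from urllib.parse import urlencode

MEDIA_URL = "https://example.com/media/frame_01.jpg"  # hypothetical still/frame

search_links = {
    # Google Lens reverse search by image URL (assumed endpoint pattern)
    "google": "https://lens.google.com/uploadbyurl?" + urlencode({"url": MEDIA_URL}),
    # Yandex Images reverse search by URL (assumed endpoint pattern)
    "yandex": "https://yandex.com/images/search?"
              + urlencode({"rpt": "imageview", "url": MEDIA_URL}),
}

for engine, link in search_links.items():
    print(f"{engine}: {link}")
    # Open each link and record, for every hit inspected: earliest posting
    # date/time, platform/site/account, and whether the surrounding context
    # matches the narrative where the group originally found the media.
```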
### 4) Group presentations: time assessment + source assessment findings (OSINT results)
After returning, instructor facilitated sequential group reporting focused on: **when first posted** + **where first posted**.
#### Group 1 — “Olympic Village” video + AI-altered version (Telegram amplification)
– Original video traced to **TikTok / CBS channel**, posted **Feb 3** (Adrian Arsenault identified as correspondent).
– AI-altered version appeared **~3 days later**, first found in **Russian-speaking Telegram channels**:
– A pro-Russian channel described as “Patriots” (~200k followers).
– Another large Telegram channel (“Zaytaslava”).
– Spread later via “random”/low-credibility accounts on Facebook/X.
– Instructor prompted the next-step analytical question:
– After debunking/fact-checking went public, **did reposting continue** or stop?
– This was framed as important evidence for “does debunking work?”
#### Group 2 — China/Taiwan compilation video (recycled archival footage + inflammatory headlines)
– Video described as a **compilation** of multiple clips (some stock/archival).
– Reverse searches (Yandex + Google) indicated footage spanning multiple years:
– Examples referenced: **2015**, **2021**, **2024**, **2025** (mix of events and contexts).
– The upload/source identified as **Times Now** (India-based English channel), posted **Feb 16, 2025**.
– Key OSINT finding: the compilation packages **old footage** as if it supports a current “China escalation” narrative.
– Important mismatch identified:
– Footage from **Taiwan military drills (Han Kuang exercises)** used as if it depicts **China** (student used AI translation from Chinese to English to interpret original context).
– Instructor emphasized:
– Misattributing Taiwan actions as China actions is a strong indicator of **disinformation**.
– Next step for political analysis: why would the channel do this; what is gained by “bombastic claims + misaligned evidence”?
#### Group 3 — Iran protests video (Ekbatan/Iqbatan town claim; X/Instagram circulation)
– Group reported the video was first uploaded **Feb 10** by the **Ahmad Qureshi** official X account, with Iran protest hashtags; it then spread on Twitter/Instagram.
– They also used AI tooling (e.g., Copilot) to assist in confirming date/place, concluding: “night in Tehran, Ekbatan town.”
– Instructor guidance:
– AI tools can help, but for OSINT rigor they should confirm using **non-AI reverse image search** (Google/Yandex) to find independent corroboration/other instances online.
#### Group 4 — Ukraine thermal power plant damage (multi-year similar attacks; mixed reporting)
– Group struggled because many posts covered similar attacks; they identified that similar incidents had been reported earlier (e.g., a **CNN 2024** report was mentioned).
– For the **specific image in their case**:
– First upload traced to **Feb 3, 2026**
– Attribution chain included **Reuters** and **Radio Free Europe/Radio Liberty** (reported as republishing after initial citizen image).
– Complexity:
– Possible earlier damage to the same plant in **2022**; group found similar attack-related imagery but could not confirm “inside building” matches.
– Instructor suggested an OSINT enrichment tool:
– **Google Earth time slider** to compare satellite imagery across months/years and track exterior damage progression (useful in Ukraine war OSINT).
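As an aside on that tip, the sketch below is a scriptable stand-in for the same before/after comparison, using the Earth Engine Python API rather than the Google Earth desktop time slider the instructor mentioned. Everything in it is an assumption for illustration: the coordinates and date windows are hypothetical placeholders, and the `earthengine-api` package requires its own account and authentication.

```python
# Hedged sketch: pull dated Sentinel-2 true-color thumbnails around a site to
# compare exterior damage over time (a scriptable alternative to the Google
# Earth time slider). Coordinates and date ranges below are hypothetical.
import ee

ee.Authenticate()   # one-time, browser-based
ee.Initialize()     # newer accounts may need ee.Initialize(project="...")

site = ee.Geometry.Point([30.52, 50.45])    # hypothetical lon/lat, not the real plant
region = site.buffer(1500).bounds()         # roughly a 1.5 km box around the site

for start, end in [("2022-02-01", "2022-04-01"), ("2026-01-15", "2026-03-01")]:
    scene = (ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
             .filterBounds(site)
             .filterDate(start, end)
             .sort("CLOUDY_PIXEL_PERCENTAGE")   # least cloudy scene first
             .first())
    url = scene.getThumbURL({
        "region": region,
        "dimensions": 768,
        "bands": ["B4", "B3", "B2"],            # true-color composite
        "min": 0,
        "max": 3000,
    })
    print(f"{start}..{end}: {url}")             # compare the thumbnails side by side
```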
#### Group 5 — Epstein files + AI image collage (parody account origin + screenshot verification issue)
– Group traced the collage narrative to the timing of the **DOJ release of “Epstein files” (Jan 30)**.
– Key timeline findings:
– One related AI-generated image published **Jan 31** by a **Twitter parody account** (“AI-powered meme engine…”).
– The analyzed collage post appeared **Feb 1** (first appearance they could find).
– Evidence included a screenshot where the purported creator “confirms” making the subject look like a baby to simulate an old photo.
– Instructor’s methodological push:
– A screenshot is not definitive; students must verify by checking the **original tweet on the account** (screenshots can be faked).
– Instructor also asked them to examine:
– Whether fact-checking reduced further spread (same “debunking efficacy” theme).
#### Group 6 — Israeli airstrikes / South Lebanon video/image (possible recycling; uncertain match)
– Group suspected recycling of older war imagery; found fact-checking context via **BBC** (context date mentioned **Nov 27, 2025**, but not the same image).
– Reverse search via **Yandex** suggested earliest similar appearance around **Nov 19, 2024** on a news site; many reposts across Facebook/TikTok/Instagram.
– Instructor highlighted uncertainty:
– Not fully confident the images are identical (angle differs; could be different events/places).
– If claiming a match, they must explicitly describe **how** they matched (landmarks, skyline, consistent features).
– Also noted repeated bombing events complicate “earliest appearance = debunk”; it may simply be a different strike.
#### Group 7 — Kyrgyzstan/Tajikistan conflict slideshow (cross-conflict photo reuse)
– Source: “Web of Tajik Music” YouTube channel (~6k subscribers); the video got ~10k views (likely wider circulation via messengers).
– OSINT finding: slideshow included images from unrelated contexts:
– Example: destroyed mosque photo from **Syria 2013**.
– Key verification case:
– A burned bus/village image was claimed to show the Kyrgyz side burning a Tajik village.
– Reverse image search showed it was actually from a **Kyrgyz village** (shop signs in Kyrgyz language visible), not Tajikistan.
– Timeline:
– Original photos posted **Sept 21, 2022** (both Kyrgyz news and an English-language international outlet).
– The misleading YouTube video posted **Sept 23, 2022**.
– Debunking/spread check:
– Student reported they couldn’t find many other reposts; the YouTube video remains, suggesting limited further spread post-debunking.
#### Group 9 (merged group due to absences) — Kyrgyz political leak photo (unclear origin; possible AI generation)
– Media: a photo allegedly showing Kyrgyz political/security figures (Tashiev, Dr. Bayev, others), discussed on social media amid the early-election petition context (an open letter by ~75 public figures in early Feb 2026).
– OSINT result:
– Photo appeared **mid-Feb 2026** with **no earlier record**.
– Went viral through **Telegram/WhatsApp**, making first uploader hard to identify (no metadata trail).
– Kyrgyz news sites (24KG, Cactus Media, OpenKG) reported after it spread, but did not claim authorship or provide metadata.
– Additional “location hypothesis” attempt:
– Student tried matching the interior to a billiard club (“Condor”) based on lighting; instructor was skeptical (common billiard lamps; other interior features differ).
– Instructor also noticed possible AI artifact cues (odd-looking hands), emphasizing that deeper Telegram tracing would be needed.
### 5) Transition to next phase: political message extraction (mini task)
– Instructor set up the “final stageâ€� for upcoming sessions:
– Return to the **original post** where the media was found.
– Identify the **political claim/message** implied: “a picture tells a thousand words—what words here?”
– Short breakout time (~2–3 minutes) to generate a one-sentence political message statement for each media item.
– Instructor closed by framing next week’s focus:
– Distinguish misinformation vs. disinformation vs. propaganda
– Study **intentionality** and **effects**
– Evaluate **debunking efficacy**
– Consider cases where propaganda uses **real facts** (weaponization without outright falsity)
### 6) Course logistics at end: readings + individual student issues
– Instructor announced **two short readings** to be posted on eCourse for next week.
– Post-class interactions:
– A student (Chinara) discussed a separate presentation project:
– Needed the final syllabus emailed.
– Instructor advised that their causal logic slide was overly linear/ambitious (historical memory → org capacity → activism → mass protests → institutional response → democratic resilience) and could be difficult to prove.
– Plan: student would email details; possible meeting later.
– Oferid asked about a late eCourse video submission (20 minutes late due to computer problems):
– Instructor confirmed it was fine; no penalty implied.
—
## Actionable Items (short bullets; organized by urgency)
### High urgency (before next class / next week)
– **Post on eCourse:** upload the **two short readings** mentioned (few pages each).
– **Students:** complete readings prior to next session.
– **All groups:** prepare the next report step: a clear statement of the **political message/claim** being advanced by their media in its original post context.
### Medium urgency (to strengthen reports/assessments)
– **Groups with debunking available (e.g., Group 1, Group 5, Group 7):**
– Check whether the media **continued spreading after fact-checking** (collect at least 1–2 examples of post-debunk reposts or evidence of decline).
– **Group 5 (Epstein AI collage):**
– Verify whether the “creator confession” exists as an **actual tweet**, not only a screenshot (screenshots are easily fabricated).
– **Group 3 (Iran protests):**
– Re-run verification with **Google/Yandex reverse image search** (not only AI tools) to corroborate earliest posting and context.
– **Group 6 (South Lebanon strike imagery):**
– If claiming image match across dates/angles, document **explicit matching criteria** (unique landmarks, skyline, terrain, building shapes) or note uncertainty clearly.
– **Group 4 (Ukraine power plant):**
– Optional enhancement: try **Google Earth historical imagery** to triangulate when visible exterior damage appears.
### Low urgency / admin follow-ups
– **Chinara:** email the instructor the **final syllabus** (as requested).
– **Chinara project:** follow up on the instructor’s email feedback; schedule a meeting if needed to refine causal model and data strategy.
– **Oferid:** no further action needed (late upload excused), unless instructor wants to document policy in course notes.
Homework Instructions:
ASSIGNMENT #1: Two short readings on propaganda, debunking, and the “politics of truth”
You will read two brief texts that prepare you for next week’s shift from OSINT verification (time/source assessments) to the political analysis we began discussing in class—especially intentionality, propaganda vs. misinformation, and whether/when debunking actually works.
Instructions:
1. Locate the readings.
1. Open the course page and find the two readings your professor said he “is going to be putting on eCourse.”
2. Confirm you have access to both documents (each should be “pretty short, only a few pages each”).
2. Read both texts actively (not just skimming).
1. Read Reading 1 all the way through once to understand the main argument.
2. Re-read Reading 1 more slowly and annotate or take notes on:
– Key terms/definitions (especially anything that helps distinguish misinformation, disinformation, propaganda, and “weaponized” facts).
– Any claims about what makes political narratives persuasive or powerful.
– Any discussion of debunking/fact-checking and what affects whether it succeeds or fails.
3. Repeat the same process for Reading 2.
3. Write down discussion-ready notes (so you can use them next week).
1. For each reading, produce a short set of notes that includes:
– 3–5 main takeaways (in your own words).
– 2 concepts/claims that connect directly to what we did in class (your OSINT report work: time assessment, source assessment, and the final intelligence assessment categories such as reliable vs. misinformation vs. deliberate propaganda).
– 1 question or point of confusion you want to raise in class.
4. Connect the readings back to your group media case (briefly).
1. Look back at your group’s media example and your report work from class (especially the OSINT findings you shared: where/when the media first appeared, whether context changed, and whether it seemed intentional).
2. Write 3–5 bullets explaining how ideas from the readings might help you analyze:
– The political intentionality behind your media item (“the politics of truth” framing from class).
– Whether debunking/fact-checking would likely reduce its spread—or why it might not.
5. Be prepared to use the readings next week.
1. Bring your notes to class (digital or printed).
2. Be ready to discuss how the readings help you move from “verification” (OSINT) to “political message/intentionality,” which your professor flagged as the focus for next week.
Deadline:
1. Complete both readings before next week’s class meeting (so you can participate in the discussion and apply the ideas to your media analysis).