
Confronting Digital Extremism (English)

Course Information

The University of California, Irvine’s Office of Inclusive Excellence launched “Confronting Extremism” in 2017 as “a year-long campus initiative dedicated to understanding the ideas and behaviors advocated far outside of alignment to the campus values for social justice and equity in today’s society as a means to identify pathways for building positive campus and democratic communities” (see https://inclusion.uci.edu/confronting-extremism/). As part of this broader university initiative, we have developed four teaching modules on the topic of digital extremism that are designed to help raise awareness about different modes of extremist activity in online environments and propose effective means of confronting them.
IMPORTANT NOTE: The only items that are covered by Creative Commons licensing are the videos and slides prepared by Stephen Rea, and the learning activity documents appearing in each module. All other items are NOT covered by Creative Commons licensing.
University of California, Irvine
Author: Stephen Rea
Creative Commons License
This work (Confronting Digital Extremism by Rea, Stephen) is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 United States License.

Module 1: Case Studies in Disinformation

This module defines three types of media manipulation—propaganda, misinformation, and disinformation—and briefly summarizes three contemporary examples of coordinated disinformation through digital media channels: Russian military intelligence operations in Ukraine in 2014; the Duterte campaign’s social media strategy during the 2016 Philippine Presidential election; and the Internet Research Agency’s use of fake Facebook groups during the 2016 US Presidential election.

Video — Prepared by Instructor

Module 1 Presentation

Module 1 Slides (PDF Format) (English)

Video — External (not covered by Creative Commons licensing)

Frontline PBS. (2018). "The Facebook Dilemma — How Facebook was 'Weaponized' in Ukraine"

Readings

Ong, J. & Cabañes, J. (2018). Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines. Leicester: The Newton Tech4Dev Network.

Kim, Y. (2018). "Beware: Disguised as Your Community, Suspicious Groups May Target You Right Now for Election Interference Later." Madison, WI: Project DATA (August).

Additional Readings for Module 1 — Confronting Digital Extremism (English)

Learning Activity

Module 1 Learning Activity: What is your feed telling you? (English)

Module 2: Trolls and Extremists

This module compares the activities of so-called Internet “trolls” with those of digital extremists, defined as both disinformation campaigners and political extremists (focusing specifically on far-right, hate-based extremism). In particular, it identifies similar strategies and techniques that are employed by both groups in their online activities, such as the use of image-centric memes, leaderless mobilization, and targeted harassment. The circumstances surrounding #GamerGate in 2014 afford a window into online trolling’s convergence with more overtly political extremism.

Video — Prepared by Instructor

Module 2 Presentation

Module 2 Presentation Slides (Downloadable PDF) (English)

Video — External (not covered by Creative Commons licensing)

SciShow. (2016). “The Psychology of Trolling.”

Readings

Daniels, J. (2018). "The Algorithmic Rise of the Alt-Right." Contexts 17(1): 60-65.

MacKinnon, R. & Zuckerman, E. (2012). "Don't Feed the Trolls." Index on Censorship 41(4): 14-24.

Phillips, W. (2015). "Race and the No-Spin Zone." In This is Why We Can't Have Nice Things: Mapping the Relationship Between Online Trolling and Mainstream Culture. Cambridge, MA: The MIT Press. Pp. 95-113.

Additional Readings for Module 2 — Confronting Digital Extremism (English)

Learning Activity

Module 2 Activity: Debating Section 230 (English)

Module 3: Algorithmic Exploitation

This module closely examines some of the techniques that digital extremists use to exploit algorithmic processes on social media platforms and search engines. It highlights three specific techniques using real-world examples: how automated social media accounts called “bots” helped drive narratives and engagement around specific topics or stories during the 2016 US Presidential campaign; how Russia’s Internet Research Agency used fake Facebook groups and Facebook’s targeted advertising services to coordinate voter suppression in the same election; and how a white supremacist group exploited “data voids” in online searches to promote its hate-based ideology and, indirectly, influence a 2015 mass shooting.

Video — Prepared by Instructor

Module 3 Presentation

Module 3 Presentation Slides (Downloadable PDF) — Confronting Digital Extremism (English)

Video — External (not covered by Creative Commons licensing)

TED Talks. (2017). “Zeynep Tufekci: We’re Building a Dystopia Just to Make People Click on Ads”

Readings

Vaidhyanathan, S. (2018). "The Disinformation Machine." In Anti-Social Media: How Facebook Disconnects Us and Undermines Democracy. New York: Oxford University Press. Pp. 175-195.

Nadler, A., Crain, M., & Donovan, J. (2018). "Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech." New York: Data & Society.

Golebiewski, M. & boyd, d. (2018). "Data Voids: Where Missing Data Can Easily Be Exploited." New York: Data & Society (May).

Noble, S. (2018). "Searching for People and Communities." In Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press. Pp. 110-118.

Additional Readings for Module 3 — Confronting Digital Extremism (English)

Learning Activity

Module 3 Learning Activity: Exploring Algorithmic Recommendations (English)

Module 4: Toward a New Digital Civics

This module proposes lessons for “a new digital civics,” that is, a curriculum that addresses the affordances and hazards of participation in the digital media ecosystem. Going beyond media literacy and fact-checking efforts, a new digital civics examines the technological capacities of social media platforms, search engines, and other aspects of digital media; how extremists have exploited those capacities to further their own messages and agendas; and practical strategies for confronting digital extremism. Three such strategies are introduced, targeted at different stakeholders: learning how to interpret and react to social media metadata; practicing “strategic silence” in news media coverage of digital extremists; and incentivizing the digital media ecosystem’s gatekeepers to use their powerful positions to nullify extremists’ ability to engage in exploitation and manipulation.

Video — Prepared by Instructor

Module 4 Presentation

Module 4 Presentation Slides (Downloadable PDF) — Confronting Digital Extremism (English)

Videos — External (not covered by Creative Commons licensing)

WIRED. (2017). “Argument Clinic: How the Internet Tricks You Into Thinking You’re Always Right.”

Quartz. (2018). “Five Ways to Spot Fake News.”

Readings

Acker, A. (2018). "Data Craft: The Manipulation of Social Media Metadata." New York: Data & Society.

boyd, danah. (2014). It's Complicated: The Social Lives of Networked Teens (excerpt). New Haven: Yale University Press. Pp. 180-192.

Gillespie, T. (2018). "The Myth of the Neutral Platform." In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media. New Haven: Yale University Press. Pp. 24-44.

Additional Readings for Module 4 — Confronting Digital Extremism (English)

Learning Activity

Module 4 Learning Activity: Digital Civics in Practice (English)