LFI Course Materials/Week 17: Algorithms as ideology
=== Week 17: Algorithms as ideology ===

* Real time lecture: August 16th 10 am Pacific/1 pm Eastern

==== Overview ====
ALGORITHMS AS IDEOLOGY: FIGHTING FASCISM IN MACHINE LEARNING & AUTOMATION

Algorithms shape our world more than ever. But whose world are they creating? By understanding algorithms as a form of computationally imposed ideology, we begin to see how machine learning systems often reinforce pre-existing oppressive structures. White colonialism, the gender binary, and prisons and mass incarceration are just a few examples of ideologically guided systems that become further entrenched through the use of automation and artificial intelligence. In this talk, we will look at several case studies demonstrating how these technologies enforce oppressive ideologies and the structural violence they inflict on marginalized communities. We will also discuss how these ideological systems can be meaningfully opposed, and how we might counteract them with our own visions of a more just and equitable future.

Janus Rose is a New York City-based writer and educator who studies technology's impacts on privacy and human rights. Her work has been featured in DAZED Magazine, The New Yorker, VICE, and other print and online publications.
==== Readings ====

No readings this week!

==== Guest lecturer ====

Dr. Chris Gilliard, Professor of English at Macomb Community College, hypervisible.com

==== Discussion ====

* What is the relationship between digital redlining and privacy?

==== Tasks ====

* Discussion forum, small group work, and roadmap completion on wiki