LFI Course Materials/Week 17: Algorithms as ideology
 
=== Week 17: Algorithms as ideology ===
 
* Real time lecture: September 13th, 10 am Pacific/1 pm Eastern

==== Overview ====  
 
ALGORITHMS AS IDEOLOGY: FIGHTING FASCISM IN MACHINE LEARNING & AUTOMATION

Algorithms shape our world more than ever. But whose world are they creating?
 
By understanding algorithms as a form of computationally imposed ideology, we begin to see how machine learning systems often reinforce pre-existing oppressive structures. White colonialism, the gender binary, and prisons and mass incarceration are just a few examples of ideologically guided systems that become further entrenched with the use of automation and artificial intelligence. In this talk, we will look at several case studies demonstrating how these technologies enforce oppressive ideologies and the structural violence they inflict on marginalized communities. We will discuss how these ideological systems can be meaningfully opposed, as well as how we might counteract them with our own visions of a more just and equitable future.
 
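To make the mechanism concrete before the lecture, here is a minimal, purely illustrative sketch (synthetic data and hypothetical categories, not drawn from the case studies above): a model fit to historically biased decisions simply learns to reproduce that bias, scoring equally qualified applicants differently based on a proxy attribute alone.

<syntaxhighlight lang="python">
# Minimal sketch: a "model" fit to historically biased decisions learns to
# reproduce the bias. All data is synthetic and hypothetical.
import random
from collections import defaultdict

random.seed(0)

# Synthetic history: (neighborhood, qualified, hired). Past decision-makers
# hired qualified applicants from neighborhood "A" far more often than
# equally qualified applicants from neighborhood "B".
history = []
for _ in range(10_000):
    hood = random.choice(["A", "B"])
    qualified = random.random() < 0.5
    rate = 0.9 if hood == "A" else 0.3  # the encoded historical bias
    hired = qualified and random.random() < rate
    history.append((hood, qualified, hired))

# "Training" here reduces to estimating P(hired | neighborhood, qualified)
# from the historical labels -- the same pattern a classifier fit to this
# data would converge toward.
counts = defaultdict(lambda: [0, 0])  # (hood, qualified) -> [hired, seen]
for hood, qualified, hired in history:
    counts[(hood, qualified)][0] += hired
    counts[(hood, qualified)][1] += 1

def score(hood, qualified=True):
    hired, seen = counts[(hood, qualified)]
    return hired / seen if seen else 0.0

# Two equally qualified applicants receive very different scores based on
# neighborhood alone: the model has automated the original disparity.
print("qualified applicant, neighborhood A:", round(score("A"), 2))  # ~0.9
print("qualified applicant, neighborhood B:", round(score("B"), 2))  # ~0.3
</syntaxhighlight>

Any automated system trained on such records inherits the disparity unless it is deliberately measured and corrected, which is one reason the talk treats "neutral" automation as an ideological choice.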
  
 
==== Readings ====
 
No readings this week! Please use the time to work on final projects.
  
 
==== Guest lecturer ====
 
Janus Rose is a New York City-based writer and educator who studies technology’s impacts on privacy and human rights. She is a senior editor at Motherboard, and her work has been featured in DAZED Magazine, The New Yorker, and other print and online publications.
  
 
==== Discussion ====
 
General discussion
  
 
==== Tasks ====  
 
* Discussion forum and small group work
