Programmed Oppression
*This post has been written as part of my journey as a Ph.D. student through the University of Regina.*
Critical theory, the legacy of the theoretical work of the Frankfurt School, is centered on changing the circumstances of the oppressed. Giroux (2009) explains that critical theory in education “…enables educators to become more knowledgeable about how teachers, students, and other educational workers become part of the system” that is emphasized through organizational practices, messages, and values, often referred to as the hidden curriculum. In the realm of educational technology, the prevailing school of thought has treated the inclusion of technology in the classroom as a hierarchy in which those with increased access to devices, efficient broadband speeds, and high levels of classroom use are seen as more advanced than those without (Zhao, Zhang, Lei, & Qiu, 2016). While this mindset would lead one to believe that the oppressed are those without consistent access to educational technology, it cannot be ignored that technology itself has been programmed in a manner that clearly marginalizes, and even harms, specific groups of people.
When examining the role critical theory can play in educational research, Giroux (2009) identified three central assumptions drawn from the positivist perspective. The first looks at schools as a force to educate the oppressed about their situation and provide context as to where their group aligns within the hierarchy of oppressed versus oppressor. The second, education on how to effectively articulate oppression, requires in-depth analysis of one’s situation so that the cultural distortions of the oppressors are removed from the conversation. Lastly, education needs to establish a motivational connection in which the desires and needs of the marginalized population are identified and a vision for the future is established. With these assumptions in mind, technology developers need to evaluate not only their products but also the teams behind those products. What are the demographics of their programming teams? What perspectives are absent from the development team? What bias has been written into the code of the software, and what potential effect could it have on end users?
Giroux (2009) highlights that students internalize the cultural messaging of the educational structure through every aspect of their experience, including elements that may be considered “insignificant practices of daily classroom life”. The Microsoft Office 365 language suite boasts over 60 languages, none of which are Indigenous languages of the treaty territories of Manitoba. What messaging does this send to students? That Indigenous languages are not valued? That Indigenous peoples cannot be involved in technology development? Programmed oppression becomes more apparent with the introduction of artificial intelligence (AI). Many of the prominent programs that our students interact with, including search engines and social media, have been built by teams in which Caucasian cis-men are overrepresented. Leavy (2018) argues that “if that data is laden with stereotypical concepts of gender, the resulting application of the technology will perpetuate this bias”. This bias produces search results, photo selections, and ads that align with the perspectives of the programmers, leaving end users feeling that they are not represented in society.
While digital equity and access to technology continue to be central topics in the education world, it cannot be assumed that the availability of these tools will bridge the gap between the “haves and have-nots” of current society. The hierarchical approach to technology inclusion fails to address the systemic oppression that is written into the very code of the programs themselves. The hidden curriculum embedded within technology use needs to be openly discussed at all levels of education so that we can work together on changing the circumstances of the oppressed.
References
Giroux, H.A. (2009). Critical theory and educational practice. In A. Darder, M.P. Baltodano, & R.D. Torres (Eds.), The critical pedagogy reader (pp. 27-51). Routledge.
Leavy, S. (2018). Gender bias in artificial intelligence: The need for diversity and gender theory in machine learning. In Proceedings of the 1st International Workshop on Gender Equality in Software Engineering (GE ’18) (pp. 14-16). Association for Computing Machinery. https://doi.org/10.1145/3195570.3195580
Zhao, Y., Zhang, G., Lei, J., & Qiu, W. (2016). Never send a human to do a machine’s job: Correcting the top 5 edtech mistakes. Thousand Oaks, CA: Corwin.