Students at computers. (Photo by Pro2sound/Getty Images)
“University of Florida Eliminates all DEI-Related Positions,” read a March 2, 2024, New York Times headline. The article documented how Florida’s decision to terminate funding for Diversity, Equity, and Inclusion (DEI) programs led the University of Florida to remove all DEI-related positions from its campus. This is but one of a series of stories about how states such as Alabama and Indiana are working to eliminate DEI programs and content in education.
While these anti-DEI efforts have received extensive media coverage, little attention has been paid to how educational technologies (ed-tech) undermine the mission of DEI advocates.
DEI work is an outgrowth of affirmative action policies born of Executive Order No. 10925, signed by President John F. Kennedy in 1961. The order and subsequent legislation led schools, largely voluntarily, to adopt affirmative action policies that, in education specifically, sought to increase the representation of historically underrepresented groups, such as women and people of color. Work done in the name of affirmative action never settled comfortably into the United States’ hyper-individualist culture. From the start, the courts wrestled with affirmative action, a struggle that culminated in the 2023 Students for Fair Admissions v. Harvard decision, which effectively outlawed the practice on the grounds that current policies “lack sufficiently focused and measurable objectives warranting the use of race, unavoidably employ race in a negative manner, involve racial stereotyping, and lack meaningful end points.”
Like affirmative action, DEI has been skewered by individuals who do not believe in its mission. What is rarely discussed is how DEI advocates are often bamboozled by ed-tech rhetoric into adopting tools and platforms that undermine the mission of DEI. The biggest ed-tech platforms and companies claim that their products adhere to DEI principles, but in practice those products counter the mission of DEI.
Captive Audience
Today, students and teachers are monitored, and monitor each other, through a complex set of surveillance tools built into common classroom software and hardware such as Turnitin, ClassDojo, Illuminate Education, G Suite for Education, Chromebooks, and Apple tablets. These tools enable technology managers, law enforcement, teachers, students, and families to monitor classrooms, school libraries, and reading lists, on top of the listening already done by personal devices.
Rather than enhance education, these tools undermine the autonomy of students, teachers, and families and reduce them to data repositories to be mined by Big Tech corporations. Big Tech’s economic viability rests on tracking and surveilling users, then selling that data and its analysis to predict and modify human behavior. Entering classrooms, especially the classrooms of minors, grants unprecedented access to precious data. Despite this invasiveness, the practice is perfectly legal, thanks to 2012 changes to the federal student privacy law, the Family Educational Rights and Privacy Act of 1974 (FERPA), that allowed ed-tech companies to access the private information of enrolled students.
Because compulsory education makes the vast majority of young people in the United States a captive audience, the changes to FERPA transformed schools into a testing ground for new surveillance technologies. Often introduced under the guise of safety, surveillance technologies collect copious amounts of data beyond what might be needed for educational purposes. For example, Bark, a product specifically designed to monitor students’ communications, can read all student data, including emails, web searches, and social media posts made on their school-issued and personal devices.
In their pursuit of profit and access to data, ed-tech companies undermine equity, the campus commitment to give every student the particular support they need to succeed. Due to algorithmic bias, the unfair and discriminatory outcomes that result when bias is coded into algorithms, ed-tech companies produce inequitable results for historically marginalized communities. For example, research has documented algorithmic bias in ed-tech, such as admissions platforms incorrectly concluding that students of color and students with disabilities are more prone to criminality, or diagnosing LGBTQ+ students with mental health problems. It is also worth noting that school surveillance itself is inequitable: poorer students’ economic circumstances force them to depend on school-issued digital devices and platforms, while wealthier students can skirt school surveillance by purchasing personal devices.
Undermining Inclusivity
In addition to being inequitable, ed-tech tools often undermine inclusivity. For example, school-issued devices can and do alert campuses to student web searches about sexuality and, in the process, have outed students. In doing so, schools close down one of the few spaces that could have been inclusive for these students to explore their identities. Similarly, when surveillance is a prerequisite for education, students whose immigration status is in question face the additional challenge of protecting their place of residence, along with any relatives whose status may also be contested.
Despite their rhetoric, ed-tech companies seem uninterested in promoting diversity. For example, Proctorio, a browser extension used in remote learning to scan the room with facial and gaze detection and determine whether a student is cheating, does not appear to have been designed to account for students with disabilities. Indeed, there have been cases in which the program scanned a student with a disability and inaccurately accused them of cheating. That discriminatory accusation creates an extra burden for students with disabilities, who must not only complete their education but also clear their name of an offense they did not commit. Relatedly, school districts have used algorithms in an effort to diversify their student bodies, yet research has revealed that algorithmic biases in these platforms instead promote homogeneity in schools, especially along lines of class and race.
As critical scholars, we argue that it is imperative to analyze, assess, and evaluate ed-tech tools and acknowledge their complexity. We do not aim to eradicate digital technologies from schools. However, the research is clear: Ed-tech, in its current form, does not support DEI. As a result, in addition to combating anti-DEI efforts, DEI advocates must reflect on how their use of and support for ed-tech contribute to anti-DEI outcomes.
Allison Butler is a Senior Lecturer, Director of Undergraduate Advising, and Director of the Media Literacy Certificate Program in the Department of Communication at the University of Massachusetts Amherst, where she teaches courses on critical media literacy and representations of education in the media. She is a contributor to The Media And Me: A Guide To Critical Media Literacy For Young People (2022) and co-author, with Nolan Higdon, of Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge, 2024).
Nolan Higdon is a founding member of the Critical Media Literacy Conference of the Americas, a Project Censored national judge, an author, and a university lecturer at Merrill College and in the Education Department at the University of California, Santa Cruz. He is a contributor to The Media And Me: A Guide To Critical Media Literacy For Young People (2022) and co-author, with Allison Butler, of Surveillance Education: Navigating the Conspicuous Absence of Privacy in Schools (Routledge, 2024).