by Dr. Sarah Payne, Instructional Designer, DLINQ

By now many of us are familiar with the ways the coronavirus pandemic has affected teaching and learning, for better and for worse. One of the most egregious trends to emerge from pandemic pedagogy is the use and expansion of online proctoring software. Companies such as Proctorio, HonorLock, and ProctorU use artificial intelligence, remote proctors, or both to surveil students while they take exams. The software requires students to enable their webcams and microphones during the exam so that any suspicious behavior can be flagged as potential cheating. Incriminating actions can include looking away from the computer too frequently, leaving the room, or the presence of another person. Instructors can then review flagged students to determine whether cheating occurred.

Though levels of academic dishonesty in online courses are often similar to those in their in-person counterparts, that’s not what these ed tech companies would have you believe. As a self-described “learning integrity platform,” Proctorio claims to “ensure the integrity and value of distance learning and online certifications and degrees.” Similarly, ProctorU offers to “help you deter, detect and prevent cheating, authenticate identities, and protect your exam content.” HonorLock “prioritize[s] academic integrity and [is] continually innovating to hinder cheating.” The integrity of your exams and the credibility of online learning are at stake, and they’re here to help. You wouldn’t want to hand out a “corona diploma” that’s worthless to employers, would you?

Academic integrity is a hallmark of higher education, and at first glance these software companies may not seem that objectionable. However, as scholars such as Ruha Benjamin, Safiya Noble, and others have noted, technology and algorithms are rife with bias and can enable exploitation and abuse. In order for the software to flag suspicious behavior, it must be taught what “normal” behavior looks like. Deviations from that baseline are then coded as suspicious. Of course, “suspicious” is not a neutral term, but one built on implicit and explicit biases. It should come as no surprise, then, that what the software recognizes as normal is often white, able-bodied, neurotypical, and cisgender. Other bodies and identities can register as aberrations.

Because proctoring software relies on a “normal” body, there are numerous opportunities to further marginalize students. Neurodivergent students or those with anxiety or ADHD might have difficulty sitting still for extended periods of time or limiting their eye movements so as not to seem suspicious. Scholar Joy Buolamwini documents how frequently AI fails to detect the faces of Black, Indigenous, and people of color, and proctoring software is no exception. Students with darker skin report that the software asks them to find better lighting, even when they are already well lit, in order to detect their faces. Given that students are often asked to show an official form of identification before beginning the exam, transgender or nonbinary students may encounter issues if their gender presentation differs from the one on their ID. Women, who have assumed a disproportionate amount of childcare responsibilities during the pandemic, may not have uninterrupted hours to devote to an exam. Across the board, students are dealing with a variety of living situations and may not have a private room free of distractions. While some may argue that students in these circumstances could just talk to their instructors, students may not want to divulge private information such as their medical status or home environment.

Another issue regarding student privacy is that instructors have access to the exam recordings. A video of a student in their home that potentially exposes private information opens the door for exploitation and abuse. Unfortunately, there are numerous examples of instructors stalking and harassing their students. Proctoring software also solicits a range of student data that doesn’t always remain confidential. A petition by students at the University of Texas at Dallas notes that HonorLock can “collect your face, driver’s license, and network information.” Given that TurnItIn, another ed tech company in the business of policing students, has engaged in dubious practices regarding student data, we would be right to question the data collection practices of proctoring software. To make matters worse, the CEO of Proctorio, Mike Olsen, came under fire last year when he posted a student’s private chat log on Reddit during an online argument.

Students across the country have initiated petitions protesting the use of proctoring software, and some schools have banned such software altogether. Despite how much Proctorio prizes accountability, however, the company lashes out at anyone who dares to hold it accountable. When Ian Linkletter, a learning technologist at the University of British Columbia, tweeted about Proctorio’s potential for abuse, Proctorio responded by suing him for copyright infringement and distribution of confidential materials. Linkletter organized a GoFundMe to help cover his legal fees, and colleagues organized a teach-in against surveillance. Proctorio has engaged in similar Twitter feuds with others who criticize its product.

While there are many, many problems with online proctoring software, we must also cast a critical eye on the pedagogical beliefs that encourage its use. If we view students as adversaries bent on cheating the system, then of course we’d want to surveil them to catch their inevitable dishonesty. But students aren’t our adversaries, and they deserve compassion, now more than ever. Rather than engaging in punitive pedagogy, we might instead start from a place of trust and care for our students. We can listen to them when they express confusion, frustration, or distress about educational technologies, and support them by creating meaningful assessments that don’t cause harm. Though proctoring software companies cast student assessment during the pandemic as a technical problem in need of a technical solution, it is instead a pedagogical question that demands a pedagogical answer. Many smart minds have already addressed how we can rethink the use of proctoring software; Audrey Watters, Shea Swauger, and Charles Logan are just a few who come to mind. Many of the action items below are taken from their work, and I’m grateful for their expertise.

Take Action

  • If you don’t have to use proctoring software, then don’t.
  • If your institution requires you to use proctoring software, educate your students, as well as your colleagues, about the tool and its risks.
  • Advocate for students if they’re uncomfortable using this technology. Allow them to opt out if possible. Support students if they decide to organize against the technology.
  • Offer more frequent, low-stakes assignments rather than a couple of exams that account for a significant portion of the final grade. Quizzes, discussion forums, and reflective assignments are potential options.
  • Design assessments that ask students to draw from personal experience, to create an artifact, or to apply class material to real-world situations.

Keep reading!

Tea for Teaching: Remote Proctoring, podcast featuring Jessamyn Neuhaus and John Locke
Algorithms of Oppression, by Safiya Noble
Race After Technology, by Ruha Benjamin