by Dr. Amy Collier, Associate Provost for Digital Learning

As the pandemic surged, we turned to digital platforms. Tech companies like Zoom, Amazon, Microsoft, and Facebook saw their profits grow as we turned to their platforms for work and academic continuity, social connections, shopping, and other services. In the education sector, learning management systems like Canvas and Blackboard saw their user bases explode, and ed tech startups raised a record-breaking $2.2 billion in venture and private equity capital. Meanwhile, school systems and universities slashed budgets and laid off staff to stay afloat financially.

My concern is not just that we shifted our resources en masse to technology platforms, though yes, I have huge concerns about that. Educational institutions spent large amounts of money purchasing new software or extending existing contracts in the name of “academic continuity”–but the continuity of what, exactly? Platforms like Zoom, originally designed for workplaces, re-inscribed teacher-centered models of education and inequitable approaches to learning. Though the narrative we tell about digital platforms’ role in education is one of promise and innovation, what Audrey Watters calls the “ed tech imaginary,” the reality is that the use of these technologies, particularly during the pandemic, has further exacerbated inequities and reinforced problematic pedagogical practices. As Watters writes, “ed-tech isn’t necessarily progressive pedagogically or politically… much of ed-tech is built on behaviorism, and its most famous advocate, you’ll recall, B. F. Skinner, famously did not believe in freedom. When it’s built to serve oppressive pedagogies and discriminatory institutions — when it’s built with a belief that students shouldn’t have agency but rather should be engineered and optimized, then ed-tech…is not the answer.”

Beyond the ways in which platforms shape our pedagogy, I have concerns about the longer-term impact of our increased reliance on digital platforms, as we give these extractive and even exploitative technologies more access to our daily lives and to our educational processes. Platforms profit from our data–data that are collected when we interact with the platform or with others on the platform, or data that platforms collect about us from other sources (our purchasing records, voting records, and other activity records). According to Nick Srnicek, author of Platform Capitalism, “suppression of privacy is at the heart of [platforms’] business models” (see also Shoshana Zuboff’s scholarship on Surveillance Capitalism; for a poignant look into the interplay of platform capitalism and Silicon Valley’s tech companies, check out Anna Wiener’s Uncanny Valley).

Citing Srnicek’s scholarship on platforms, Martin Henry writes that “platform capitalism has significant internal fissures that inevitably lead to a disembodying of student and teacher and a tendency for the university to move away from their role as nurturers of student learning and growth and guardians of knowledge, towards a position where students and teachers are seen within the data gathered around them as crops to be harvested.” Rather than resisting this treatment of students’ and teachers’ data, schools often justify extractive platforms for the purposes of assessment (e.g., learning analytics), community (e.g., social media communities), academic integrity (e.g., plagiarism software, online proctoring), security (e.g., social media monitoring), and more — intentionally or unintentionally buying into the suppression of privacy.

Contrary to the popular adage that kids these days don’t care about privacy, young people are rightfully concerned about how platforms’ extraction/use of their data impacts them, including (maybe especially) platforms they encounter in their schools. In his excellent article for the Middlebury Campus, Jake Gaughan denounces the College’s adoption of Facebook Campus in the fall of 2020, noting that the College’s use of the platform to connect students points Facebook’s exploitative data practices directly at Middlebury students. He adds:

@middlebury added a claim that “[Facebook Campus] is 100% designed for students” before linking to marketing material from fb.com. This is a lie. Facebook Campus is no more designed for students than an oil rig is designed for the Earth. Facebook needs our user data to profit, and they are willing to go to great lengths to extract it.

Now is the time to re-evaluate our individual and institutional relationships with digital platforms and to demand more engagement with these issues at the federal and state policy level. The United States is the only member of the Organisation for Economic Co-operation and Development (OECD) that does not have a Data Protection Agency, and the US Department of Education provides little guidance on student data privacy (most of that limited guidance focuses on FERPA compliance), leaving policy decision making in the hands of local schools and institutions of higher education. Now is the time to create a culture of privacy that cares for student data and educates students about privacy, and to recognize that technologies justified as a response to the pandemic need to be reconsidered in light of significant concerns about student privacy. As we re-evaluate platforms in light of privacy concerns, we also have an opportunity to think carefully about how these technologies support – or get in the way of – student-centered learning.

Take Action

Personal Action

Broader Action

  • Advocate for your local schools and universities to center student privacy when making decisions about platforms. Build on the guidance provided by the Future of Privacy Forum’s Student Privacy Pledge 2020 and the Student Data Privacy Consortium’s model student data privacy agreement to help shape schools’ approaches to student data privacy.
  • Learn about platform cooperativism, and platforms designed with an orientation toward co-ownership and justice rather than extractive profit (for example, the Contratados platform).
  • Learn about and advocate for policies that limit the exploitative practices of platforms.
  • Teach students about platforms and their extractive data practices. A good place to start is Chris Gilliard’s Pedagogy and the Logic of Platforms.


Keep Reading

Anything by Audrey Watters at HackEducation

Anything by Chris Gilliard at Hypervisible

Privacy for Students, by EFF

Facebook Campus is not Middlebury’s friend, by Jake Gaughan

Student Privacy Compass

Also, check out our sister Digital Detox at Thompson Rivers University and Brenna Gray’s most recent post on a specific platform used in education: Habits, Data, and Things That Go Bump in the Night: Microsoft for Education.

Photo by Jason Dent on Unsplash