
Digital Detox 8: Ongoing reflections and readings

by Amy Collier, Associate Provost for Digital Learning

Throughout this series, we have offered tips and activities to detoxify your digital life. For many of us, detoxifying includes adopting more privacy protections to feel safer (and, we hope, be safer) when we engage online. Others might focus on developing daily discernment strategies that help us pick the best tools to engage online, and to approach those tools mindfully and intentionally. Whatever our goals are, detoxes like this, along with more critical and mindful behaviors going forward, can bring more balance and peace into our lives, helping us make space for quiet, for truly private moments, for creativity, for gratitude, for joy, and more.

Digital platforms and interactions cater to what Daniel Kahneman calls System 1 thinking, which “operates automatically and quickly, with little or no effort and no sense of voluntary control” (Thinking, Fast and Slow, p. 20). Compare that to System 2 thinking, which “allocates attention to the effortful mental activities that demand it…The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration” (ibid). Both systems are essential parts of our cognition, but we often prefer to rely on the ease and comfort of System 1 thinking. And much of what we see in digital information and social spaces appeals to our emotions and, consequently, to our System 1 (these spaces are actually designed to keep our System 1’s attention). Writing about social media, David Golumbia argues that it “too easily bypasses the rational or at least reasonable parts of our minds, on which a democratic public sphere depends. It speaks instead to the emotional, reactive, quick-fix parts of us, that are satisfied by images and clicks that look pleasing, that feed our egos, and that make us think we are heroic. But too often these feelings come at the expense of the deep thinking, planning, and interaction that democratic politics are built from.”

If you are starting to believe that social media and other web platforms create problems for free (as in freedom) discourse and action, you’re correct. By focusing on our System 1 and meddling with our trust and attention, digital platforms create real interference in our public spheres. For more on this, read Zeynep Tufekci’s fantastic article, It’s the (Democracy-Poisoning) Golden Age of Free Speech.

On a personal level, the push/nudge/draw of social media and the web has significant impacts. But what happens if we begin to make space for System 2, for more deliberate thought, reflection, and awareness? Can we introduce more System 2 thinking into our digital lives by intentionally managing, or at times stepping away from, the digital? By applying a more critical lens to how we engage online?

Check out this lovely TEDx talk that encourages a bit more System 2 thinking. I’m planning to be more intentional about slowing down, noticing, and daydreaming. I want to rediscover what TRULY private moments feel like. For what will you make time? How will you engage your System 2?

Throughout this detox, I have enjoyed learning alongside you. As we researched each email, the DLINQ team found new resources and new conversations to have about our digital lives. The conversations in the margins of the Detox have been immensely rewarding and educational.

I want to share some additional resources that I plan to dig into over the next few months. I hope they are helpful to you as well.

Me and My Shadow: This is my favorite resource from the series. It is chock-full of advice and strategies to help us be safer in the digital world. If you have been wondering throughout this series whether you really need a digital detox, or if you have skeptics around you, check out their article Tracking…So What? 7 Things We Know You Are Going to Say.

Unlink hate: Did you know your class materials (syllabus & website) could be promoting hateful and disinforming websites? If you link to those sites, even for educational purposes, you lend the credibility and reputation of your site and institution to those terrible sites. This article promotes unlinking hate.

Security planner: I really love this security assistance tool. It is well designed and has just enough interactivity to keep me going back for more recommendations.

How to spot a bot: Bots are a huge issue in social media. I became aware of their influence when a colleague, Kris Shaffer, became a target of bot harassment. That led me to his post, How to spot a bot (written with the excellent Bill Fitzgerald), about how bots are spreading massive amounts of disinformation on the web. Everyone should read this. If you teach with Wikipedia (i.e., by having students edit content on Wikipedia), read this article about bot wars on Wikipedia.

Encrypt: Encryption is an area where I have not developed much aptitude, but I’m hoping to change that. This Consumer Reports guide is a helpful introduction (also this from the Electronic Frontier Foundation), and I want to explore how to bring better encryption to my digital interactions.

Privacy evaluations: Many of us are on the lookout for resources to help us be more discerning about platforms. This resource from Common Sense Media is a good start.

Teaching: We found some great teaching resources. My favorites are Surveillance Self-Defense and the Security Education Curriculum, both created by the Electronic Frontier Foundation. I also love Me and My Shadow’s training curriculum. And my dear colleague Mike Caulfield has an excellent Four Moves blog that provides fact-checking activities for students.

If you want to do more, I encourage you to join our emergent Information Environmentalism movement. We are exploring curricular and co-curricular ways to de-pollute digital information environments, in a variety of disciplinary contexts, cultural contexts, and languages. Contact Amy Collier to discuss what your class, student group, or community group can do.

Keep reading!

Books I have read and recommend:
The Black Box Society by Frank Pasquale (2015)

Weapons of Math Destruction by Cathy O’Neil (2016)

World Without Mind: The Existential Threat of Big Tech by Franklin Foer (2017)

Platform Capitalism by Nick Srnicek (2016)

Books I plan to read:
Automating Inequality by Virginia Eubanks (2018)

Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble (anticipated publication Feb. 20, 2018). While you’re waiting for Safiya’s book, read this interview with her and Sarah T. Roberts, titled Engine Failure.

The Internet of Us by Michael Patrick Lynch (2016)

What did you take away from the Detox? What’s next? Tell us!
