
Digital Detox 2.6: Deconstructing Privacy

[Image: photo of a peephole on a gray door]
[This post was originally sent as an email newsletter to our Digital Detox participants.]

Written by Amy Slay, DLINQ Instructional Designer

Privacy: the state of being alone, or the right to keep one’s personal matters and relationships secret. (Cambridge Dictionary)

The relationship between the outcome of the 2016 US presidential election and Silicon Valley's predatory data practices has elevated privacy concerns in the public sphere. 2018 bore witness to high-level tech executives testifying before Congress about the role their platforms have played in destabilizing democracy. In terms of policy and legislation, the US federal government falls woefully short of valuing and protecting the privacy of its citizens compared to other wealthy nations. In contrast with the data protection laws implemented in the EU last year, privacy in the US falls largely under the Fourth Amendment, which “protects people from unreasonable searches and seizures by the government.” That protection is construed narrowly, however: it typically covers communication between two parties, and once a third party (including an internet platform) is introduced, you may lose your right to privacy under the Third Party Doctrine.

These events have resulted in a certain level of consciousness-raising around the dangers of ad-revenue-driven platforms and their terms of use and privacy policies, but the conversation should not stop there. In particular, important questions about the role of digital privacy in the lives of young people are obscured by dangerous myths such as the “digital native” narrative. There is often an assumption that millennials who grew up with the internet are self-sufficient and all set to participate in an increasingly hybrid world. The digital native fallacy is closely linked to the assumption that technology is neutral and bias-free — when in fact, technology is designed by humans (and not a diverse or representative group of humans) with particular worldviews and ideologies.

But being tech savvy and quick to adapt to new interfaces is not synonymous with critical thinking, or with questioning the power dynamics and policies at play within privately owned platforms (Facebook, Google, etc.) that are too often mistaken for a public commodity. While users can create accounts on these platforms at no direct financial cost, participating in them is by no means free. Their business model depends on selling ad space; their profit is derived from the amount of time you spend viewing ads and the number of times you click on them. The overwhelming market share of these corporate entities in an age of mass surveillance has resulted in what Zeynep Tufekci calls surveillance capitalism, a process by which proprietary algorithms — fueled by our data — push targeted ads (for everything from shoes to political mis/disinformation) to users. While some of this data is freely given, a great deal of it is not. The data collected and sold about us by platforms via pervasive web tracking typically includes:

  • given data — data that you provide/volunteer
  • extracted data — data that is taken from you without you volunteering it (your location, for example, anytime you open the Facebook app on your smartphone)
  • inferred data — assumptions that a platform makes about you based on the first two categories

Surveillance capitalism gives rise to myriad threats such as digital redlining — dangers that are not evenly distributed across our society. We tend to think of privacy as being controlled at the individual level, but we must recognize the deeply relational DNA of privacy. For example, anytime you download an app and allow it to access your contact book, that decision has implications for those contacts. What does that data point mean for your Black Lives Matter activist friend, your undocumented neighbor, your trans colleague, your sister who was a victim of stalking? In her book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Virginia Eubanks writes that:

“Most people are targeted for digital scrutiny as members of social groups, not as individuals. People of color, migrants, unpopular religious groups, sexual minorities, the poor, and other oppressed and exploited populations bear a much higher burden of monitoring and tracking than advantaged groups. Marginalized groups face higher levels of data collection when they access public benefits, walk through highly policed neighborhoods, enter the health-care system, or cross national borders. The data acts to reinforce their marginality when it is used to target them for suspicion and extra scrutiny.”

Considering privacy from a critical lens opens a door to understanding how systemic injustice and inequality are reproduced and amplified in digital spaces. When our privacy is compromised in the form of a data breach, many of us fear severe reputational and/or financial consequences. However, as Eubanks observes, “for the poor and marginal, invasions of privacy are often lethal matters. A stop-and-frisk can easily end in a police shooting. Data shared from a registry can lead to arrests or deportation. Scrutiny from a caseworker can tear a family apart.”

What can you do to protect your privacy and that of those around you? We offer some suggestions below.

Take Action

  • If you have a smartphone:
    • access services and sites via your web browser instead of installing apps (which often have very invasive Terms of Service) whenever possible
    • delete any apps that you do not use regularly; only enable the GPS function when you need it — keep it off the rest of the time (how to: iOS, Android)
    • do not give apps blanket access to your location information. You can control this at the app level (how to: iOS, Android).
  • Using privacy-oriented web tools can help you be safer on the web. But it’s not sufficient to end the story there. Conversations that put all the responsibility on the individual to protect their privacy should not be used to obscure the fact that predatory platforms must be held accountable and change their business models. Additionally, the most marginalized among us — the folks who have the greatest need for privacy-oriented tools — can paradoxically be the most endangered by them. For example, some governments treat using encryption as a jailable offense or block access to VPNs entirely.
  • Be an ally: Use your privilege to make it safer for minorities to use privacy-oriented tools. The more of us who use these tools, the smaller the risk that the people who depend on them most will be singled out.
  • Think very carefully about inviting more surveillance technology (like Alexa) into your home.

Keep Reading!

Did you miss our previous Detox articles? View them on the DLINQ blog



Sharing on social media?
Please use #DLINQdigdetox


Next newsletter: Resisting the “Average” and Including “Edge” Cases