by Dr. Amy Collier, Associate Provost for Digital Learning, DLINQ

This year, Oxford Languages named “infodemic” as one of its 2020 words of the year (breaking precedent, Oxford Languages chose more than 20 words of the year, noting that “2020 is not a year that could neatly be accommodated in one single ‘word of the year’”). Infodemic, a term combining the words information and epidemic, describes “a rapid and far-reaching spread of both accurate and inaccurate information about something, such as a disease. As facts, rumors, and fears mix and disperse, it becomes difficult to learn essential information about an issue” (Merriam-Webster, 2020).

In a global pandemic, an infodemic creates a dangerous situation: people need reliable, accurate information fast in order to make life-saving decisions, but they often turn to rumors, unverified claims, or even disinformation to alleviate their anxieties or justify their actions. When we turn to mis/disinformation during a pandemic, people die. Mis/disinformation that undermines safety measures to curb the spread of COVID, such as anti-masking and anti-physical-distancing mis/disinformation, and that undermines trust in the COVID vaccine, has led to and will lead to preventable deaths and long-term health impacts for survivors.

We see mis/disinformation daily, whether we realize it or not, and it’s often spread by well-meaning folks (or, at least, folks who are not intentionally part of massive disinformation campaigns aimed at harming us) via digital platforms. Digital platforms like Facebook and Twitter make it easy to spread mis/disinformation, especially when folks don’t have low-friction strategies for deciding what to trust and what not to trust on the web or social media.

My colleague and dear long-time friend, Mike Caulfield, developed an excellent set of strategies to help us deal with information that we’re unsure about. What I love most about Mike’s approach is that it is built on understanding how the web works and how limited our attention is: it’s not a 6-month course on digital literacy, but a set of practical, easy-to-use strategies to help us better contextualize what we see on the web, and to do so quickly, before we click the Share button. The video below shows Mike’s introduction to his approach.

Mike’s strategies have a helpful acronym, SIFT, which stands for:

S – STOP. This is a reminder to check our emotions. Stop, reflect, and verify the information before you share. In a pandemic, this could be a life-saving decision.

I – Investigate the source. Here you want to see what reliable sources say about the source–not what the source says about itself. Professional fact-checkers use a technique called lateral reading to do this.

F – Find better coverage. Chances are, if this is a real news story, other reliable news sources will also have coverage of it. Or the fact-checking arms of news organizations, or dedicated fact-checking organizations, will have covered the story and provided information and evidence to support or debunk it.

T – Trace media and quotes back to their original context and source. Images, quotes, even videos are often clipped, altered, or false-framed to change how people understand them. Try to get as close to the original context and source as possible.

I could walk through each of the SIFT strategies here, but instead I’ll share Mike’s very clear explanations on his Infodemic site, which includes quick and easy habits (like Control + F) to help you verify information you see on the web. Check out: https://infodemic.blog/

These small and easy-to-learn habits have changed how I interact on the web. They have changed what I share and how I share information via social media. And they have shown me how, with small changes to what I do every day, I can be a critical, but not cynical, consumer of information on the web. We need more of this: the events at the Capitol last week underscore how dangerous mis/disinformation can be, not just to our health and safety, but to a functioning civil society. And they showed how easy it is to participate in the spreading of false information at a time of crisis.

You might be saying as you read this, “I get that mis/disinformation is a problem, but why is it a ‘digital equity and inclusion’ problem?” There are many ways to answer this question, but I’ll highlight just one. BIPOC communities and low-income communities are being ravaged by the COVID-19 pandemic. The spread of digital mis/disinformation exacerbates this problem by deepening mistrust of medical professionals and treatments among these groups, who may already be skeptical due to a “long history of medical mistreatment and ongoing lived experiences of systemic racism in health care today” (see also this Fortune article; similarly, we see election disinformation targeting communities of color to discourage them from voting). And ultimately, as mis/disinformation spreads and makes its way to people in power, policy- and decision-makers, their decisions about how to handle the pandemic (or decisions to NOT handle the pandemic) have a direct impact on their constituents, with a disproportionate impact on folks in marginalized communities.

We can all take part in efforts to address the toll COVID-19 is taking on marginalized communities (see the work being done by the Black Coalition Against COVID-19 below), but let’s first make sure that we are not contributing to the toxic mis/disinformation problem that is exacerbating those impacts. Start using the SIFT strategies today!

Take Action

Keep reading!

Health vs. hoax: Immunize yourself against health misinformation online, Tactical Tech’s Data Detox Kit

6 tips to steer clear of misinformation online, Tactical Tech’s Data Detox Kit

How to share the science of COVID-19 in uncertain times, Citizens and Tech

Unmasking misinformation, Chris Coward TED talk

Did you miss our 2020 Digital Detox articles? View them on our Digital Detox site

Next article: COVID and equitable access to education

Featured photo by Obi Onyeador on Unsplash