Misinformation is raging. Here’s how we fight back

David Rapp provides guidance in an election year filled with deepfakes, bad actors and the hazards of AI
Between the ubiquity of social media, rising concerns about artificial intelligence and deepfakes, the decline of traditional media and a growing number of bad actors, more people are in danger of consuming and being influenced by misinformation at a critical time for democracy. So much so that the World Health Organization has dubbed this moment an “Infodemic,” comparing the rampant spread of false information to that of a virus amid the COVID-19 pandemic.

Northwestern Now spoke with David Rapp, Walter Dill Scott Professor of Psychology and Learning Sciences at the School of Education and Social Policy, who has long researched how people acquire accurate and inaccurate information. He offered guidance on what citizens can do to avoid falling victim to misinformation in the midst of election season.

The use of AI and deepfakes is becoming ubiquitous, from images of Taylor Swift holding Trump memorabilia to robocalls that mimic candidates’ voices. How will this affect the 2024 election?

Rapp: We’ll see much more of it. My hope is that the public outcry will create an educational space where people become more savvy, aware and sophisticated in looking at information to determine whether it's real or fake.

How can people become more savvy?

When you ask a person how confident they are in an idea — how much they really believe it — they oftentimes are not particularly confident in it. That offers an opportunity for confronting, challenging and refuting those ideas. Confidence is an element people haven't studied or thought enough about with respect to misinformation. It offers an entryway for fighting people's misinformed beliefs.

Is that how we become more effective in these conversations?

We don't want to embarrass people when we have these conversations or put them in a place where they feel defensive, because that's only going to lead to arguments. If people start to question themselves on their own, then there is a chance they will evolve their opinion. People who accept they are not confident about a subject will look online for sources, read books, talk to experts and ask for help.

How do we get someone who thinks they are impervious to misinformation to then question their relationship with information online?

At the very least, people need to think about what they’re posting, liking and sharing. People also need to have a willingness to contemplate a variety of sources they might not usually look at to think about what information is being presented.

As in, don’t just look at things that confirm your own bias?

Right. Evaluate who the source is, where they are getting their information from and what evidence is being presented to support those claims. When sources are hidden, we can't verify where the information is coming from. Information coming from the U.S. Census Bureau is substantially different from information coming from your grandma.

Why do you think it’s so hard to get people to change their mind when presented with new evidence?

When you ask a person to confront religion or politics, you're not asking them to confront some idea out there in the world, but confront their identity. That's hard for people. It’s difficult because their identities are part of being in that community. It feels like an indictment on them as a person.

What responsibility do the social media platforms have in this, if any?

Responsibility is a tricky word. Ethically, I'd love for these platforms to feel a sense of accountability. At the very least, acknowledge when information on their platforms is egregiously incorrect. That would be powerful. Will they do it? Will it have an impact? Who knows.