3 ways you can push back against election misinformation

Election Day 2024 is quickly approaching, and misinformation about candidates, issues and election processes poses a major problem, one recognized by state election boards, elected officials and advocates alike.

As a researcher of collaborative and community technology governance, I am, unfortunately, here to tell you that platforms and politicians won’t save us from election misinformation. But the good news is that the American public is not helpless. There are simple steps we can all take this fall to contribute to a healthier online election dialogue.

In 2016, misinformation was highly visible, with an estimated 6 percent of all news consumed during that presidential election cycle found to be misinformation. In 2020, the problems posed by misinformation seemed even worse, likely exacerbated by the COVID-19 pandemic and four years of unchecked alternative facts in political speech.

In 2024, misinformation is still all around us, from deepfakes to hateful anti-immigrant fiction. Based on casual observation, the problem appears worse than ever; research over the next few years will give us a better sense of whether that is true or merely an impression created by such prolonged exposure.

Misinformation problems are expansive and systemic, yet over the last eight years regulators have proven themselves too slow and platforms unwilling to act, despite hundreds of published studies offering governance solutions. There are many ways these systems could intervene.

For example, platforms could implement more meaningful content moderation or professional fact-checking. Recommendation systems that organize users’ feeds could also be adjusted to surface diverse perspectives and counter echo chambers. Platforms could likewise apply lessons from their practices in other countries; simply limiting how often content can be forwarded has reduced the spread of election misinformation in India and Brazil.

These ideas are not new and are based on evidence, yet we are again surrounded by misinformation traffic without action to intervene.

It falls on the public to not accept the status quo and do something about misinformation. In the short term, I suggest three simple actions anyone can take.

  • Repurpose hashtags. Fandoms and celebrities often appropriate popular hashtags to suppress problematic content. This technique has played a key role in online activism and provides an easy way to bring healthy political speech to ongoing conversations that might be dominated by misinformation. From the BeyHive to Swifties, this approach has effectively limited the spread and visibility of deepfakes and misinformation. George Takei himself led a group of LGBTQ activists using #proudboys to obscure hateful and false content, reappropriating the label.
  • Flag problematic content. Misinformation and harmful content are more likely to be investigated by human moderators if they are reported, so take a few seconds to let a platform know when you encounter misinformation. Simple guides exist for most major platforms, including Facebook, Instagram, TikTok and YouTube. Notably, X no longer supports this kind of reporting, further evidence that platforms aren’t going to solve the problem on their own.
  • Do not share links to articles that you didn’t read. Research shows that the spread of misinformation is largely shaped by forwarding habits that prioritize activity and audience engagement over content. People can disrupt the spread of misinformation by ensuring they are not unwitting accomplices.

These simple recommendations stem from my research exploring how communities govern misinformation in everyday environments and the research of many others. Misinformation is prevalent in many aspects of life — not only politics. 

In my research on misinformation governance on Instagram, I’ve explored how people perceive and manage misinformation around online dating, food and brand influencers. Even when the platform itself doesn’t act, online subcommunities find ways to reduce the impact and spread of harmful misinformation. Collective action can work with the recommendation systems and platform moderation structures in place to overcome a few loud voices.

To be sure, much more needs to be done in the long term, but it is important for everyone to act as individuals now. I challenge each of you to engage in healthier political discourse.

Madelyn Rose Sanfilippo is an assistant professor in the School of Information Sciences, a public voices fellow and a member of the 24-25 OpEd Alumni Project sponsored by the University of Illinois. She is co-editor of “Governing Misinformation in Everyday Knowledge Commons.”

