Misinformation experts discussed social media, algorithms and artificial intelligence at a Tuesday panel hosted by The Information Futures Lab.
Titled “Everything We Know (And Don’t Know) About Tackling Rumors and Conspiracies,” the panel was moderated by Claire Wardle, a co-director of the IFL and a professor of the practice of health services, policy and practice.
Despite its societal impact, research on media misinformation remains “a young field,” according to Stefanie Friedhoff, another co-director of the IFL and an associate professor of the practice of health services, policy and practice.
After working as a senior policy advisor on the White House COVID-19 Response Team, Friedhoff contributed to a literature review on pandemic misinformation interventions, a topic she discussed at the panel.
“We’re significantly understudying this,” Friedhoff said, citing a lack of longitudinal research on non-American and video-based misinformation. “We don’t have a lot of useful evidence to apply in the field, and we need to work on that.”
Evelyn Pérez-Verdia, founder of We Are Más, a strategic consulting firm, spoke about her work to combat misinformation at the panel. She aims to empower Spanish-speaking diasporas in South Florida through community-based trust-building. Recently, she has worked with the IFL as a fellow to conduct a survey of information needs in Florida.
According to Pérez-Verdia, non-English-speaking and immigrant communities are prone to misinformation because of language and cultural barriers. When people are offered accessible resources, she argues, communities become empowered and less susceptible to misinformation. “People are hungry for information,” she said.
Abbie Richards, another panelist and senior video producer at Media Matters for America, a watchdog journalism organization, identified social media algorithms as an exacerbating factor. In a video shown during the panel, Richards highlighted the proliferation of misleading or inaccurate content on platforms like TikTok. As a video producer, she looks to distill research and discourse on this topic for “audiences who wouldn’t necessarily read research papers,” she said.
Richards has researched AI-generated content on social media, which is often designed to take advantage of the various platforms’ monetization policies. “There’s a monetization aspect behind this content,” Richards elaborated.
Algorithms are “designed to show (users) what they want to see and what they’ll engage with,” she said. When viewers “feel disempowered … it makes it really easy to gravitate towards misinformation.”
When discussing AI-generated misinformation that is designed to be entertaining, Friedhoff noted that only “some of us have the luxury to laugh” at misinformation.
“But from the perspective of somebody behind the paywall, who doesn’t necessarily speak English,” factual information becomes increasingly difficult to access, she added. She described these gaps as “misinformation inequities,” which all of the speakers acknowledged exist in their own projects.
In an interview with The Herald, Friedhoff and Wardle emphasized how the “online information ecosystem” connects different types of misinformation. Vaccine skepticism, Wardle said, is a slippery slope towards climate change denial: “We have to understand as researchers and practitioners that we can’t think in silos.”
Many of the speakers agreed that misinformation spreads in part because people tend to prioritize relationships — both real-life and parasocial — over facts. “There’s nothing more powerful than someone you trust and close to you,” Pérez-Verdia said.
Richards said emotional literacy is the backbone of navigating both AI and misinformation. This includes “teaching people how to recognize (confirmation bias) within themselves” and understanding common misinformation techniques.
When asked to offer potential solutions, the speakers gave a range of responses. Richards suggested a “marketing campaign for federal agencies” to improve governmental literacy, so that all citizens can understand how the government functions. Pérez-Verdia also identified diverse and culturally conscientious government messaging as key, while Friedhoff recommended creating “community conversations” to explore perspectives rather than further polarizing them.
Audience member Benjy Renton, a research associate at the School of Public Health, said he was “inspired by” community-based approaches like Pérez-Verdia’s work: “It was great to see the diverse range of perspectives on misinformation.”
The speakers told The Herald that they found each other’s perspectives enlightening. “I’m somebody that people feel like they can go to because I’ve spent years talking about (misinformation),” Richards said in an interview with The Herald after the event. “But the idea of how you measure (trust) is fully beyond me.”
Pérez-Verdia ended the discussion by reiterating that the fight against misinformation is founded on teamwork: “When you look at all of these pieces, the women here, a collaboration where we all have our individual gifts… that’s exactly what needs to be done on a larger spectrum.”
Megan is a Senior Staff Writer covering community and activism in Providence. Born and raised in Hong Kong, she spends her free time drinking coffee and wishing she was Meg Ryan in a Nora Ephron movie.