Instagram Is Pushing Anti-Vaccine Misinformation and QAnon Content, Study Finds


In the past year, social platforms like Facebook and Twitter have made highly public efforts to curb the proliferation of COVID-19 misinformation and election-fraud conspiracy theories. Yet a new study from the Center for Countering Digital Hate, which tracks the spread of misinformation on digital platforms, suggests that at least one platform, Instagram, has failed in those efforts.

According to Imran Ahmed, CEO of the U.K.-based organization, the study was prompted by Instagram’s August 2020 rollout of a new feature called “suggested posts,” which appeared when users reached the bottom of their feeds. The study’s authors found it curious that the platform would introduce such a feature in the midst of the pandemic, when COVID-19 misinformation was spreading across social media. They were particularly interested in Instagram, which Ahmed calls the “fastest-growing” platform for vaccine misinformation, “driven by a new wave of influencers who’ve come along in the anti-vaxx space driven in part by the opportunity COVID-19 has presented,” he says.

To conduct the study, which took place from September to November, its authors set up 15 Instagram profiles. Some of the profiles followed anti-vaccine accounts, wellness influencers, and accounts peddling QAnon, the baseless conspiracy theory positing that Donald Trump is lying in wait to bust a cabal of child-trafficking public figures (previous research and reporting from Rolling Stone have documented the overlap between wellness culture and conspiracy theorist communities). Some of the profiles followed only verified health authorities such as the World Health Organization, while the final set of accounts followed a mix of organizations like the WHO as well as QAnon and anti-vaccine conspiracy theorists.

The researchers found that the profiles that followed only health authorities were not recommended misinformation in their suggested posts, in their suggested accounts to follow, or in Instagram’s Explore tab, which surfaces new content to users. But that was not the case for the profiles that followed anti-vaccine accounts, wellness accounts with anti-vaccine crossover, and/or QAnon content. Notably, the researchers determined that Instagram’s algorithm yielded a cross-pollination effect: profiles that followed anti-vaccine accounts, for instance, would be recommended QAnon content or anti-Semitic conspiracy theories.

“What the algorithm recognizes is that if you believe in one conspiracy theory, you are highly susceptible to more. We know this from the social psychology of conspiracism: it is driven by this deep feeling of uncertainty, but it doesn’t satiate that uncertainty so people tend to go down rabbit holes,” Ahmed explains. “This is the algorithm encouraging people down multiple rabbit holes so they enter a warren of misinformation.”

Ahmed ties these results directly to the events of the January 6th insurrection, which represented a “convergence” of members from various extremist fringe groups. “On January 6th, you had white supremacists, anti-vaxxers, and Roger Stone on the same stage. What on earth links them?” he says. “What we see now is the algorithm is driving that convergence of radical worldviews, and that’s really powerful.”

Facebook, which owns Instagram, did not immediately respond to Rolling Stone’s request for comment on the study. But in a statement to NPR, a Facebook representative took issue with the authors’ methodology, calling it “five months out of date and [using] an extremely small sample size of just 104 posts,” and saying it failed to take into account “the 12 million pieces of harmful misinformation related to vaccines and COVID-19 we’ve removed from Facebook and Instagram since the start of the pandemic.”

Ahmed disputes this critique, pointing out that the vast majority of the anti-vaccine and conspiracy-theorist accounts in the study are still thriving on the platform. In response to the study’s results and those of others analyzing the effects of misinformation on social media, he says, Instagram and Facebook should remove prominent anti-vaccine influencers from their platforms (one of the most prominent anti-vaxxers, Robert F. Kennedy Jr., was banned from Instagram last month, but, as Ahmed points out, he has not yet been removed from Facebook). The platforms should also adopt more “algorithmic transparency,” particularly for legislators concerned about the spread of misinformation, he says. But he is not optimistic this will happen. At the end of the day, he says, “these are immoral companies addicted to the profits from malignant information and malignant actors.”
