QAnon's following may be too widespread for social media companies to regulate.
QAnon has been called one of the largest extremist networks in the United States. The conspiracy theory, which began as a claim that President Trump (and Robert Mueller) were saving the world from cannibal pedophiles, grew out of 4chan message boards. Now it has gone mainstream, reaching moms and Instagram health gurus.
The group is reportedly led by “Q,” a so-called Trump insider who makes anonymous online postings. Followers of QAnon believe many things, but their beliefs boil down to a central message: Trump controls everything. The movement has been embraced by the president and numerous politicians, but it faces growing criticism over its online presence.
Though QAnon boasts millions of devoted followers on Facebook, Twitter, and YouTube, tech companies have done little to regulate it. The theory is also a major driver of misinformation, particularly about the 2020 election. Members of Congress have condemned the movement on Capitol Hill and in legislation, saying more needs to be done to moderate its internet reach.
Facebook announced in October that it was banning the group from its platform, but it’s unclear how well the social network will enforce its new rules. Some civil rights groups claim Facebook’s actions compromise free speech; government agencies, meanwhile, have linked QAnon to real-world violence.
Social media algorithms, like Facebook’s, make QAnon-related content easily accessible to dedicated followers and the merely curious alike. Before they know it, users can find themselves down a rabbit hole. All it takes is one click.
Follow this storystream for all of Vox’s QAnon coverage and updates.