How social media and gamification fueled QAnon’s ‘crowdsourced cult’

Facebook announced this week it will ban all QAnon groups and content from its platform after critics lobbied to halt the rise of the conspiracy theory movement. But the move may have come too late to rein in a dangerous global group that has moved from the fringes of the internet into mainstream politics in just three years.

Blackbird.AI recently released a report titled “The Global QAnon Phenomenon” that takes an in-depth look at the tactics and tools that have allowed the movement to build a massive following. The report highlights the role of social media platforms such as Facebook and Twitter but also points to underlying social conditions that have allowed so many people to be seduced by QAnon’s library of conspiracy theories.

QAnon’s flexibility and adaptability have allowed it to spread rapidly despite having no identifiable leaders. And the group continues to gain momentum as it draws in vulnerable believers who embrace it with religious fervor, according to Blackbird.AI CEO Wasim Khaled.

“QAnon is not just a group of conspiracy theorists or people who spread hoaxes to explain their world,” he said. “They’re probably the first known instance of a crowdsourced cult.”

Founded in 2014, Blackbird.AI has developed a platform that uses artificial intelligence to sift through massive amounts of content to dissect disinformation events. It uses a combination of machine learning and human specialists to identify and categorize the types of information flowing across social media and news sites. In doing so, Blackbird.AI can separate information being created by bots from human-generated content and track how it’s being amplified.
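To make that triage pattern concrete, here is a minimal sketch of how a bot-versus-human content classifier with a human-review band might look. It is purely illustrative: the training posts, thresholds, and model choice are all invented for this example, and Blackbird.AI has not published the details of its actual pipeline.

```python
# Illustrative only: a tiny bot-vs-human text classifier that auto-labels
# confident cases and routes ambiguous ones to human specialists.
# All data, thresholds, and the model choice are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled posts (1 = bot-generated, 0 = human-generated).
posts = [
    "RT RT RT follow back patriots #WWG1WGA #WWG1WGA #WWG1WGA",
    "Had a great time at the farmers market this morning!",
    "BREAKING!!! share before they delete it http://bit.ly/xxxx",
    "Anyone have tips for fixing a leaky kitchen faucet?",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

def triage(text, low=0.3, high=0.7):
    """Auto-label confident predictions; flag the middle band for human review."""
    p_bot = model.predict_proba([text])[0][1]
    if p_bot >= high:
        return "bot", p_bot
    if p_bot <= low:
        return "human", p_bot
    return "needs_human_review", p_bot

print(triage("follow back RT RT #WWG1WGA patriots"))
```

The point of the middle band is the division of labor described above: the model handles volume, while ambiguous items go to human analysts.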

Typically, the company works with corporations and brands to monitor changes to their reputation. But Blackbird.AI has also looked at the spread of COVID-19 disinformation, as well as QAnon itself.

In the report, the company explored the way conspiracy theories and protests have intersected in the U.K., Germany, and Brazil. Across these territories, QAnon attaches itself to a variety of issues, such as anti-mask protests, child trafficking conspiracies, Holocaust denial, and general pandemic disinformation.

The group appeared on Blackbird.AI’s radar in mid-May 2020, when the company began tracking the hashtags #WWG1WGA_WORLDWIDE and #WWG1WGA_GLOBAL on Twitter. The embedded acronym has become a kind of QAnon slogan: “Where We Go One, We Go All.”
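For readers curious what that kind of hashtag tracking looks like in practice, the snippet below is a simplified sketch using Twitter’s API v2 recent-search endpoint. It assumes a bearer token is available in the environment; the polling cadence, pagination, and storage a real monitoring system needs are omitted, and this is not Blackbird.AI’s tooling.

```python
# Sketch: count recent original tweets for each tracked hashtag via the
# Twitter API v2 recent-search endpoint. Requires a bearer token; real
# monitoring would add pagination, scheduling, and persistence.
import os
import requests

BEARER = os.environ["TWITTER_BEARER_TOKEN"]  # assumed to be set
HASHTAGS = ["#WWG1WGA_WORLDWIDE", "#WWG1WGA_GLOBAL"]

def recent_tweets(hashtag, max_results=100):
    resp = requests.get(
        "https://api.twitter.com/2/tweets/search/recent",
        headers={"Authorization": f"Bearer {BEARER}"},
        params={
            "query": f"{hashtag} -is:retweet",  # originals only
            "max_results": max_results,
            "tweet.fields": "created_at,public_metrics",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("data", [])

for tag in HASHTAGS:
    print(tag, len(recent_tweets(tag)), "recent original tweets")
```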

The FBI has labeled QAnon a domestic terrorist threat for its ability to incite people to take violent measures in real life. The U.S. House of Representatives recently voted to condemn QAnon. And yet fervent QAnon supporter Marjorie Taylor Greene recently won a congressional primary in Georgia and is heavily favored to win her seat in next month’s general election.

“Less than three years ago, QAnon was essentially a theory on a kind of extremist and vulgar message board,” Khaled said. “And so it’s only taken about three years to reach a tipping point where you have a member of Congress who is a firm believer in QAnon conspiracies.”

An internet origin story

QAnon is like a hub of conspiracy theories, attaching itself to everything from anti-vaccine ideas to climate change skepticism to mobilize people and bring them into the larger conspiracy fold. The individual topics may be distinct, but the conspiracy theory element injects suggestions of deep-state cabals or secret organizations pulling the levers. COVID-19 provided a fresh set of targets as people raised questions about liberty, governance, and public safety.

“We all live in this new normal now after seven months that have disrupted patterns of behavior,” Khaled said. “And so fringe conspiracy theories have found this new, unusual world to tap into of audiences that are obviously frustrated. And it’s easier for them to kind of expedite their belief system with people that are concerned and scared. Those are the people who are always more likely to jump into a cult.”

QAnon is unlike a classic cult in that there is not a single charismatic leader defining a philosophy to attract followers. Instead, QAnon is more a grab bag of hoaxes that suggest there is a larger hoax afoot.

“There are real socio-political concerns that people have that lead them to these belief systems,” Khaled said. “These are typically people who have traditionally felt disenfranchised and powerless about the world around them. This is their way of essentially taking back some power: ‘We’re gonna take down these sinister forces through our activism.’ It’s easier for people who feel that way, whether it’s a classic cult or a terrorist organization, to be taken in by those groups. It is a hoax, but it is a crowdsourced cult designed to exploit these social fault lines. One thing that needs to be understood and talked about more is that QAnon is an expression of political disenfranchisement.”

QAnon activists aren’t inventing these economic and social divisions. Rather, they look for existing fractures and exploit them.

“The fact that it is crowdsourced is the perfect way to jump on every single breaking topic and attach QAnon to it,” Khaled said. “In any conspiracy that we look at, even if we’re not looking for QAnon, QAnon is there as an amplifier. Across multiple campaigns that we’ve analyzed over the last 16 months, QAnon always pops up.”

Social media and gamification

The range of topics QAnon covers is vast, but common strategies are used to ensnare people, chief among them gamification. While QAnon may be crowdsourced, a core of active participants drops puzzles or clues on various websites and social platforms, hinting at some larger truth for those who solve them.

“It’s an amazing amplification technique and a testament to how well these guys know the social media information ecosystem and how to spread content,” Khaled said. “It leaves people to figure out the puzzle through their own research. And then they discuss it together, whether it’s in a private group or whether it’s in a public forum. This is a brilliant and terrifying way of crowdsourcing conspiracies. Because not only is it engaging, it makes you think you’re contributing to this larger cause of dismantling this invisible harm. And it keeps a constant supply of disinformation pumping through the ranks. People get drawn into QAnon and its worldview in this kind of soft-touch way, and then they are eventually led to its more extreme beliefs. And by then, you’re kind of in trouble.”

This strategy has allowed QAnon to adapt to different geopolitical settings and spread around the world. But the effort has been catalyzed by the structures underlying Facebook and Twitter.

“I think it speaks to the power of our current information ecosystem and the awesome responsibility that the platforms have,” Khaled said. “Social media provided the perfect platform around the world to throw fire on a lot of the QAnon fringe conspiracies. Without Facebook and Twitter, I can unequivocally say that it wouldn’t be nearly as bad. The internet has been around for a long time, long enough for a lot of these things to have occurred. But what we see today is that without platforms like Facebook or Twitter, these kinds of conspiracy theories, or these kinds of ideologies, simply couldn’t gain this kind of traction. Traditional vectors like internet email chains can’t deliver it because they don’t have the multi-network dynamics of social platforms.”

These viral dynamics are core features rather than minor faults of social platforms, making them tough to address.

“Facebook is where [conspiracy groups] are because they created a unique mechanism and user base that had that kind of speed and size,” Khaled said. “Never before has the media been able to spread that much information that fast without any kind of moderation in place.”

Khaled acknowledged that Facebook and Twitter have some moderation in place to weed out stuff like ISIS recruitment, violence, and sexually explicit material. But he said for a long time the companies too narrowly defined which categories to focus on.

“They could have just as easily applied [content moderation] to extremism, but they chose not to make that decision,” he said. “They’re very good with it when they decide to suppress a particular category. But they just turned a blind eye to QAnon.”

Khaled also echoed another big criticism of these platforms — that the underlying AI has accelerated the growth of extremist groups.

“If you join one QAnon group, then you’re getting recommended to 10 others,” he said. “If you watch one video, the YouTube algorithm is automatically trying to get you to more extreme things. Nobody programmed the YouTube algorithm to show you more extremism. But the AI stupidly realized that that kind of content will result in greater engagement. These are the very mechanisms that make these companies billions of dollars. These recommendation algorithms are designed to show you things that are going to keep you engaged. And disinformation, conspiracies, rumors, and extreme content are more engaging than a 50-page report from the [National Institutes of Health] talking about the efficacy of COVID vaccines.”
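Khaled’s point about unprogrammed drift is easy to reproduce in miniature. The toy simulation below runs an epsilon-greedy bandit that only ever observes clicks, never any “extremeness” label, yet ends up recommending the most extreme item because extreme content is assumed, by invented numbers, to be more clickable. Every figure here is made up for illustration; this is not any platform’s actual algorithm.

```python
# Toy model: an engagement-maximizing recommender drifts toward extreme
# content without ever being told what "extreme" means. All numbers invented.
import random

random.seed(0)

# Ten hypothetical items; higher "extremeness" means a higher click probability,
# mirroring the claim that extreme content is simply more engaging.
items = [{"id": i, "extremeness": i / 9} for i in range(10)]

def click_prob(item):
    return 0.05 + 0.4 * item["extremeness"]

counts = [0] * len(items)   # times each item was recommended
clicks = [0] * len(items)   # clicks each item received

for _ in range(5000):
    if random.random() < 0.1:   # occasionally explore at random
        i = random.randrange(len(items))
    else:                       # otherwise exploit the best observed click rate
        rates = [clicks[j] / counts[j] if counts[j] else 1.0 for j in range(len(items))]
        i = max(range(len(items)), key=lambda j: rates[j])
    counts[i] += 1
    if random.random() < click_prob(items[i]):
        clicks[i] += 1

top = max(range(len(items)), key=lambda j: counts[j])
print(f"Most-recommended item: id={top}, extremeness={items[top]['extremeness']:.2f}")
```

Nothing in this loop encodes extremism; click feedback alone is enough to concentrate recommendations on the most extreme item, which is the mechanism Khaled attributes to engagement-optimized platforms.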

Stopping the spread

Khaled and I spoke earlier this week, before Facebook’s QAnon announcement. Back in August, Facebook had started taking steps to remove some subset of QAnon content, but Khaled was deeply skeptical that this would have much of an impact.

“I would say that it’s somewhat too little, too late,” he said. “This has been going on since 2017, 2018. And only in the last month is Facebook saying they’re going to shut down these groups.”

Part of the problem is that QAnon’s rise is an unintended consequence of better privacy and encryption, which allow many of these groups to fly under the radar. Also problematic is QAnon’s alignment with political conservatives: when platforms start removing QAnon content, conservatives claim their voices are being suppressed, and companies like Facebook become more hesitant to tackle the problem.

“There are no easy fixes for them,” Khaled said. “They’re in a bind for sure.”

I circled back with Khaled after Facebook announced it would remove all QAnon content. While he applauded the gesture, he doubted the company could truly stamp out the problem at this point.

Khaled noted that QAnon had already been evolving its look and feel through efforts such as “Pastel QAnon,” which spreads conspiracies by targeting suburban women who follow topics such as astrology, tarot, and alternative medicine. QAnon recruiters can identify these audiences using Facebook’s own audience targeting tools, and Khaled wondered whether the company would be able to detect such softened content.

He also pointed out that such a massive purge is likely to be viewed as part of a big conspiracy by members who are already prone to believe such things.

“In the current information age, what will most likely happen is that QAnon members would go back to the extremist message boards where they originated or to encrypted private messaging,” Khaled said. “It’s notable that Facebook Groups and Pages and Instagram will be addressed without any mention of WhatsApp, which will be the very next refuge. In any case, it’s a step in the right direction, as it will reduce the viral spread and algorithmic recruitment efforts, as Facebook has done in the past with … terrorist groups and militias.”

