The increasingly visible and vocal followers of QAnon promote a bewildering blend of unsubstantiated conspiracy theories, worrying everyone from Facebook to the FBI.
Once on the fringes of the internet and focused on US politics, the movement has seen sharp growth on mainstream social media platforms this year, prompting tech firms to tighten controls and ban QAnon followers.
The movement is centred on the unsubstantiated belief that the world is run by a cabal of Satan-worshipping paedophiles. This year it has extended to alleging, without proof, that the coronavirus is a conspiracy by that group to control people using vaccines and 5G.
Researchers detected sharp spikes in QAnon content and related searches in March, when many countries had started imposing lockdowns and other social distancing measures.
The anxiety, frustration and economic pain caused by the pandemic — coupled with the increased amount of time people were spending online — became an explosive mix that drew people to QAnon, experts say.
“QAnon blamed these events on global elites while also increasing distrust in mainstream media, government and organisations such as the WHO,” said Mackenzie Hart, a disinformation researcher at the London-based ISD think tank.
Core QAnon beliefs were also coupled with anti-vaccine messaging and far-right campaigns, further expanding its following.
Tech analysts have pointed to a feature at the core of most major social media platforms as a key driver of QAnon growth: the recommendation algorithm.
Users who view, post or search for certain content are guided to what the platform’s algorithm determines to be other content they may be interested in. Analysts have said this helped link existing conspiracy theories — such as those about vaccines and 5G — with QAnon.