Where Has Online Extremism Moved Over The Past Year?
Part 2: Assessing Domestic Extremism a Year After January 6, 2021
There’s been some progress in the online fight against extremism. In the wake of the insurrection last January 6, former President Trump and many of those promoting or participating in the insurrection were pushed from mainstream social media platforms like Twitter, Facebook, and YouTube. As our team discussed last summer, this upheaval drove a domestic extremist migration to communication platforms thought to be more secure from law enforcement and less moderated than mainstream social media apps.
In Part 1 of this update, we discussed the broad shifts in the domestic extremist landscape in the year since the January 6 insurrection. Here in Part 2, we examine how extremist groups and online collectives have shifted their digital organizing over the last year. The most violent extremists have gone dark by moving to closed, member-only forums and encrypted direct messaging apps, while other movements made their way to newer, lightly-moderated social media platforms or foreign-hosted services.
Note, as discussed in Part 1 “Where did all the insurrectionists go?”, online collectives, depending on their structure and ideology, have differing distributions of adherents - from passive observer to vocal supporter to committed terrorist. The more closed (private) the platform and channel, the higher the propensity for violence. On the public platforms and collectives noted below, not every account is a ‘terrorist’ account, but these discussion spaces include a number of real-world extremists advocating or preparing for violence.
What do extremists need to be prolific online?
Whether it’s foreign terrorists in 2010 or domestic terrorists in 2020, extremists need an Internet and social media presence to amplify their message, challenge detractors, radicalize and bring in new recruits, and, in some cases, plot and plan attacks against targets. Since the days of Anwar al-Awlaki a little more than a decade ago, mass-mediated terrorism—the kind where adherents seek ideological guidance and operational support via digital connections rather than in-person meetings—has risen sharply. Facebook, Twitter, and YouTube dealt with the challenges of policing al Qaeda, al Shabaab (Somalia), and then the Islamic State. A decade of lessons prepared them for the broad de-platforming of domestic extremists after the insurrection.
The domestic terrorists of today travel a similar, but distinct, path from the foreign terrorists before them, seeking out new social media safe spaces where they can congregate and spread their hate once they lose access to mainstream platforms. For the older generation of extremists, many of the newer social media platforms prove less user friendly and lack the features of the major platforms. Video sharing, live streaming, voice chat, and content posting with a wider audience are elusive on smaller platforms, difficult to find in a single application, and, if they do work, they often don’t work quite right. BitChute is not a sufficient substitute for YouTube, for example, and no alternative service comes close to offering the range of options provided by Facebook, Instagram, and WhatsApp. As extremists seek platforms that are safe from law enforcement and completely unmoderated by administrators, options narrow quickly. The most sophisticated, generally younger, tech-savvy extremists have reverted to their own hosted and controlled (closed) forums that drastically limit membership. To be prolific, though, extremist groups need a public presence, leading them to platforms that market themselves as unwilling to moderate content, or to platforms hosted overseas like VKontakte, 4Chan or, most crucially—Telegram.
Even if extremists make the jump to alternative or foreign-based platforms, two more challenges await them once they land. Unmoderated or lightly policed platforms can become a giant waste of time: hate speech quickly reaches a level that triggers a violent attack, creating blowback that leads to the platform being banned in certain countries or removed from app stores entirely, as happened to Parler in its heyday. Even if not shuttered by regulators or hosting services, unmoderated smaller platforms turn out not to be much fun for extremists—there is no one to recruit and no one to bully. An information bubble in which angry people say and share the same things becomes quite boring.
Where have extremists migrated since the insurrection?
The January 6, 2021 insurrection provided the impetus for social media platforms to remove large swathes of domestic U.S. extremists from their platforms. Those participating in the raid on the Capitol or cheering them on from afar quickly needed a new place to congregate online, kicking off intense searches by the recently de-platformed to find a new home. The early months of 2021 saw a rapid, sporadic migration to and from smaller platforms as extremist collectives tested out new and existing services. Today, a year later, the dust has settled a bit and the insurrectionists, their supporters, and other extremists worried about moderation have navigated their digital journey to new online safe havens.
Nearly every major domestic extremist group, like international jihadists several years back and nation-state disinformation peddlers more recently, moved decisively to Telegram throughout the year. Telegram offers the best available substitute for U.S. mainstream social media platforms: public channels for connecting with wider audiences, private messaging for direct, encrypted communication, the ability to video stream, overseas hosting with limited moderation, and easy accessibility in Western app stores. With Telegram, extremists get a substitute for both Twitter and WhatsApp without Silicon Valley oversight or as much American law enforcement interference. Note: Over the last year, Telegram has been repeatedly referred to as an encrypted, private messaging app in mass media broadcasts and publications. While it does have those features, Telegram has both public and private features, and many extremist groups use the app’s public messaging to vet recruits and spread propaganda. Separately, with extremists unable to congregate on Facebook and Twitter, Gab has become the landing spot for many of the far-right groups and their leading figures, a welcoming home for white nationalists, neo-Nazis, accelerationists, and conspiracists alike.
Beyond Telegram and Gab, newer platforms have attracted different stripes of extremists based on the features they offer to replace mainstream platforms. Younger accelerationist terror groups, after being pushed from YouTube and even BitChute, recently descended upon Odysee, an alternative video-sharing app making use of blockchain technology. Militia groups and ‘stochastic haters’ (See Assessment #1) angered about election conspiracies, mask mandates or misinformation about COVID-19 and vaccines have moved to GETTR to harass local officials and issue death threats against their personal and political opponents. Younger white supremacists, incels, and anti-government types trend toward closed chats or private forums, the Chans, and Discord for discussion spaces with more of a tech edge. QAnon leftovers find newer, smaller platforms like Minds as a hangout and Rumble as a way to stay connected to the last administration.
Often overlooked in social media conversations is audio content. Extremists, like society more broadly, have dramatically increased their creation and consumption of podcasts. Across alternative platforms and various streaming services, far-right figures have continued to promote hate speech and grant appearances to other white supremacists and anti-vaccine advocates on podcasts. In some cases, audio and video streaming segments are where prominent domestic and foreign extremist groups overlap. Podcasts likely represent the least moderated, mainstream extremist channel for reaching mass audiences and will become the next big moderation challenge for tech companies.
In conclusion, domestic extremist groups across the ideological spectrum face far more challenges online today than they did a year ago, on the day before the insurrection. A smaller online presence has muted their ability to radicalize, recruit, and organize. However, the extremists most committed to violence have adapted and moved to new digital safe havens, and in 2022 and beyond, this is where they will plot future attacks.
Next up, Part 3 of Assessing Domestic Extremism a Year After January 6, 2021 (posted here, Jan. 6, 2022): where will domestic terrorism arise in the midterm election year of 2022?
What's the end game of these extremists? Do they understand the realities of life in parts of the world where governments have toppled and mass chaos, incessant violence, extreme poverty, starvation, and homelessness are the staples of daily life? Do they believe life will be a 'real life' video game where they can maim and kill during the day, then go to restaurants and bars at night? I'm trying to understand whether they are trying to remake the US into a war-torn third-world country or whether it's a way to cosplay their violent fantasies while being protected by an actual democracy. Clearly there is a faction that is translating fantasies into real life: 1/6, the gunman who just killed 5 people, but are these the exceptions?
Let's try this again
Your analyses are always interesting.
However, I do have a problem with the "bad attitude"
of the counter-terrorism community toward singles.
Shouldn't you focus on identifying the root causes?
Were you guys even aware that Russia had targeted any and every moderate-to-fringe community (even anime fans and gamers)? I've been tracking that for two decades.
This ranged from videos containing "inserts" not unlike the good old subliminal ads of the '70s,
then got gradually more sophisticated, with obviously tons of money poured into these experiments.
I'm sure you're aware of what they did with using OnlyFans to draw people toward Telegram.
Back to "singles" and how they are misleadingly labeled by counter-terrorism.
Japan has 50% singles and declining sexuality, according to studies.
Sweden ... getting close.
So my question is this
Since I don't know any single person who would call themselves voluntarily single,
except perhaps the so-called MGTOW (and that's another controversy),
why focus on taxonomy and throw every single person onto the extremist bandwagon?
Shouldn't you simply round up those guys as far right, since their single status is accidental?
Also, here's why the incel theory about "Chad" is a fantasy.
Looking around, I found these that seem to hint at a probable cause for their angst.
https://www.menshealth.com.au/research-women-are-attracted-to-psychopaths
https://www.bhg.com.au/study-finds-women-are-attracted-to-psychopaths
Basically, psychopaths are what, 2% of the population,
but this study hints they would be 10% of the selection,
and since genetics do play a role in psychopathy, it may not be a good thing.
I'm quite sure it also applies to men (attracted to psychopath women),
and it may explain all those vampire movies.
It seems it's the confidence that creates that attraction.
Obviously they don't know the person they are attracted to is a psychopath,
since those are undetectable (unless they commit crimes).
Also, I doubt every single person in those "incel communities" is actually single. Because guys who are single are not going to boast about it, let alone make a forum to discuss it with .. let's face it ... the competition. Neither can they really get advice from other singles.
In fact, real singles are probably trying to be friends with real-life girls to ask them for advice.
That said, 4chan is one hell of a toxic place. I used to "visit anonymously" just to break up their plans. Nothing worse for them than their efforts to harass women going to dust because of someone like me.
(I just report what they did to the platform; this could be automated, and it shouldn't be "free speech")
I still don't get why the NSA doesn't use bots to detect those efforts and instantly send reports to the target platforms.
I saw them deliberately build hate campaigns and smear campaigns against anyone, ranging from activists or famous people to ASMR girls, and, of course, any sexy girl on Twitter or OnlyFans.
The NSA could do the same to wreck pirate sites, but for some reason doesn't.
I've long suspected the owner of OnlyFans must be profiting from this. He doesn't have the best reputation. And I don't like his blind eye toward those Telegram connections.
As they say, some guy may go to Telegram from OnlyFans for the boobs, but end up with unwanted side effects, such as an influence campaign that may change his views.
Don't mind me, just sharing information.
Keep up the good work