Beyond the familiar Twitter debates, Facebook invitations and Instagram selfies, an entirely new digital universe has emerged. Extremists have built their own parallel online world — a self-contained echo chamber complete with ultra-libertarian social media sites, extremist crowdsourcing platforms, disinformation encyclopedias, and white nationalist dating apps. The rise of these alternative platforms in the last few years has not only allowed extremist propagandists to escape takedown measures and circumvent conventional information ecosystems; it has also effectively given rise to an alternative reality for its users, one that the average digital citizen might find both disturbing and surreal.
In this essay, Julia Ebner, an Austrian scholar and senior researcher at the Institute for Strategic Dialogue, uses her knowledge of far-right extremism, reciprocal radicalisation and European terrorism prevention initiatives to explore the impact of alt-tech ecosystems and the dangers these hotbeds of self-reinforcing extremism pose to our security, politics and society.
Brenton Tarrant smirked as his lawyer entered his plea of not guilty: he faces 51 charges of murder, 40 of attempted murder, and one of engaging in a terrorist act for his role in killing dozens of Muslim worshippers in Christchurch, New Zealand, on 15 March 2019.1 With at least 1.5 million Facebook re-uploads of his GoPro livestream of the attack, the facts would seem hardly disputable. But reality is as relative for Tarrant as it is for his audience on 8chan, the imageboard he chose for his last satirical post before going on his shooting spree.
“I’ve only been lurking for a year and a half, yet what I’ve learned here is priceless,” another white nationalist called John Earnest posted on 8chan just a few weeks after the Christchurch massacre. On 27 April 2019, the 19-year-old Earnest opened fire in a synagogue close to San Diego, killing one woman and injuring three other worshippers. In an open letter Earnest left on 8chan, he referenced both Tarrant and the terrorist Robert Bowers, who killed 11 people at the Pittsburgh Tree of Life synagogue in October 2018. Following the latest attack, the FBI served 8chan with a search warrant for the IP addresses and metadata of John Earnest as well as of other commenters on the platform who they believe “inspired and/or educated” the attacker.2
The imageboard is only one part of an emerging far-right online universe, the so-called “alt-tech space,” which has given rise to a range of far-right subcultures and (un)realities. All three terrorists — Tarrant, Earnest and Bowers — were radicalized in these online spaces, which exist in parallel to the better-known social media platforms. They all came to believe in the conspiracy theory of the so-called “white genocide” or “great replacement” — the idea that the white European race is being wiped out as a result of the concerted policies of the “Jewish globalist elites.”3
“Lurking for a year and a half” on 8chan implies sustained exposure to a mixture of twisted statistics, racist jokes, and anti-Semitic memes. An Institute for Strategic Dialogue analysis of 480 alt-right memes about the “great replacement” showed that 90 percent of them contained generalized dehumanizing and racist content playing on racial stereotypes, themes of racial impurity, cultural differences, and anti-Semitic conspiracy theories. With only 10 percent of the memes referencing statistics, often from unverified sources, most content remained unsubstantiated, appealing more to emotion than to rational argument.4
Violence-inciting language is as virulent as alternative truths on platforms like 8chan. A study conducted jointly by the Anti-Defamation League (ADL) and the Network Contagion Research Institute found that 8chan and the alt-right Twitter alternative Gab are rife with genocidal language (e.g. calls for the “slaughter” of groups of people) and conspiracy theories.5 Many users in these forums engage in so-called “real-life effortposting” — online campaigns aimed at inspiring real-world action against their enemies, in particular Jews, Muslims, Blacks and left-liberal political activists.6 The killings committed by the three shooters are an extreme version of such real-life effortposting; lesser versions have included unsolicited pizza deliveries to political opponents7 and hacks causing printers on US university campuses to print neo-Nazi propaganda.8
The declared goal of all of the above-mentioned extreme-right terrorists was to spark a civil war by staging terror attacks. The Christchurch shooter declared in his so-called manifesto that he wanted “to add momentum to the pendulum swings of history, further destabilizing and polarizing Western society (…).” This idea of accelerating a supposedly “inevitable race war” is also known as accelerationism or Siege-posting, after the book Siege by the American neo-Nazi James Mason.9 “In case you haven’t noticed we are running out of time,” the Poway terrorist wrote in his open letter, and “If this revolution doesn’t happen soon, we won’t have the numbers to win it.” Accelerationists find themselves in the same camp as other groups that seek to inflame existing tensions and tear apart the social fabric that holds our societies together. One example of such bedfellows with converging interests are the trolls of the Russian Internet Research Agency, who operated highly divisive accounts such as “Secured Borders” and “Blacktivist” during the 2016 US presidential election.10
Since the lethal white nationalist rally in Charlottesville in August 2017, policymakers across the world and mainstream social media platforms such as Facebook and Twitter have stepped up their efforts to take down extremist accounts and violence-inciting content.11 Germany’s NetzDG (Network Enforcement Act) was the first real legislative attempt to regulate content spread by extremist groups. The law applies only to platforms with over 2 million registered users, requiring them to remove content containing illegal hate speech within 24 hours.12 Germany has much stricter hate speech laws than most countries, particularly the US. For example, Nazi symbols and Holocaust denial materials are prohibited under the “incitement to hatred” provisions in the country’s criminal code.13
The Christchurch shooting prompted another wave of policy announcements due to mounting public and political pressure on the big tech platforms. In recent months, Facebook banned white nationalist and separatist content,14 which resulted in the removal of groups like Generation Identity, Wolves of Odin, the British National Party (BNP), the English Defence League (EDL) and Britain First15 as well as far-right influencers including Faith Goldy, Milo Yiannopoulos and Alex Jones.16 Meanwhile, YouTube announced the removal of thousands of white supremacist videos17 and Apple and Microsoft blocked extreme-right Telegram channels that advocated terrorism from their platforms.18
The clamp-down on far-right activities on the big social media platforms was exploited by the alt-right to frame any anti-hate-speech intervention as an illegitimate attack on their freedom of speech. Their brand of unreality is strengthened by such campaigns, allowing them to paint themselves as the victims of a cabal of “global elites” and complicit tech firms. Over the past few years, an ideologically diverse coalition of white nationalists, conservatives and ultra-libertarians has launched attempts to build out its own online infrastructure, setting in motion a migration towards newly established “censorship-free” platforms. “This is war,” claimed the so-called “Alt-Tech Alliance,” which self-identifies as “a passionate group of brave engineers, product managers, investors and others who are tired of the status quo in the technology industry,” in an announcement published in August 2017, following Charlottesville. “The Free Speech Tech revolution has begun,” they said.19 A similar dynamic of platform migration could be observed with the decentralized publishing platform Mastodon, which became the go-to alternative to Twitter for many Japanese users after Twitter started cracking down on sexualized imagery of children.20
Today’s alt-tech Internet encompasses a range of sites: social media platforms like Gab and Minds, which host extremist users and content; crowdsourcing websites such as the now-closed Hatreon and WeSearchr, which allowed white supremacists to raise funds for hate campaigns as well as hacking and trolling operations. Alternative dating websites like WASP Love and White Singles provide platforms for white-only singles, and online shops such as Redbubble sell miniskirts featuring pictures of WW2 concentration camps and T-shirts with imprints of “Zyklon B”, the cyanide-based pesticide that was used in the Holocaust gas chambers.21 Even alt-right Wikipedia equivalents exist: Metapedia describes the Holocaust as a concept of “politically correct history,” likening it to a religion,22 and Encyclopedia Dramatica labels Brenton Tarrant “a heroic IRL JC Denton Aussie troll who took it upon himself to remove the Mooslem filth from a country whose existence was questionable at best.”23 While Metapedia can be seen as an ideologically tainted attempt to create an alternative Wikipedia, Encyclopedia Dramatica interprets the world from a sarcastic Internet culture perspective.
New media has changed the way information is produced and consumed. The shift to a less intermediated information selection process has increased the likelihood of users consuming information aligned with their preexisting belief systems, thereby reinforcing confirmation biases. It has also led users to form groups of like-minded people.24 Researchers have referred to this phenomenon of isolated communities on mainstream platforms as filter bubbles or echo chambers.25
The dynamics in closed-off echo chambers, such as those created on alt-tech platforms, are still under-researched, but their impact likely goes beyond the creation of safe havens for extremist voices. Within the confines of these closed echo chambers, users develop their own language, cultural references, and insider jokes. Such online hubs effectively help extremist actors create new collective identities around their fringe ideologies. Researchers at the Alt-Right Open Intelligence Initiative of the University of Amsterdam conducted an analysis of three billion Reddit comments, illustrating the linguistic peculiarities of the alt-right’s different online subcultures. Each sub-group — from the shitposters, gamers, and men’s rights activists to the anti-globalists and white supremacists — has adopted its own language and reference points. Boards such as /TheDonald have fostered a cohesive group identity across these different subcultures on the basis of their lowest common denominator: their shared enemy, the political establishment.26 Such collective identities are not limited by geographic borders, with the /TheDonald subreddit, Gab, and 8chan’s /pol/ board attracting members from around the globe.
The development of global far-right echo chambers has also facilitated the circulation of hyper-partisan, misleading, and outright false news stories on an international level. After the fire that severely damaged Notre Dame de Paris, far-right activists on 4chan coordinated attempts to weaponize the fire to spread disinformation against minority communities. For example, they asked users on the platform to share “pictures of smiling Muslims near Notre Dame” to frame the incident as an Islamist attack on Christianity.27 Moments of breaking news are particularly prone to reality-hacking attempts, as users enter new search terms in search engines like Google and Bing for which limited data is available. Such so-called “data voids” can easily be exploited by online actors who seek to spread disinformation.28
The QAnon community, which propagates the conspiracy theory that Donald Trump is working with Robert Mueller to expose a massive pedophile ring run by the Clintons and the Deep State, is another example of the internationalization of unreality. Despite the initial US focus of the community, QAnon activists have created offshoots in most European countries, from QBritannia to QDeutschland and QNetherlands, to tailor the overall conspiracy theory to the local context.29 The alt-tech universe has effectively allowed extremist activists to adopt a hyper-localized yet internationally standardized and networked approach — in business lingo a “glocal” strategy — to disinformation. For example, QDeutschland extended the core QAnon conspiracy theory beyond prominent US figures to include German chancellor Angela Merkel and former Justice Minister Heiko Maas, who introduced the “NetzDG” anti-hate-speech laws.
The result of the emerging alt-tech world is an entirely new information, communication and socialization ecosystem. Not only does this allow fringe actors to redefine journalism and exercise information authority, it has also made it possible for them to revolutionize the ideas of truth and reality among their audiences. In his 1976 book How Real Is Real?, the Austro-American psychologist and communication theorist Paul Watzlawick questioned the universality of reality, arguing that reality is in fact always a result of communication. As such, reality creation would be inherently subjective and fluid. Transferring this constructivist view of the relationship between communication and reality to the digital age, where the communication environment can easily be manipulated, means that reality has now become easily hackable. Confusion, according to Watzlawick, leads immediately to a quest for order.30 When individuals find themselves in unfamiliar or complex situations where previous experiences cannot be applied to make sense of the environment — “a state of disinformation” — they will seek explanation and order in the chaos.31
While the artificial creation of confusion for the purpose of induced reality adaptation can be used for positive ends like hypnosis therapy, for instance,32 it may also be turned into a weapon by malign manipulators who seek to impose their own (un)realities on others for political ends. Obfuscation tactics in the form of the Soviet concept of dezinformatsiya have traditionally been used by state actors such as the Kremlin.33 But in recent years, non-state activists such as the international alt-right have adopted similar playbooks for their online radicalization and manipulation campaigns.34 The goal is to make individuals vulnerable to disinformation, conspiracy theories and extreme ideologies.
As the lines between imagination and reality, prank and assault, game and terror become increasingly blurred, making a clear-cut distinction between the virtual and the real world becomes more difficult for members of extremist echo chambers. “Is this a LARP35?” reads the first comment beneath Tarrant’s livestream post on 8chan. “Get the high score” was the first one beneath the Poway synagogue shooter’s livestream on the same platform.36 Even as Tarrant’s massacre was unfolding, many viewers treated it as a prank or a game. It took a few moments until reality caught up with them and they came to understand that the borders of their collectively imagined unreality had been crossed.37 Even members of politically motivated groups that spread the ideologies behind the Christchurch and Poway shootings, such as the “great replacement” and “white genocide” theories, had not necessarily been conscious of the real-world impact their virtual activities could have.
In the days following the Christchurch attack, some members of the extreme-right Discord group “JFG World” shared their shock and announced that they would leave the platform. “I don’t even know where the fuck to begin… Why do people like you find dead bodies something to joke about?” one member wrote. The lethal incident essentially tore apart their illusions of “digital dualism,” the idea that the virtual and the real world are fully separate entities.
The founder of the Cyborgology blog, Nathan Jurgenson, first coined the term “digital dualism” in 2011.38 In recent years, however, researchers have found ample evidence that casts doubt on the idea, instead exposing significant overlaps and interplays between the online and offline worlds.39 It has become increasingly clear that virtual unrealities can shape real-world realities. The potential real-world effects of collective extremist illusions range from terrorism to election manipulation: online propaganda, mobilization, grooming, hate speech, and disinformation can translate into hate crimes, bullying, sexual abuse, and terrorist violence. Research by the Institute for Strategic Dialogue suggests that online hate speech posts often correlate with offline hate crimes in the same geographic areas.40
The impact of the emerging alt-tech space is both a security threat and a political issue. Shared unrealities have provided a link between online fringe communities and far-right populist politicians, leading to a mutually reinforcing relationship between the two: the politicians have empowered the fringe communities by normalizing the rhetoric of extremist online networks, flirting with their constituencies and emboldening their influencers. The Network Contagion Research Institute analyzed 100 million comments and images from 4chan’s Politically Incorrect board (/pol/) and the alt-right’s Twitter equivalent Gab from July 2016 to January 2018 and concluded that racist and anti-Semitic slurs surged dramatically after Trump’s election.41 It is probably not coincidental that President Trump and members of his family have helped amplify some of these memes.
Far-right populist politicians have, on the other hand, benefitted from extreme-right influence operations that emerged from the alt-tech space. For example, the now quarantined Reddit board /TheDonald,42 which was created shortly after Donald Trump announced his presidential candidacy in 2015, has significantly supported his social media campaigns. With over 750,000 subscribers by 2019, it has grown into one of Reddit’s largest and most active boards.43 Meanwhile, the neo-Nazi trolling army Reconquista Germanica managed to bring several pro-AfD hashtags into the top trends on German Twitter in the two-week run-up to the German federal election in 2017, influencing the online discourse and pre-election mood.44 In recent national and regional elections across Europe, activists of US alt-right hotspots such as /TheDonald and 4chan’s /pol/ board joined forces with their European counterparts. For example, the coordination of election influence campaigns in favor of far-right parties in France, Germany, and Italy bridged international and local alt-tech platforms, where alt-right meme databases and psychological warfare manuals were shared and repurposed based on location.45
To tackle the spread of extremist (un)realities, it will be crucial to tackle the alt-tech ecosystem. However, finding effective solutions will prove especially difficult given the range of shapes these platforms can take. While some alt-tech sites have been created by extremists for extremists and are unlikely to cooperate with counter-extremism units, others are entirely apolitical sites hijacked by malign actors. Accordingly, alt-tech platforms can be categorized into four different types.46
The first type are extremist in-house creations: platforms designed by extremists for the purpose of offering themselves a safe haven. Examples are the crowdsourcing platform Hatreon and the Identitarian social networking app Patriot Peer, which is still in development at the time of writing.47
A second type of alt-tech sites are ultra-libertarian platforms, which tend to operate in the name of free speech. Created by libertarians or commercially driven developers, they usually tolerate violent and extremist content, do not proactively take down any content, and refrain from cooperating with counter-extremism organizations. The Twitter substitutes Gab and Minds as well as Jordan Peterson’s newly announced “anti-censorship” site Thinkspot fall into this category.48
The third type of alt-tech sites are hijacked platforms created for an entirely different purpose, which extremists co-opt because the infrastructure suits their needs. An example of this is the gaming chat channel Discord, which lends itself to military-like trolling operations and gamified hate campaigns due to its unique architecture. For example, trolling armies such as the German neo-Nazi movement Reconquista Germanica used the app to build databases of memes and coordinate large-scale influence operations that were then carried out in a carefully orchestrated way on mainstream platforms like Facebook, Twitter, and Instagram.
The fourth type are fringe platforms: popular forums beyond the known Silicon Valley social media platforms that serve as engine-rooms for Internet culture. Examples of fringe platforms are the /pol/ boards of 4chan and 8chan. While hijacked platforms such as Discord tend to proactively combat extremism by cooperating with security agencies and counter-extremism organizations, most fringe platforms such as the chan boards are reluctant to engage in any such efforts.
Two factors make it especially difficult to track and analyze these alt-tech spaces: (1) anonymity on the imageboards, forums, and fringe platforms, and (2) privacy in encrypted channels and messaging apps, which researchers have also called “Dark Social” due to their opaque nature.49 The introduction of a paywall system for chatrooms could add further obstacles for security forces, investigative journalists, and liberal-leaning activists, who will find themselves in a moral dilemma if they have to pay to get access to extreme-right channels.50
Ultimately, the danger of the online alt-tech universe does not only lie in the creation of separate information ecosystems and self-reinforcing extremism hotbeds. The opportunities these platforms may provide their extremist creators, most notably to collect, analyze and monetize data from their users, could equip them with new weapons to refine their radicalization tactics and micro-target new audiences based on user profile analyses. The Cambridge Analytica scandal exposed the damage and political backlash such data-driven campaigning can cause. Holding online user data would provide extremist actors with the power to shape the perceptions, attitudes, and behaviors of Internet citizens. Regulating (and possibly restricting) commercially motivated companies such as Facebook from selling their data to third parties is crucial. But simultaneously ignoring the rise of politically motivated alt-tech platforms is democratic suicide. It could bring both the political and the militant far right a step closer to transforming their dystopian illusions into reality.
This implies an urgent need for a comprehensive cross-platform strategy. Removal systems for violence-inciting content and stricter data usage laws should not be limited to the big tech firms but must encompass the entire new-media ecosystem, including alt-tech. However, tackling online content and behavior that falls into legal gray zones will require a more nuanced strategy than removal and oversight mechanisms — one that does not play into the hands of extremists by feeding their narratives of persecution. A first step for such a strategy would be to enhance our understanding of the tactics used to create spaces that escape and redefine reality. More research will need to be done in the hidden corners of the Internet, where the lines between reality and unreality are erased for the purpose of manipulation and radicalization. While raising public awareness about these distortion techniques is important, it will be crucial not to give them too much oxygen, lest we inadvertently direct more users towards alternative realities.