December 12, 2023

Since the Capitol breach on January 6, 2021, multiple federal prosecutions of far-right extremist groups have shed light on the path from rhetoric to action. In these cases, online bombast centered on forcibly opposing the US government, but they are far from the only examples of the disturbing progression from extremist content to violent behavior. 

This is a watershed moment for the payments industry: What role can we play to stem the flow of online hate and harm? 

In May, the founder and leader of the Oath Keepers, Stewart Rhodes, was sentenced to 18 years in prison for seditious conspiracy, obstruction of an official proceeding and tampering with documents and proceedings. In September, Proud Boys leaders Ethan Nordean and Joseph Biggs were sentenced to 18 and 17 years, respectively, for multiple felony charges related to their roles in the January 6 insurrection. Also that month, the former national chairman of the Proud Boys, Enrique Tarrio, was sentenced to 22 years in prison – the longest sentence to date. 

The Justice Department’s press release stated that Tarrio created a special chapter of the Proud Boys known as the “Ministry of Self-Defense,” which conspired to oppose by force the authority of the government of the United States. As the events unfolded on the day of the assault, Tarrio was monitoring from afar, posting encouraging messages to his tens of thousands of social media followers, writing “make no mistake...we did this”.  

Crossing the line: From hateful thoughts and speech to violent action 

Not only are these court sentences the most severe among the more than 350 individuals charged for the January 6 Capitol breach, they are also the first to fit the legal definition of terrorism. During the trial against Tarrio, Assistant US Attorney Conor Mulroe stated: “His leadership over the Proud Boys was about violence and manipulation... He demonized his perceived adversaries. He glorified the use of force against them. He elevated the street fighting element in his group — the so-called rally boys — and he practiced and endorsed the use of disinformation, deceiving the public and cultivating fear.” 

These high-profile cases represent only a fraction of the consequences of the political violence that occurred on January 6, as well as in the days and weeks leading up to it. Lest we forget, the incident itself resulted in more than $30 million in damages, injuries to at least 140 police officers, and five deaths. It is also imperative to acknowledge the enduring repercussions of such an assault on the foundational principles of American democracy, and the images of a defaced Capitol projected to the world. 

Extremism is still active and intensifying 

Despite these legal measures, far-right extremism remains active in North America, and the resilience of these hate groups underscores the challenge of eradicating their influence. 

Among the most resolute is the Proud Boys, believed by their own members to serve as the “foot soldiers for the right.” The group has since pivoted from denying former President Trump’s loss in the 2020 election toward cultural issues, such as anti-LGBTQ activism. 

In fact, the number of recorded incidents of political violence or protests staged by the group has only increased. According to the Armed Conflict Location & Event Data Project (ACLED), far-right militias and militant social movements have more than tripled their engagement in anti-LGBTQ demonstrations compared to 2021. Some of these protests included the use of aggressive intimidation tactics and violence against civilians, and on occasion required intervention by authorities. Due to their proclivity for violence, the Canadian government formally designated the Proud Boys, the Three Percenters, and The Base as “terrorist entities” in 2021. Similar designations have not been assigned to these groups in the US. 

How do hate groups remain active despite federal prosecutions? 

Contemporary far-right extremist groups like the Proud Boys maintain a decentralized power structure, giving their local chapters autonomy to have more impact at the local level. This partially explains why the group has remained resilient despite the convictions of leaders such as former chairman Enrique Tarrio and the mounting lawsuits against them. 

Social media 

Their ability to remain operational can often be traced back to the groups’ online behavior. Proud Boys members latch on to politicized social issues, such as LGBTQ rights or “Drag Queen Story Hour”, and produce jokes, memes, and trolling campaigns on various online platforms. They gain traction and communicate through internet ecosystems like Telegram and alternative platforms such as Gab, 4chan, and Parler. 

Moreover, the dissemination of conspiracy theories or extremist ideologies on encrypted social media platforms serves as a tool for recruitment and propaganda circulation. This is not so different from strategies used by other terrorist groups, such as how the Islamic State or al-Qaeda engage with users over social media to bolster their profile and recruit from abroad.  

Indeed, there has been increasing ideological convergence between North American extremist groups and designated foreign terror organizations documented on social media. The most recent example is the revival of old tropes about Jewish power, which went viral across social media platforms following the Hamas-perpetrated terror attacks on October 7 – a major contributor to the drastic spike in antisemitism across the globe.

Let’s double-click on that for a minute and explain how... 

According to research conducted by Memetica, a leading digital investigations firm, Hamas has proven its ability to hijack the feedback loop between real-world events and online discourse by weaponizing the platform dynamics of major social media companies, allowing it to dominate discussions of events in real time. 

Much of the online chatter revolved around the al-Aqsa Flood campaign (the given name for the Hamas operation) as well as anti-Israel conspiracies. Shortly after the attack, pro-Hamas content began circulating on virtually unmoderated message boards frequented by far-right communities around the world.  

Memetica’s research uncovered that many of the extremist groups linked to the Jan. 6 insurrection view Hamas’ actions as aspirational. They shared clips of attacks against Israeli civilians and military personnel as evidence of how to successfully implement guerrilla warfare tactics against a more powerful opponent. 

This convergence between domestic extremists and foreign terror sympathizers was then quickly harnessed by state-sponsored media outlets in Iran, China, and Russia to promote antisemitic and anti-Israel content among Western audiences, which was almost immediately reshared on popular social media platforms. Many of the media outlets sharing this content, for example, have been found to be deeply involved in Russia’s broader disinformation and propaganda campaign – which saw similar use in its war with Ukraine. 

Gaming platforms 

Extremist members from across the political and religious spectrum also regularly reach users on gaming spaces to radicalize and recruit young followers. It’s a well-known reality that multiplayer games and gaming platforms, such as Discord or Twitch, are fertile grounds for extremism. According to the UN Counter-Terrorism Centre report on Examining the Intersection Between Gaming and Violent Extremism, “Considering the number of users in gaming spaces and the appeal of gaming-related content, it is unsurprising that a range of violent extremist ideologies have appeared on these platforms and extremists are seeking to exploit the popularity and attractiveness of gaming spaces for their own ends.” 

In January 2023, NYU commissioned a representative survey of multiplayer gamers in five of the top video-game markets globally. It found that 51% of gamers had come across some form of extremist statement or narrative while playing multiplayer games in the past year. What’s more, players under the age of 18 are more likely to encounter statements promoting white supremacy, genocide or political violence.  

What about AI? 

Not only is there a problem of extremist discourse taking place on social media or gaming platforms, but there is also the possibility of content manipulation within those platforms. Some extremist groups are known to make their own video games depicting harmful content and others use modding to alter the narrative of mainstream games to fit their own ideology. In some instances, researchers found that this method has been effective at inspiring extremists to perpetrate real-world attacks.  

It should be noted that creating and distributing these forms of extremist propaganda at scale has largely required semi-trained personnel. However, the emergence of generative AI models may allow radicalized and extremist actors to produce a significant amount of propaganda that is more sophisticated and takes less effort. What is certain is that the ability of homegrown radicalized groups to reach new audiences will continue to evolve. 

So, what does that mean for the future?   

Children and adolescents are spending more time online than ever before. There are unprecedented opportunities for younger generations to benefit from increased connectivity, learning, socialization, and more. What it also means is there are endless opportunities for highly impressionable youths to be exposed to hate speech, violent content, disinformation, and propaganda across the internet landscape. This is especially true for social media, encrypted chat and gaming platforms where moderators face significant challenges in preventing harmful content from reaching the screens of younger audiences. 

Before midterm elections in 2022, threats against members of Congress increased, local school board races became fraught with threats, and Paul Pelosi, the husband of former Speaker of the House of Representatives Nancy Pelosi, was attacked in his home. While the midterm elections did not witness nearly the same violence that followed former President Trump’s loss in 2020, researchers suggest that his bid for candidacy in 2024 could inspire radicalization among his supporters – especially if he loses. 

Extremism poses a significant, widespread threat 

Looking back at the events surrounding the January 6 insurrection, the world was witness to the impact of mass radicalization across the American populace, fueled by fringe narratives that were found in major news and social media platforms frequented by the young and old alike. In 2020, the Department of Homeland Security concluded that “racially and ethnically motivated violent extremists—specifically white supremacist extremists (WSEs)—will remain the most persistent and lethal threat in the Homeland”.  

The increased rate at which far-right extremist groups spread racist or hateful messages thus indicates that upcoming national elections will witness a spike in political violence. But an important question remains: how do these extremist groups sustain their operations financially? That is, how do they generate funds to cover legal fees, pay for supplies, and stage politically fueled protests? 

The abuse of payments channels for terrorist funding 

Key findings from the Global Disinformation Index’s (GDI) report on online funding strategies of American hate groups show that these groups often have diverse means of acquiring funds. Payments platforms and financial institutions have been aware of how hate groups abuse financial technology to facilitate organizational funding for years.  

[Chart: Use of funding methods by hate group type. Source: Global Disinformation Index] 

The GDI report notes that, while there are often policies in place prohibiting their use by extremist and violent organizations, 83% of identified platforms processing payments were used by hate groups to raise funds. This includes extremist organizations known to stage violent protests, such as the Proud Boys and Oath Keepers – which fall under the “militia or street protest” grouping in the above chart. 

Furthermore, 44% of hate groups examined in the study were found to be registered as charity or non-profit organizations in the US – an effort to add legitimacy to their activities. One aftereffect of the Jan. 6 prosecutions has been tens of thousands of dollars donated to convicted individuals to offset their legal fees – despite many of those charged having received government-funded legal representation. 

Following the events surrounding January 6, players in the payments ecosystem quickly realized that the risks associated with processing transactions for sites that may spread racist or hateful content drastically increased.  

  • Law enforcement enhanced their focus on uncovering and disrupting the activity of extremist groups, as well as their supporters.  
  • Major card schemes and payment providers halted political donations and payment processing capabilities of extremist sites selling merchandise, appeals for support (donations), and subscription services of independent content creators.  
  • It should be noted that white nationalist and militia groups, such as the Proud Boys and Oath Keepers, were found to prioritize raising funds through the sale of merchandise (see GDI chart).  
  • And many far-right groups were found to have been raising and moving money by employing various forms of transaction laundering.  
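Transaction laundering of this kind typically means routing payments for illicit sales or donations through a seemingly benign front site, so the processor sees a different business than the one actually transacting. A minimal sketch of one common detection heuristic, checking a merchant's declared business category against the content actually observed on its storefront, is shown below; every category, keyword, and threshold here is an illustrative assumption, not any specific provider's method:

```python
# Hypothetical sketch: flag merchants whose declared business category
# shows no supporting evidence on their storefront pages.
# Categories, keywords, and the threshold are illustrative placeholders.

DECLARED_CATEGORY_KEYWORDS = {
    "apparel": {"shirt", "hoodie", "hat", "merch"},
    "books": {"book", "author", "paperback"},
    "charity": {"donate", "donation", "nonprofit"},
}

def category_mismatch(declared: str, page_text: str, min_hits: int = 1) -> bool:
    """Return True when the page text contains fewer than min_hits
    keywords expected for the merchant's declared category."""
    expected = DECLARED_CATEGORY_KEYWORDS.get(declared, set())
    words = set(page_text.lower().split())
    return len(expected & words) < min_hits

# A merchant registered as a book seller whose site only solicits
# donations would be routed to manual review.
flagged = category_mismatch("books", "donate now to support our legal defense fund")
```

In practice this kind of signal is only a first-pass filter; real monitoring layers it with traffic analysis and human investigation.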

Beyond this, many sites offered information, propaganda, and a platform for like-minded individuals to organize without offering anything for purchase. 

The role of the payments industry in disruption of terrorist funding 

Law enforcement officials will, of course, remain vigilant so as not to be caught off guard as they were on January 6, making it more difficult for far-right groups to organize or incite violence. 

A crucial component will be how the payments industry prevents domestic extremist groups from earning funds to finance their activities. 

 Industry players can do this by: 

  • Employing effective tools to monitor and block far-right extremist activity across online platforms, which will prevent these groups from acquiring resources to commit public harm 
  • Sharing intel on trends, best practices, and changing tactics throughout the industry and in collaboration with law enforcement 
  • Using our skills -- digital literacy, technology, and web intelligence -- to find new and innovative ways to fight illicit activity online 
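As one illustration of the monitoring step above, consider a simple keyword-based risk scorer over merchant site text. This is a deliberately minimal sketch: production systems rely on curated watchlists, machine-learning classifiers, and human review, and every term and weight below is a hypothetical placeholder rather than a real watchlist:

```python
# Hypothetical sketch of keyword-based risk scoring for merchant sites.
# Terms and weights are illustrative placeholders only.

RISK_TERMS = {
    "militia": 3,
    "street fighting": 4,
    "legal defense fund": 2,
}

def risk_score(page_text: str) -> int:
    """Sum the weights of watchlist terms found in the page text."""
    text = page_text.lower()
    return sum(weight for term, weight in RISK_TERMS.items() if term in text)

def needs_review(page_text: str, threshold: int = 4) -> bool:
    """Flag pages whose cumulative score meets the review threshold."""
    return risk_score(page_text) >= threshold
```

A page like "support our militia legal defense fund" would score above the threshold and be queued for an analyst, while an unrelated storefront would score zero; the intelligence-sharing step then matters because terms and tactics shift faster than any single firm's watchlist.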

This is especially true as the 2024 election quickly approaches. Contemporary far-right groups are unifying and mobilizing against politicized social issues, promoting false rhetoric about LGBTQ grooming of children or a plethora of other far-right conspiracy theories. They’ve even gone as far as creating fake websites targeting the LGBTQ community, selling counterfeit Hormone Replacement Therapy (HRT) pills, which can cause health problems. Moreover, these groups are often well armed, stage aggressive and potentially violent protests, and insist that their operations align with the conservative political agenda of the GOP. 

A strong collaborative effort between public and private entities is essential to combat the proliferation of these groups within the online arena. 

Extremist groups will continue to use online platforms to spread hate and harm in real life. They will, at times, succeed. But we can use the weapons at our disposal to disrupt their attempts.

Working together to disrupt their communication and put a chokehold on their funding, we can take them down on a larger scale. 

Extremist groups abuse payment rails to obtain the money they need. Those of us in the payments industry have an opportunity – and a responsibility – to stop it.