Alt-right pipeline
The alt-right pipeline (also called the alt-right rabbit hole) is a conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups.[1][2] This process is most commonly associated with, and has been documented on, the video platform YouTube. It is largely driven by the way recommendation algorithms on social media platforms function: by recommending content similar to what users already engage with, they can quickly lead users down rabbit holes.[2][3]

Many political movements have been associated with the pipeline concept. The intellectual dark web,[2] libertarianism,[4] the men's rights movement,[5] and the alt-lite movement[2] have all been identified as possibly introducing audiences to alt-right ideas. Audiences that seek out and are willing to accept extreme content in this fashion typically consist of young men, commonly those who experience significant loneliness and seek belonging or meaning.[6] Message boards rife with hard-right social commentary, such as 4chan and 8chan, have been well documented as important to this radicalization process for those seeking community and belonging.[7]
The alt-right pipeline may be a contributing factor to domestic terrorism.[8][9] Many social media platforms have acknowledged this path of radicalization and have taken measures to prevent it, including the removal of extremist figures and rules against hate speech and misinformation.[10][6] Content creators affiliated with BreadTube have attempted to counter the alt-right pipeline with left-wing content.[11]
Process
Use of the internet allows individuals with heterodox beliefs to alter their environment, which in turn has transformative effects on the user. Influence from external sources such as the internet can be gradual so that the individual is not immediately aware of their changing understanding or surroundings. Members of the alt-right refer to this radicalization process as "taking the red pill" in reference to the method of immediately achieving greater awareness in The Matrix. This is in contrast to the gradual nature of radicalization described by the alt-right pipeline.[9][12]
Many on the far-right recognize the potential of this radicalization method and actively share right-wing content with the intention of gradually radicalizing those around them. A common method of drawing new audiences into far-right circles is the use of humor and memes, which make right-wing rhetoric and ideology more palatable and acceptable to newer audiences. The nature of internet memes means they can easily be recreated and spread to many different internet communities. Examples of this can be seen in the cartoon character Pepe the Frog and internet personalities such as Steven Crowder.[12][13]
YouTube has been identified as a major element in the alt-right pipeline. This is facilitated through an "Alternative Influence Network", in which various right-wing scholars, pundits, and internet personalities interact with one another to boost performance of their content. These figures vary in their ideologies, ranging from conservatism and libertarianism to white nationalism, but they share a common opposition to feminism, progressivism, and social justice that allows viewers of one figure to quickly acclimate to another.[1] They often prioritize right-wing social issues over right-wing economic issues, with little discussion of fiscal conservatism. Some individuals in this network may not interact with one another, but a collection of interviews, internet debates, and other interactions create pathways for users to be introduced to new content.[2]
YouTube's recommendation algorithm is designed to quickly and easily surface content similar to what a user already watches or has expressed interest in. This lets users dive deeply into topics that interest them without manually searching for content. However, the same seamless experience allows newer audiences to be exposed to extreme content and allows already radicalized individuals to reconfirm their biases. Videos that promote misinformation and conspiracy theories also commonly gain traction in YouTube's algorithm, because such alarmist content regularly garners millions of views and continues to go viral.[9][6]
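A minimal sketch of the kind of content-similarity recommendation described above is shown below. It is purely illustrative: the video names, embedding vectors, and scoring function are invented for the example, and YouTube's actual system is proprietary and far more complex.

```python
# A minimal, hypothetical content-similarity recommender; illustrative only and
# not YouTube's actual system. Video names and embedding values are invented.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(history: list, catalog: dict, k: int = 2) -> list:
    """Rank catalog videos by similarity to the average of the user's watch history."""
    profile = np.mean(history, axis=0)  # crude "taste" vector built from watched videos
    scores = {name: cosine_similarity(profile, vec) for name, vec in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy topic embeddings (hypothetical): each dimension is a made-up topic weight.
catalog = {
    "game_review":    np.array([0.9, 0.1, 0.0]),
    "anti_sjw_rant":  np.array([0.5, 0.8, 0.1]),
    "extremist_clip": np.array([0.1, 0.8, 0.7]),
}
history = [np.array([0.8, 0.3, 0.0])]   # viewer starts with mild, adjacent content
print(recommend(history, catalog))      # the most similar items surface first
```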
When a user is exposed to content featuring particular political or culture-war issues, this recommendation system may lead them to different ideas or issues, including Islamophobia, opposition to immigration, antifeminism, or reproduction rates.[9][14] Recommended content is often somewhat related, which creates an effect of gradual radicalization between multiple issues, referred to as a pipeline. However, the platform has also been documented on several occasions recommending content that is entirely unrelated to what a user typically engages with.[3][14] Radicalization also takes place in interactions with other radicalized users online, on varied platforms such as Gab, Reddit, 4chan, or Discord.[9] Major personalities in this chain often have a presence on Facebook and Twitter, though YouTube is typically their primary platform for messaging and earning income.[6] The sheer scale of the internet, as well as the many entry points into the pipeline, adds a level of complexity to the radicalization process that often makes it difficult to target and prevent.
Algorithms, regardless of the social media platform they serve, play a large role in altering an individual's internet experience as they begin consuming more far-right content, because they continually funnel similar content to users. This algorithmic filtering can create echo chambers, which can make the de-programming process extremely difficult for individuals deeply entrenched in this content sphere and community. Research into algorithms and social media is often difficult, however, because companies are not transparent with third parties about their algorithms and data; while the effects have been replicated in some studies,[2] the extent of algorithmic bias remains unclear.[10][15]
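The echo-chamber effect of such filtering can be illustrated with a toy simulation, sketched below under the assumption that a recommender only surfaces items close to a user's current preferences and that those preferences drift toward whatever is consumed; it is not based on any platform's real recommendation system.

```python
# Toy model of algorithmic filtering producing an "echo chamber": only items near the
# user's current position are ever recommended, so exposure narrows to a small subset.
# Purely illustrative; all quantities are invented for the example.
import random

random.seed(0)
items = [i / 20 for i in range(21)]   # content spread along a 0..1 ideological axis
position, seen = 0.55, set()          # the user starts near the middle

for _ in range(200):
    nearby = [x for x in items if abs(x - position) <= 0.15]  # filter: only "similar" content
    choice = random.choice(nearby)                            # user watches one recommendation
    seen.add(choice)
    position = 0.8 * position + 0.2 * choice                  # preferences drift toward consumed content

print(f"{len(seen)} of {len(items)} items were ever recommended")  # exposure stays in a narrow band
```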
Content
The alt-right pipeline has been found to begin with the intellectual dark web community, which is made up of internet personalities that are unified by an opposition to identity politics and political correctness, such as Joe Rogan, Ben Shapiro, Dave Rubin, and Jordan Peterson. The intellectual dark web community overlaps and interacts with the alt-lite community, such as Steven Crowder, Paul Joseph Watson, Mark Dice, and Sargon of Akkad. This community in turn overlaps and interacts with the alt-right community, such as James Allsup, Black Pigeon Speaks, Varg Vikernes, and Red Ice.[2] The most extreme endpoint often involves fascism or belief in an international Jewish conspiracy,[12] though the severity of extremism can vary between individuals.[6]
The antifeminist Manosphere has been identified as another early point in the alt-right pipeline.[5] The men's rights movement often discusses men's issues more visibly than other groups, attracting young men with interest in such issues when no alternative is made available. Many right-wing internet personalities have developed a method to expand their audiences by commenting on popular media; videos that criticize movies or video games for supporting left-wing ideas are more likely to attract fans of the respective franchises.[6]
When a person clicks on a YouTube video at the edge of the alt-right pipeline, they may not realize the political nature of the video. Videos at the edge of the pipeline express views that are consistent with a more agreeable conservative viewpoint, such as an anti-trans or capitalist viewpoint.[16] This can lead the viewer into an algorithmic rabbit hole that becomes progressively more fringe, until someone who was initially interested in conservative content is consuming content that is highly radical, extremist, and often spreading fascist ideology.[17]
The online alt-right spreads an ideology that is very similar to earlier white supremacist and fascist movements. The internet packages the ideology differently and, in many cases, makes it more palatable, and is thus more successful in delivering it to a larger number of people. The end goal of alt-right movements in America and in Europe is to create a white ethnostate by expelling all non-Europeans from whatever country white supremacists claim belongs to them.[17]
The format presented by YouTube has allowed various ideologies to access new audiences through this means.[6] The same process has also been used to facilitate access to anti-capitalist politics through the internet community BreadTube. This community was developed through the use of this pipeline process to introduce users to left-wing content and mitigate exposure to right-wing content,[6][11] though the pipeline process has been found to be less effective for left-wing politics due to the larger variety of opposing left-wing groups that limits interaction and overlap.[11] This dichotomy can also cause a "whiplash polarization" in which individuals are converted between far-right and far-left politics.[6]
Psychological factors
The psychological factors of radicalization through the alt-right pipeline are similar to other forms of radicalization, including normalization, acclimation, and dehumanization. Normalization involves the trivialization of racist and antisemitic rhetoric. Individuals early in the alt-right pipeline will not willingly embrace such rhetoric, but will adopt it under the guise of dark humor, causing it to be less shocking over time. This may sometimes be engineered intentionally by members of the alt-right to make their beliefs more palatable and provide plausible deniability for extreme beliefs. Acclimation is the process of being conditioned to seeing bigoted content. By acclimating to controversial content, individuals become more open to slightly more extreme content. Over time, conservative figures appear too moderate and users seek out more extreme voices. Dehumanization is the final step of the alt-right pipeline, where minorities are seen as lesser or undeserving of life and dehumanizing language is used to refer to people that disagree with far-right beliefs.[9]
The process is associated with young men who experience loneliness, meaninglessness, or a lack of belonging.[6] Because many young men experience loneliness and a lack of meaningful attachments, such individuals are likely to bond over their shared feelings, which reinforces the sense of community in right-wing internet spheres.[7] An openness to unpopular views is necessary for individuals to accept beliefs associated with the alt-right pipeline. It has been associated with contrarianism, in which an individual uses the working assumption that the worldviews of most people are entirely wrong. From this assumption, individuals are more inclined to adopt beliefs that are unpopular or fringe. This makes several entry points of the alt-right pipeline effective, such as libertarianism, whose adherents often have traits that make them susceptible to radicalization when exposed to other fringe ideas.[4] Motivation for pursuing these communities varies, with some people finding them by chance while others seek them out. Interest in video games is associated with the early stages of the alt-right pipeline.[6]
One of the most effective methods of radicalization within the alt-right pipeline is the way pundits portray freedom of speech and expression. The belief is that when individuals express socially conservative or bigoted viewpoints and experience criticism or blowback, their freedom of speech is being infringed upon. This is often described as being "cancelled" or censored by opponents, and fighting back against these perceived attacks is strongly encouraged by prominent right-wing figures on the internet. Because criticism and discourse are framed as personal attacks, individuals become highly defensive of their beliefs, and it can be difficult to engage them in rational debate or to challenge and change those beliefs.
Because the alt-right is socially and politically conservative, much of the ideology it spreads is associated with the preservation of traditional values and ways of living. The idea that there may be a secret adversary focused on destroying their way of life is a powerful one, and it primes many people to start believing in increasingly absurd and hateful conspiracy theories.[16]
The Illuminati, the New World Order, QAnon, race realism, the Great Replacement, and the "Jewish question" are all popular conspiracy theories among the alt-right, some of which have reached the mainstream because of their spread on Facebook and YouTube. Conspiratorial thinking has become a way to prime an audience to believe almost anything. It also makes objective truth impossible to find, because adherents come to believe that any facts or statistics from mainstream science or news media are lies spread by whatever force is trying to control the narrative.[17] The notion that information is being obstructed by a group pulling the strings underlies many of the most popular ideas among the alt-right. Race realism, the belief that different races have inherent biological and intellectual differences, is sold as credible when the audience is primed to believe that the science saying otherwise is created by a group of people (usually Jewish people or "cultural Marxists") trying to control the narrative and lie to people.[16]
This type of conspiratorial thinking spread online has become popular and mainstream, even reaching former president Donald Trump, who spread conspiracy theories such as the claims that the coronavirus was a Chinese hoax and that vaccines cause autism. Conspiracy theories like these can start off mundane but quickly become a rationale for hateful ideologies. The alt-right uses conspiracy and misinformation to excuse bigotry and as a way to radicalize others.[18]
Along with algorithms, online communities can also play a large part in radicalization. People with fringe and radical ideologies can meet others who share, validate, and reinforce those ideologies. Because people can control who and what they engage with online, they may never encounter any opinion or idea that conflicts with what they already believe. Echo chambers uphold, reinforce, and build upon radical beliefs, and the strong sense of community and belonging they provide is a large contributing factor in people joining the alt-right and staying in it.[19]
Concerns and prevention
Internet radicalization correlates with an increase in lone wolf attacks and domestic terrorism.[8][20] The alt-right pipeline has been associated with the Christchurch mosque shootings, in which a far-right extremist killed 51 Muslim worshippers and directly credited the internet for the formation of his beliefs in his manifesto.[9][21] Many social media platforms have recognized the potential for radicalization and have implemented measures to limit its prevalence. High-profile extremist commentators such as Alex Jones have been banned from several platforms, and platforms often have rules against hate speech and misinformation.[6]
In 2019, YouTube announced a change to its recommendation algorithm to reduce conspiracy theory related content.[6][14] Unconvinced by the platform's promises to change, the Mozilla Foundation launched a crowd-sourced research study into YouTube's recommendation software.[3] In 2020 it released the browser extension RegretsReporter, designed to let users report videos and content they regretted watching. Between July 2020 and May 2021, the foundation collected reports on YouTube's algorithm from volunteers in 190 countries.[3] The topics of the reported videos were wide-ranging, but 71% of the reports concerned videos that had been recommended by the algorithm, and recommended videos were 40% more likely to be reported as a regret than videos users had searched for.[3]
Some extreme content, such as explicit depictions of violence, is typically removed from most social media platforms. On YouTube, content that expresses support for extremism may have monetization features removed, may be flagged for review, or may have public user comments disabled.[10]
Violence committed by the alt-right has risen over the past decades, while the number of organized hate groups has actually decreased. Online radicalization has had a hand in increasing hate crimes and mass shootings, but because the majority of radicalized individuals have not yet acted on their violent beliefs and have not formally joined any documented hate group, they are hard for outsiders to categorize and understand.[22] This is what makes alt-right violence so unpredictable and hard to prevent: people who are radicalized online often show no outward signs of their extremist beliefs while offline, and even when they are open about their beliefs, there is no way to tell when or whether they may become violent because of them.[19]
Because the beliefs of radicalized individuals are not monolithic, and because such individuals are virtually invisible outside the internet until they decide to act on their violent ideologies, there is little way to tell who is part of the online white supremacist, neo-Nazi, or incel movements. The movements themselves are virtually leaderless, with no cohesive plans or patterns in committing violence; attacks and acts of terror occur randomly and in isolation from any known groups.[23] Because of this, many view these acts of violence as freak occurrences with no ties to any real political agenda, when in reality they are the consequence of a popular and growing ideology that actively justifies and rewards violence.[22]
The online alt-right has been dangerous in many ways beyond inciting and producing lone wolf violence. Trolling has become very popular in the online alt-right and is often its most organized activity. Massive online harassment campaigns that include death threats and doxxing have become prevalent among the alt-right.[22] This extremist trolling can be very dangerous because doxxing can incite violence toward the person whose information was leaked, and there have been instances of trolls making anonymous false reports to police, causing SWAT teams to raid the homes of targeted people.
References
- Lewis, Rebecca (2018-09-18). Alternative Influence: Broadcasting the Reactionary Right on YouTube (Report). Data & Society.
- Ribeiro, Manoel Horta; Ottoni, Raphael; West, Robert; Almeida, Virgílio A. F.; Meira, Wagner (2020-01-27). "Auditing Radicalization Pathways on YouTube". FAT* '20: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency: 131–141. doi:10.1145/3351095.3372879. ISBN 9781450369367. S2CID 201316434.
- Mozilla (7 July 2021). "Mozilla Investigation: YouTube Algorithm Recommends Videos that Violate the Platform's Very Own Policies". Mozilla Foundation. Retrieved 25 March 2023.
- Hermansson, Patrik; Lawrence, David; Mulhall, Joe; Murdoch, Simon (2020-01-31). The International Alt-Right: Fascism for the 21st Century?. Routledge. pp. 57–58. ISBN 978-0-429-62709-5.
- Mamié, Robin; Horta Ribeiro, Manoel; West, Robert (2021-06-21). "Are Anti-Feminist Communities Gateways to the Far Right? Evidence from Reddit and YouTube". 13th ACM Web Science Conference 2021. WebSci '21. New York, NY, USA: Association for Computing Machinery: 139–147. arXiv:2102.12837. doi:10.1145/3447535.3462504. ISBN 978-1-4503-8330-1. S2CID 232045966.
- Roose, Kevin (2019-06-08). "The Making of a YouTube Radical". The New York Times. ISSN 0362-4331. Retrieved 2022-10-26.
- Hughes, Terwyn (26 January 2021). "Canada's alt-right pipeline". The Pigeon. Retrieved 25 March 2023.
- Piazza, James A. (2022-01-02). "Fake news: the effects of social media disinformation on domestic terrorism". Dynamics of Asymmetric Conflict. 15 (1): 55–77. doi:10.1080/17467586.2021.1895263. ISSN 1746-7586. S2CID 233679934.
- Munn, Luke (2019-06-01). "Alt-right pipeline: Individual journeys to extremism online". First Monday. doi:10.5210/fm.v24i6.10108. ISSN 1396-0466. S2CID 184483249.
- Ledwich, Mark; Zaitsev, Anna (2020-02-26). "Algorithmic extremism: Examining YouTube's rabbit hole of radicalization". First Monday. arXiv:1912.11211. doi:10.5210/fm.v25i3.10419. ISSN 1396-0466. S2CID 209460683.
- Cotter, Kelley (2022-03-18). "Practical knowledge of algorithms: The case of BreadTube". New Media & Society. doi:10.1177/14614448221081802. ISSN 1461-4448. S2CID 247560346.
- Evans, Robert (2018-10-11). "From Memes to Infowars: How 75 Fascist Activists Were "Red-Pilled"". Bellingcat. Retrieved 2022-10-27.
- Wilson, Jason (2017-05-23). "Hiding in plain sight: how the 'alt-right' is weaponizing irony to spread fascism". The Guardian. Retrieved 2022-10-28.
- Bennhold, Katrin; Fisher, Max (7 September 2018). "As Germans Seek News, YouTube Delivers Far-Right Tirades". The New York Times.
- Munger, Kevin; Phillips, Joseph (2020). "Right-Wing YouTube: A Supply and Demand Perspective". The International Journal of Press/Politics. 27 (1): 186–219. doi:10.1177/1940161220964767. ISSN 1940-1612. S2CID 226339609.
- Wilson, Andrew (2018-02-16). "#whitegenocide, the Alt-right and Conspiracy Theory: How Secrecy and Suspicion Contributed to the Mainstreaming of Hate". Secrecy and Society. 1 (2). doi:10.31979/2377-6188.2018.010201. ISSN 2377-6188.
- Daniels, Jessie (2018). "The Algorithmic Rise of the "Alt-Right"". Contexts. 17 (1): 60–65. doi:10.1177/1536504218766547. ISSN 1536-5042. S2CID 196005328.
- "Conspiracy Propagandists". Southern Poverty Law Center. Retrieved 2023-05-07.
- Alfano, Mark; Carter, J. Adam; Cheong, Marc (2018). "Technological Seduction and Self-Radicalization". Journal of the American Philosophical Association. 4 (3): 298–322. doi:10.1017/apa.2018.27. ISSN 2053-4477. S2CID 150119516.
- Hunter, Lance Y.; Griffith, Candace E.; Warren, Thomas (2020-05-03). "Internet connectivity and domestic terrorism in democracies". International Journal of Sociology. 50 (3): 201–219. doi:10.1080/00207659.2020.1757297. ISSN 0020-7659. S2CID 219059064.
- Veilleux-Lepage, Yannick; Daymon, Chelsea; Amarasingam, Amarnath (2020). The Christchurch attack report: key takeaways on tarrant's radicalization and attack planning (PDF) (Report). International Centre for Counter-Terrorism.
- "Alt-right pipeline: Individual journeys to extremism online". firstmonday.org. Retrieved 2023-05-07.
- Mølmen, Guri Nordtorp; Ravndal, Jacob Aasland (2021-10-30). "Mechanisms of online radicalisation: how the internet affects the radicalisation of extreme-right lone actor terrorists". Behavioral Sciences of Terrorism and Political Aggression: 1–25. doi:10.1080/19434472.2021.1993302. ISSN 1943-4472. S2CID 240343107.