Alt-right pipeline

The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups. This process is most commonly associated with, and has been documented on, the video platform YouTube, and is largely driven by the way recommendation algorithms on various social media platforms function: because they recommend content similar to what users already engage with, they can quickly lead users down ideological rabbit holes.
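The feedback loop described above can be illustrated with a minimal sketch of a content-based recommender. The following Python example is purely illustrative: the catalogue, feature vectors, and profile-update rule are invented assumptions, and no real platform's recommendation system is this simple. It only demonstrates the general mechanism by which "recommend what is most similar to recent engagement" can drift a feed step by step toward more extreme items.

```python
# Illustrative sketch only: a toy content-based recommender showing how
# "recommend what is similar to what the user engaged with" can narrow a
# feed step by step. Item names, feature values, and the update rule are
# hypothetical and do not represent any real platform's algorithm.
from math import sqrt

# Hypothetical catalogue: each item is a feature vector; the second
# component stands in for increasingly provocative political framing.
CATALOGUE = {
    "mainstream_news":   (1.0, 0.1),
    "anti_sjw_rant":     (0.7, 0.5),
    "antifeminist_vlog": (0.5, 0.7),
    "far_right_channel": (0.2, 0.9),
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

def recommend(profile, watched):
    """Return the unwatched item most similar to the user's profile."""
    candidates = {k: v for k, v in CATALOGUE.items() if k not in watched}
    return max(candidates, key=lambda k: cosine(profile, CATALOGUE[k]))

def update_profile(profile, item, rate=0.5):
    """Drift the profile toward whatever was just watched."""
    vec = CATALOGUE[item]
    return tuple(p + rate * (v - p) for p, v in zip(profile, vec))

profile, watched = (1.0, 0.1), ["mainstream_news"]
for _ in range(3):
    nxt = recommend(profile, watched)
    watched.append(nxt)
    profile = update_profile(profile, nxt)
    print(nxt)
# Prints: anti_sjw_rant, antifeminist_vlog, far_right_channel.
# At each step the "most similar" unwatched item is slightly more
# extreme, so the simulated feed drifts toward the far end.
```

In this toy model no single recommendation is a large jump; the drift emerges from repeatedly chaining small similarity-based steps, which is the dynamic the pipeline concept attributes to real recommendation systems.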

Many political movements have been associated with the pipeline concept. The intellectual dark web, libertarianism, the men's rights movement, and the alt-lite movement have all been identified as possibly introducing audiences to alt-right ideas. Audiences that seek out and are willing to accept extreme content in this fashion typically consist of young men, commonly those who experience significant loneliness and seek belonging or meaning. Message boards saturated with hard-right social commentary, such as 4chan and 8chan, often attract such users in their search for community and belonging, and are well documented as playing an important role in the radicalization process.

The alt-right pipeline may be a contributing factor to domestic terrorism. Many social media platforms have acknowledged this path of radicalization and have taken measures to prevent it, including the removal of extremist figures and rules against hate speech and misinformation. Left-wing movements, such as BreadTube, also oppose the alt-right pipeline and "seek to create a 'leftist pipeline' as a counterforce to the alt-right pipeline."

The effect of YouTube's algorithmic bias in radicalizing users has been replicated by one study, although two other studies found little or no evidence of such a radicalization process.
