How YouTube Helps the Alt-Right’s Recruitment
Indoctrination always begins innocently. Do you feel isolated, directionless, angry or depressed? Go to YouTube and search “depression” or “depressed.” It won’t be long until you stumble upon some lecture on self-empowerment. The video has thousands of views. The speaker seems professional and balanced. Unbeknownst to you, the lecturer you’re watching is a public figure beloved by the alt-right.
You don’t know that, or maybe you barely care. You’re at your lowest, or angriest, and you’ve finally found a community and a message with direction. YouTube suggests several more related videos for you to watch, and you’re hooked. You’re in the alt-right’s recruitment pipeline now. It’s easy.
White nationalist groups in America have infiltrated the online depression community, targeting and grooming people using the same tactics as ISIS and Al Qaeda. YouTube and Facebook are helping them.
“People have a choice,” Professor of Psychology Jordan Peterson says in the video “Advice For People With Depression,” an excerpt from one of his lectures which now has over half a million views on YouTube. “Choice number one, nothing you do means anything. Well that’s kind of a drag right? Meaninglessness of life and all that existential angst.”
The attendees laugh. He weighs the pros and cons. “The upside is…you don’t have to do anything! You’ve got no responsibility.” Peterson’s tone of voice clearly communicates his sarcasm. “Now, you have to suffer because things are meaningless, but that’s a small price to pay for being able to be completely useless!”
Peterson goes on to choice number two. “The alternative is: everything you do matters…Well, if you buy that then you can have a meaningful life. But there’s no mucking around. It means responsibility.”
Peterson seems legitimate. He’s a clinical psychologist who teaches at the University of Toronto, after all. And the content of the lecture isn’t outrageous. He talks about the value of cleaning your room, the helpfulness of antidepressants, and the value in attributing meaning to smaller parts of life.
Peterson has become a vocal and well-known presence on YouTube, where his lectures are readily available. He recently published a book, 12 Rules For Life. David Brooks of the New York Times wrote that Peterson might be “the most influential public individual in the Western world right now.”
Peterson is also beloved by the alt-right.
Upon viewing Peterson’s “Advice For People With Depression,” YouTube’s algorithm will suggest several more of his videos. Titles include “Jordan Peterson on the Meaning of Life For Men,” “Jordan Peterson-How To Stop Rotting Away at Home,” “Jordan Peterson-The Tragic Story of the Man-Child,” and “Jordan Peterson on Western Women.”
These videos are far less innocuous, and his beliefs are far clearer. Peterson rejects transgender identity and the notion of white privilege, and dismisses mainstream accounts of racial and gender inequality.
Peterson is fervently against “identity politics.” He claimed Canadian anti-discrimination laws could lead to people being arrested for misgendering someone. One of his YouTube lectures is titled, “Identity politics and the Marxist lie of white privilege.”
His arguments frame behavioral or psychological differences, as well as cultural issues, within evolutionary theory. This logic reinforces and justifies present societal assumptions using biology, ignoring the inescapable relativism of culture in favor of a sanitized and purportedly logical approach.
Peterson argues that there are intrinsic differences between men and women. He has stated that the gender pay gap is not a result of discriminatory practices, but a result of natural preferences men and women possess for jobs that appear masculine or feminine.
The idea that there are intrinsic differences between men and women beyond physiology has been frequently disproven. But the idea is appealing to his target audience: alienated men.
“The implied readers of his work are men who feel fatherless, solitary, floating in a chaotic moral vacuum, constantly outperformed and humiliated by women, haunted by pain and self-contempt,” Brooks writes in his NYT profile of Peterson.
“At some level Peterson is offering assertiveness training to men whom society is trying to turn into emasculated snowflakes,” Brooks continues. “His worldview begins with the belief that life is essentially a series of ruthless dominance competitions. The strong get the spoils and the weak become meek, defeated, unknown and unloved.”
Peterson is so successful because his audience is composed of young white men and teenagers who feel disaffected, depressed, or angry. They’re volatile, lost, and looking for answers. This makes them easy to exploit.
Peterson downplays the struggle of women, the LGBTQ+ community and people of color, while simultaneously emphasizing a narrative that focuses on the struggle of men. His videos encourage traditional gender roles and responsibilities as answers to feelings of aimlessness and depression. Men are supposed to work and fight for their families. Women are supposed to have and raise children.
It’s a narrative that’s comforting in its simplicity, but only serves to reinforce the insecurities and angers that have brought viewers to Peterson in the first place. It’s harm disguised as help.
It’s also effective. The comments posted to one of Peterson’s videos about men’s purpose in life demonstrate the influence he exerts.
User flat5sharp11 calls Peterson a wise and paternal role model for a “generation of young men brought up by single mothers, robbed of the empowerment of a father’s paternal influence by feminist idiocy.” A comment left by user 90 thanks Peterson for being a “true light in all this darkness” and commenter Angel Herrera writes, “man here, lost, but using you to find my way back. Thank you.”
As is the case with most public figures, an online community exists around Peterson and his ideas. Recruiters for alt-right movements operate in these digital circles, reaching out to people as online pals, befriending and grooming users before eventually sending them links to more extreme content.
Fans of Peterson are often directed towards Stefan Molyneux, an anarcho-capitalist known for openly promoting scientific racism and eugenics. Molyneux has stated repeatedly that black people have a lower IQ than whites, and believes that rape is a moral right. He has over 650,000 subscribers on YouTube.
Links to Molyneux’s videos and ideas are frequently posted in the Jordan Peterson sub-reddit, luring users further into extremist territory. From there, more grooming and linking is employed to draw people into the central circles of the alt-right.
The alt-right offers disillusioned men a sense of acceptance and community. Their online support groups are adept at validating men’s depression and then capitalizing on it, funneling new members of the community into a plan of action against an imagined enemy.
Al-Qaeda and other extremist groups target people in a near identical way. In a special report published by the United States Institute of Peace, “Why Youth Join al-Qaeda,” Colonel John M. Venhaus outlines the archetypal personalities of those lured into the terrorist organization. One of the most common archetypes is what Venhaus refers to as the “revenge seeker.”
“In his logic, external forces are causing his unhappiness and making it hard for him to succeed,” Venhaus writes. “He doesn’t know why he feels angry, so he is looking for something to be angry about.”
Al-Qaeda befriends the revenge-seeker, validates their frustrations and then provides them with a target for their anger. Alt-right organizations do the same with white men, directing their unhappiness and anger at imagined threats: people of color, the LGBTQ+ community, “globalists” or Jewish people.
“[Recruiters] are actively looking for these kinds of broken individuals…who they can promise identity to,” Christian Picciolini, a former neo-Nazi and co-founder of the peace organization Life After Hate, explains in an interview. “The ideology is not what drives people to this extremism, it’s…a broken search for that acceptance and that purpose and that community.”
In his TED talk, Picciolini explains that he was approached by a neo-Nazi recruiter as a young, disillusioned teenager. Picciolini was fourteen, smoking a joint in an alley, when a man twice his age came up to him, snatched the joint from his hands, and told him that the Jews and the communists wanted him to smoke marijuana to keep him docile.
“I was fourteen. I’d been trading baseball cards and watching ‘Happy Days.’ I didn’t really know what a Jew was.” Picciolini admits he didn’t even understand what the word “docile” meant. It didn’t matter. “It was as if this man in this alley had offered me a lifeline. For fourteen years I’d felt marginalized and bullied. I had low self-esteem, and frankly, I didn’t know who I was, where I belonged, or what my purpose was. I was lost.”
Picciolini was indoctrinated into the ideology and went on to become the leader of that same white supremacist organization, where he created white power music, recruited new members, stockpiled weapons and committed “acts of violence against people solely for the color of their skin, who they loved, or the god they prayed to.”
Picciolini was recruited in the 1980s. Now, thanks to social media platforms like YouTube, it’s even easier to be drawn in.
Social media actively shapes the way we perceive reality. The internet has become the primary source of information for the vast majority of Americans and much of the world, but it has a glaring, fatal flaw: almost all of the information we see is filtered through private companies.
Social media companies’ primary incentive is to get users to return to their sites. To do that, sites use algorithms that show you content you’re likely to engage with, based on what you’ve watched and searched for before. YouTube’s suggested video algorithm is central to its functioning. The company’s own engineers describe it as one of the “largest scale and most sophisticated industrial recommendation systems in existence.”
The algorithm pays attention to the videos you watch and search for and then suggests more content based on what you’ve seen. Eventually, all the videos presented to you on the YouTube homepage are specially tailored to you, calibrated to reaffirm or encourage your present beliefs and interests.
“YouTube is something that looks like reality, but it is distorted to make you spend more time online,” explains Guillaume Chaslot, an ex-employee at YouTube who worked on the website’s algorithm for three years. “The recommendation system is not optimizing for what is truthful, or balanced, or healthy for democracy.”
In fact, Chaslot argues YouTube is actively promoting radicalizing content. He designed software that simulates the behavior of a user who starts at a video and then follows a chain of recommended clips. The program operates without viewing history, ensuring that the videos being served are YouTube’s generic recommendations.
His research, published on Algotransparency.org, suggests that YouTube systematically amplifies videos that are divisive, sensational and conspiratorial.
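A toy simulation can illustrate the dynamic Chaslot describes. Everything here is invented for illustration — the “extremeness” scores, the candidate pool, and the small systematic bias are assumptions, not YouTube’s actual system. The sketch assumes only what the text claims: recommendations resemble the current video but lean slightly more sensational, and an engagement-driven ranking surfaces the most extreme candidate.

```python
import random

random.seed(0)  # deterministic run for illustration

def recommend(current_extremeness, bias=0.05, noise=0.1):
    """Hypothetical recommender: candidate videos resemble the current one,
    with a slight systematic pull toward more extreme content.
    Scores are clamped to the range [0, 1]."""
    candidates = [
        min(1.0, max(0.0, current_extremeness + bias + random.uniform(-noise, noise)))
        for _ in range(10)
    ]
    # Engagement-driven ranking: the most extreme candidate takes the top slot.
    return max(candidates)

def follow_chain(start=0.1, steps=20):
    """Simulate a fresh user with no viewing history who repeatedly clicks
    the top recommendation, as Chaslot's crawl software does."""
    score = start
    trajectory = [score]
    for _ in range(steps):
        score = recommend(score)
        trajectory.append(score)
    return trajectory

path = follow_chain()
print(f"start: {path[0]:.2f}, after {len(path) - 1} clicks: {path[-1]:.2f}")
```

Even with a tiny per-step bias, the chain drifts steadily toward the maximum: no single recommendation looks alarming, but the compounding effect is a rabbit hole.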
Why? Radicalization is profitable. Much like a fast food restaurant packs its food with salt and sugar, YouTube’s algorithm peppers viewers’ suggested feeds with content that’s edgy, attention-grabbing or incendiary. People like content that jolts them, and they’ll return to the site to see more of it. It’s addictive.
Not only do people see content that reinforces what they already believe, they’re also highly likely to see content designed to make them angry. The algorithm funnels users down isolated rabbit holes filled with inflammatory, niche content and tells them that’s the entire world. It’s mechanized indoctrination on a massive scale.
To write this article, I watched just three of Jordan Peterson’s videos. When I went to my YouTube homepage, there were over ten waiting for me. The site had also suggested others that clearly shared the same, or a more extreme, viewpoint.
How long would it take until that was all YouTube showed me? What if I were younger and didn’t know any better?
Kids today spend a huge amount of time learning and interacting with the world through the Internet. Half or more of their world exists there. It’s where they learn new things, find out what their interests are, meet people and talk to their friends. What happens when kids only see and hear increasingly extreme extensions of what they’ve seen before?
Let’s apply the operating logic of YouTube’s algorithm to Picciolini’s recruitment story. A far-right skinhead approaches a fourteen-year-old boy in an alley and tells him that the Jews are trying to keep him docile. The boy isn’t sure what the man is talking about. He’s only fourteen; he hasn’t seen much of the world, met many people, or done many things. He might walk off undecided and uncertain.
But now he’s viewed the skinhead. When he goes into the convenience store, he sees three more skinheads, and some of them come up and talk to him. Because he’s seen those skinheads, he finds ten more at the park, some of them uniformed Nazis. He turns on the TV and skinheads are delivering the news. The radio plays more and more skinhead music.
Soon, everyone he meets begins to look, talk and think like the skinhead he first saw in the alleyway. The skinhead’s ideas are the only ones the boy hears.
The boy never gets a chance to change. He is never shown a different viewpoint, only more extreme ones; everywhere he looks he sees skinheads echoing what other skinheads have already said, and anyone who tries to convince him otherwise is easily framed as the enemy. Why would these skinheads lie to him? They’re his friends! He’s known them, and only them, since he was fourteen. Through a single chance encounter, the boy now lives in the white power reality.
It is frightening to consider that this happens to someone every single day.
It is alarming to realize this is happening to all of us.
What we see online shapes how we think, but all we see online exists on the spectrum of what we’ve seen before. That means none of us are getting the larger picture. Everyone is radicalizing.
A culture fractured into radicalized and isolated splinter groups is headed for serious danger. If everyone is being led down increasingly extreme rabbit holes, there may come a time where no one can agree on any basic factual reality.
That’s an environment extremism and autocracy thrive in. It’s one that discourages empathy, limits patience and reduces our capacity to grapple with ambiguity. It encourages the types of mass violence we’ve seen play out in school shootings and hate crimes across the country. It’s one where the only fact is force.
The alt-right is using the inherently radicalizing properties of the Internet to their advantage. White power groups are preying on and indoctrinating Americans. Social media sites, with profit-driven algorithms, are complicit.
We are faced, then, with a domestic terrorism problem wrapped within a cultural crisis. If we don’t make crucial adjustments soon, the situation will deteriorate even further. How do we block the alt-right’s recruitment pipeline?
In his report on Al-Qaeda, Venhaus writes that young men join the terrorist organization because “they see no other viable choice.” He consistently recommends creating programs and initiatives to divert the frustrations and narratives that drive men to Al-Qaeda into healthier areas.
In the same way, we must create more effective, kinder infrastructure to help those who might be susceptible to alt-right messaging. Such infrastructure should be community oriented, encouraging of cooperative work and connection with others from different backgrounds, and supportive of productive and creative outlets.
But it won’t be nearly enough if the Internet isn’t restructured. The private companies that dominate the digital sphere have recognized that radicalized and hateful content is profitable, and have taken steps to encourage its spread. In doing so, they have fractured and polluted our reality, and hate is bubbling up through the cracks.
Today, hateful groups prey on alienated young people with the aid of powerful, privately owned, and artificially intelligent algorithms designed to radicalize. Mass shootings and racist hate crimes dominate the news cycle while people on the left and the right fight with each other over the legitimacy of the survivors’ message, or whether the event even occurred. Social media companies feed the flames, only caring about whether the fighting is happening on their sites. It’s heartless, chaotic, and violent. It seems as though it can’t get much worse.
But it can. It can get much, much worse.