At first, it wasn’t obvious that anything was amiss. Kids are naturally curious about the complicated world around them, so Joanna Schroeder wasn’t surprised when her 11- and 14-year-old boys recently started asking questions about timely topics such as cultural appropriation and transgender rights.
But she sensed something off about the way they framed their questions, she says — tinged with a bias that didn’t reflect their family’s progressive values. She heard one of her sons use the word “triggered” in a sarcastic, mocking tone. And there was the time she watched her son scroll through the “Explore” screen on his Instagram account and caught a glimpse of a meme depicting Adolf Hitler.
Schroeder, a writer and editor in Southern California, started paying closer attention, talking to her boys about what they’d encountered online. Then, after her kids were in bed one night last month, she opened Twitter and began to type.
“Do you have white teenage sons?” she wrote. “Listen up.”
In a series of tweets, Schroeder described the onslaught of racist, sexist and homophobic memes that had inundated her kids’ social media accounts unbidden, and the way those memes — packaged as irreverent, “edgy” humor — can indoctrinate children into the world of alt-right extremism and white supremacy.
She didn’t know whether anyone would pay attention to her warning. But by the time she awoke the next morning, her thread had gone viral; as of Sept. 16, it had been retweeted more than 81,000 times and liked more than 180,000 times. Over the following days, Schroeder’s inbox filled with messages from other parents who were deeply concerned about what their own kids were seeing and sharing online.
“It just exploded, it hit a nerve,” she says of her message. “I realized, okay, there are other people who are also seeing this.”
Over recent years, white-supremacist and alt-right groups have steadily emerged from the shadows — marching with torches through the streets of Charlottesville, clashing with counterprotesters in Portland, Ore., papering school campuses with racist fliers. In June, the Anti-Defamation League reported that white-supremacist recruitment efforts on college campuses had increased for the third straight year, with 313 cases of white-supremacist propaganda recorded between September 2018 and May 2019. This marked a 7 percent increase over the previous academic year, which saw 292 incidents of extremist propaganda, according to the ADL.
As extremist groups have grown increasingly visible in the physical world, their influence over malleable young minds in the digital realm has become a particularly urgent concern for parents. A barrage of recent reports has revealed how online platforms popular with kids (YouTube, iFunny, Instagram, Reddit and multiplayer video games, among others) are used as tools for extremists looking to recruit. Earlier this year, a viral essay in Washingtonian magazine — written by an anonymous mother who chronicled a harrowing, year-long struggle to reclaim her teenage son from the grips of alt-right extremists who had befriended him online — sparked a flurry of passionate discussions and debates among parents across social media.
Parents wanted to know: What was happening to their kids? Why was it happening, and how could it be stopped?
For extremist groups, the goal is hardly a secret; the founder and editor of the neo-Nazi website Daily Stormer has openly declared that the site targets children as young as 11.
“This is a specific strategy of white nationalists and alt-right groups,” says Lindsay Schubiner, program director at the Western States Center, a nonprofit focused on social, economic, racial and environmental justice. Schubiner co-authored a tool kit published by the center this year that offers guidance to school officials and parents who are facing white-nationalist threats in their communities.
“White-nationalist and alt-right groups use jokes and memes as a way to normalize bigotry while still maintaining plausible deniability,” Schubiner says, “and it works very well as a recruitment strategy for young people.”
Schroeder saw this firsthand when she sat down with her kids to look at their Instagram accounts together.
“I saw the memes that came across my kids’ timelines, and once I started clicking on those and seeking this material out, then it became clear what was really happening,” she says. With each tap of a finger, the memes grew darker: Sexist and racist jokes (for instance, a looping video clip of a white boy demonstrating how to “get away with saying the n-word,” or memes referring to teen girls as “thots,” an acronym for “that ho over there”) led to more racist and dehumanizing propaganda, such as infographics falsely asserting that black people are inherently violent.
“The more I clicked, the more I started to see memes about white supremacy,” Schroeder says, “and that’s what was really scary.”
That pattern of escalation is familiar to Christian Picciolini, an author and former neo-Nazi who left the movement in 1996 and now runs the Free Radicals Project, which supports others who want to leave extremist movements.
“Youth have always been critical to the growth of extremist movements, since the beginning of time. Young people are idealistic, they’re driven, they are motivated, and they’re not afraid to be vocal. So if you can fool them into a certain narrative that seems to speak to them, then that’s the growth of your movement,” he says. “And I’ve never seen an extremist movement grow as fast as I have in the last 10 years.”
Most of the people who contact Picciolini looking for help — anywhere from 10 to 30 per week, he says — are “bystanders,” people who are scared that someone they know or love is a white supremacist. And most of those bystanders are parents of teens and young adults.