One month after controversial adult-content purge, far-right pages are thriving on Tumblr
This subtle far-right creep echoes a 2017 study by the Institute for Strategic Dialogue, which warned that the far-right had become extremely adept at using internet platforms to normalize their ideas. “The weaponisation of internet culture is deliberately used by extreme-right influencers to bring about attitude and behavioural change, in particular among the younger generations,” the report warned.
This sort of normalization of white nationalist talking points was what tech companies were supposedly shocked by – and promised to stop – in the wake of Charlottesville, since such talking points provide an easy way of “red-pilling,” or radicalizing and recruiting, new members, most of whom are young, white, disaffected men.
White Supremacist Propaganda
The intent is to convince racist white people (who don’t think of themselves as racist, but who clearly are, and clearly feel angry when their entitlement isn’t immediately gratified) that the hate group in question is just ‘misunderstood’ and is really about pride and celebrating your own culture, etc.
The intent is that once someone falls for that bait and hook, they can play up on their underlying resentment and entitlement. If you already believe that you should be able to celebrate being white, and they can bring you from that belief to the belief that people of color are preventing you from your right to have pride in that, then they can foster anger against people of color. From there, any time there is a collective societal reaction of disgust towards the hate group or towards the notion of white pride, the recruited whites can be relied upon to feel victimized by society collectively.
Both of these essays accurately reflect my social media experience with mainstream white supremacy and white supremacist messaging, which is worded and coded in such a way that it seems plausible that a decent, reasonable person would agree with it if they didn’t know where it was coming from. “Loving your heritage doesn’t mean being a racist” is representative of this type of entry-level messaging, which is intended to target people who feel socially alienated and are searching for positivity and affirmation.
At the beginning of 2019, I expressed concern about people on Twitter getting upset over chunky otters and Marie Kondo. I understood why people were getting upset, of course, and there were some important discussions on the subject of Marie Kondo in particular. Still, there needs to be a serious and public conversation about covert white supremacist messaging, and I’m not sure that we took advantage of the opportunity to have it.
To give an example of what I mean by “covert white supremacist messaging,” back in 2016 or so I followed a few people who occasionally reblogged lovely nature photography. Tumblr’s algorithm then began to recommend all manner of weird gender essentialist and white supremacist posts. What I eventually figured out was that the nature photography was of scenery in Germany specifically, and that the blogs posting it had tagged these posts as “featherwood,” a term that may have once been associated with female prison gangs but has since spread to people who have embraced a Quiverfull-style ideology concerning race and gender (namely, that it is the duty of white Christian women to have as many white Christian children as possible). As soon as I blocked the word “featherwood” on Tumblr, the problem was mostly fixed. I also had to unfollow three or four people who reblogged these posts, often alongside Steven Universe photosets and radical leftist “are the cishets okay” memes.
What I’m trying to demonstrate with this example is that there are in fact codewords and ideological patterns that are strong indicators of veiled white supremacist leanings. Because it’s entirely possible for intelligent people in progressive communities to spread alt-right memes without understanding what they are, I wish the huge public conversations about race and representation happening on social media would touch on this sort of thing.
Another example is the expression “the coastal elites,” which has been a white supremacist codeword for “the Jewish global conspiracy” since I was in college (and long before that, I’m sure). When people associated with the American left wing started talking about “coastal elites” during the lead-up to and aftermath of the 2016 presidential election, that was a huge red flag for me. There were people on Tumblr reblogging all sorts of authoritarian craziness in the name of social justice, and I had no idea how to tell them that the ideological purity they were advocating was couched in the language of hardcore white supremacy. When I tried to explain my understanding of what was going on to a few people close to me, the response was inevitably something along the lines of “well you’re a racist for not understanding that Hillary is just as evil as Trump.”
It’s 2019, and you’d think we’d have figured this mess out by now, but that’s not the case. To give yet another example, I’ve recently seen a few of my friends and professional contacts on Twitter retweet things coming from people who advocate #humanscience and #humanbiodiversity. What the people who use these hashtags are specifically referring to is “race science” (here’s an archived webcapture of a widely circulated “human biodiversity reading list” for reference), whose main guiding principle seems to be the “scientifically proven” assertion that melanin is a chemical that causes violent and antisocial behavior. The message that these people (many of whom are writers whose work has been published in respected tech journals) are advocating is that, if we accept that science tells us that climate change is real and that we need to vaccinate our children, then we must also accept it when science tells us [insert whatever racist bullshit is currently trending here].
During the past month or so, when I’ve messaged a few people whom I know personally and have been friends with for years with a gentle note of caution, the responses have been along the lines of “So you’re an antivaxxer then” or “I wouldn’t have pegged you for a climate change denier.” It’s like, “Hang on there friend, I was just trying to give you a heads-up that the person you’ve been retweeting constantly for the past week is a secret white supremacist!” Except it’s not even a secret, because all the codewords are right there in their Twitter profiles.
What I’m trying to explain with these examples is that some people are indeed “secret racists,” and the reason that most decent people don’t see them for what they are is because most of us (thankfully) don’t have much exposure to white supremacist vocabulary or alt-right online spaces. The only reason I know a tiny fraction of what’s going on is because I grew up in the rural Deep South (where people tend to feel more comfortable with being openly racist) and now spend time on gaming forums where MRA-style misogyny can often serve as a gateway to more radical belief systems. My first instinct is to block and avoid this sort of thing when I encounter it, so I’m not an expert, and I still experience the occasional unpleasant surprise when I realize that something I thought was silly and harmless is, in fact, deeply disturbing. (I actually just reblogged something that turned out to be propaganda for the Church of Scientology, and I was mortified when someone told me what it was.)
This is why I wish the conversations people engage in on Twitter and other social media platforms about covert white supremacist messaging would focus more on identifying and explaining codewords and on exposing and calling out creepy individuals. If we had more of a collective awareness that this sort of hidden messaging exists and is carefully designed to spread throughout mainstream social networks, it might be easier to identify and quarantine it. After all, the reason that cult-like belief systems are so fringe is that most people find them uncomfortable and strange and don’t want anything to do with them, but that doesn’t mean there isn’t real danger in inadvertently spreading this type of messaging.