The Internet Conspiracy Machine

About a month ago, a post that felt weird to me started circulating within my small circle of Tumblr mutuals. To make a long story short, there was a smart post by a popular Tumblr artist that someone had reblogged with an inflammatory addition. The inflammatory addition was from 2018, so I was curious why it had started circulating again in December 2020.

I asked my mutuals if they were reblogging the post because something specific had happened recently, but they couldn’t give me any background. It seemed that the reblog was nothing more than clickbait making the rounds while riding on the back of the original post. Tumblr being Tumblr, this happens all the time.

But this reblogged addition still felt strange to me. The user who created the reblog had deactivated their account, so I searched for their username to try to figure out who they were. I wanted to know whether the inflammatory addition was referring to something specific or whether it was just someone venting on Tumblr – which, again, is fair. I honestly didn’t expect to find anything, but I was working on an academic essay on the general topic of the original post and thought it might be interesting to follow up on this lead.

What I found was that the inflammatory addition had originated in 2018 and spread within a circle of blogs dedicated to video games whose users openly identified as male. All of these blogs were only briefly active and hadn’t been updated since 2018. Their reblogs alternated between memes, game release announcements, and incendiary “social justice” posts.

I’m not sure how to explain the particular flavor of circa-2018 “social justice” posts on Tumblr, save to say that they are totalizing, polarizing, and aggressive to an absurd degree. In aggregate, these posts engender a sense that there is an elite group of enlightened people who all share the same position and values, and who must foster their anger in order to stand against their enemies, who are presumed to be an equally monolithic group. Let me be clear that these posts are not about any specific real-world issues or political groups; they are more along the lines of general ideological programming spread through discourse surrounding fictional characters and entertainment media. Such posts have nothing to do with critical readings or cultural critique, but instead take the form of brief and easily digestible “this thing is bad” slogans with jingoistic “people who don’t agree are also bad” insinuations.

In any case, what I found regarding the circle of video game blogs on Tumblr seemed suspicious, so I tried to figure out who these users were and where they’d gone. (I was no longer doing research for my essay, by the way; now I was just morbidly curious.) Tumblr has an optional function that allows users to crosspost to Twitter, so I ended up tracking down a few of these blogs via reposts on Twitter, where I ran across a surprising number of deactivated accounts. Between one thing and another – and this was a very deep rabbit hole, so I’m afraid I didn’t document my process as well as I could have – I ended up on Parler, a social media platform for the sort of alt-right people who tend to get kicked off Twitter.

Along with 8kun, Parler is one of the main seeds of the QAnon material that makes its way to Facebook and YouTube, and the conversations I saw on the site were completely divorced from consensus reality. There’s an excellent article about this in The Atlantic (here); but, to summarize, “the QAnon conspiracy” holds that the American government is rotten to its core, and even conservative politicians are almost literal comic book villains. Donald Trump, as someone coming from outside these evil political circles, is the only person that “real” Americans can rely on, and he must therefore be defended from Democrats and Republicans alike.

At the time I encountered Parler in mid-December 2020, it was filled with people talking about contesting the election results, by force if necessary. Many of the hashtags, like #HoldTheLine, were military in tone, and people were sharing state-specific resources for obtaining firearms. There were a lot of links to videos associated with the Dorr Brothers, who oversee various regional organizations devoted to “no compromise” “Second Amendment rights.” (NPR has a limited-run podcast about this, if you’re curious.) There was also an extraordinary amount of antisemitism, with coded references ranging from “global capitalists” to “lizard people.”

I did not stay there long. I got super creeped out, to be honest.

The worst thing was that, amid all the “Take Back America” rhetoric, links to QAnon videos on YouTube, and announcements for the Facebook Live events of reactionary political groups, people were sharing memes and joke posts about video games… and a lot of them were really good. To my shame, that’s why I stayed on the site for as long as I did, even after it had become painfully clear what I was looking at.

The appeal of QAnon conspiracies is that they speak to the marginalized in their own language, whether that language is video game memes, “traditional feminist” slogans, or decontextualized Bible verses. These conspiracies provide both an “it’s not your fault” justification for why individuals don’t succeed in neoliberal capitalism and a concrete path of action that elevates a normal person sitting at a computer to the status of a righteous crusader.

This sort of messaging is designed to appeal to anyone who feels as if they’re under attack from forces they don’t understand, which is perhaps why it has appealed so strongly to Donald Trump himself. Once I started picking up on QAnon codewords and hashtags, some of Trump’s more bizarre tweets from 2020 (such as “Nothing can stop what is coming”) started to make much more sense.

When Trump posted a video telling the rioters who stormed the Capitol building on January 6 that they’re “special” and that he loves them, this also made sense to me. Trump seemed to genuinely believe, as the rioters did, that they were on the righteous side of a holy war to protect the rights of the marginalized and prevent the fall of civilization at the hands of a nebulous and unspeakable evil.

Given my actual research interests, which have very little to do with American politics, you can probably guess that this whole thing started with Legend of Zelda. There may be some people reading this essay who might feel tempted to jump to the conclusion that the Zelda series is to blame for fostering an apocalyptic mindset because [insert racist generalization about Japanese people here]. I’m not saying that the Zelda games – or gaming culture and video games in general – are without their problems, but please don’t let that be your take-away point.

I’m also not suggesting that the people on Tumblr who reblogged a post I found upsetting are ignorant. After all, most people on the platform are fully aware of how misinformation spreads, and we rely on a carefully curated grassroots social vetting system that serves as something of a Geiger counter to make sure we’re not getting close to anything radioactive. We’re all doing the best we can, and a few isolated posts from malicious actors aren’t going to hurt anyone.

Rather, what has struck me about this whole mess is how the tendency toward authoritarian thinking transcends political lines. I can’t say whether the Tumblr blogs that were active in spreading inflammatory “social justice” posts in 2018 were real people who ended up gravitating to the far right or the sock puppets of people already involved with far-right groups, and I don’t know who started circulating their posts again in December.

What I do know is that “us vs. them” essentialism is just as appealing to online communities in favor of progressive social justice as it is to online communities that propagate QAnon theories. Because of the way social media algorithms privilege content that evokes “engagement,” this type of thinking can spread far beyond these communities and become normalized even for people who don’t know anything about Wojak memes or video games or Tumblr or 4chan, whether they’re financially precarious retirees or recent college graduates who have just started to understand that they will never be able to pay off their student loans.

The key word here is “normalization,” because this is what makes extremists feel as though they have broad support for what they’re doing. For every one person who creates a social media account solely for the purpose of telling an artist or showrunner that she should kill herself because her content is “problematic,” or for every one person who showed up to the riot in DC on January 6, there are thousands of people in each of their extended communities who are directly supporting their actions online.

I think that, if both young people and older people could envision an actual future for themselves as valued members of society, then perhaps they wouldn’t be so invested in fantasies about destroying society. I know this makes me sound like a moderate apologist, but I’m not advocating for “compromise” or “seeing both sides.” What I’m trying to say is this: If there are multiple generations of people who are unemployed, underemployed, deeply in debt, and one random accident away from complete financial ruin, of course they’re going to be upset and looking for guidance, especially while they’re stuck at home or trapped in “essential worker” jobs during an ongoing pandemic. This is not a controversial statement to make.

Neoliberal capitalism is irreparably broken. So many people wouldn’t be in such a precarious position if it weren’t. Something needs to happen, because people need to be able to live without feeling as though they have to fight each other to survive.

In the meantime, social media corporations need to change their algorithms. It’s unnecessary, undesirable, and impossible to destroy the platforms on which authoritarian and apocalyptic discourses are created and disseminated. That being said, these fringe beliefs should not be as accessible, widespread, and normative as they are.

The individual and social formation of identity and ideology that happens online is real, and it has real consequences. I think it’s high time to start taking this seriously.

The Captain-Planet-Official Problem

I’ve been open about my distaste for “call-out” and “canceling” culture within left-leaning spaces on social media. My overarching point is that people shouldn’t be harassed online, especially not for stupid shit that doesn’t matter.

I’m going to put it out there, however, that sometimes people do in fact need to be shut down. Figuring out where to draw the lines between “something you don’t like that isn’t hurting anyone,” “a toxic asshole who can just be blocked and ignored,” and “a serious problem that is affecting far more than a tiny online community” isn’t always easy. There’s a lot of moral gray area here, and I think it’s worth talking about.

In my experience, one of the main issues that comes up during these discussions is something I’m going to call “the Captain-Planet-Official Problem” after an ecofascist blog on Tumblr that was extremely popular in relatively mainstream circles before it got shut down. The problem is this, basically: A lot of alt-right gateway accounts are popular because they’re funny, relatable, friendly, kind, and filled with memes that most reasonable people will find silly and inoffensive.

If you’re unfamiliar with ecofascism, its message is more or less “save the planet by getting rid of all the brown people.” This is often couched in terms of “controlling invasive species that are threatening native plants and wildlife,” and it’s connected to the tendency of various “European identity movements” to celebrate the natural beauty and “environmental heritage” of places like Germany and England. There’s often a superficial level of anti-capitalism accompanying this messaging, like, “What if this beautiful landscape were bought and destroyed by global capitalists?” In this case, “global capitalists” usually means “Jewish people,” but I’ve also seen it applied to “Asian people” with no attempt to differentiate between people from various East Asian and South Asian countries.

Left-leaning communities in social media spaces love the environment and hate capitalism, which is fair. What this means is that, if some well-meaning person sees a random post from a blog with a catchy username like Captain-Planet-Official about how capitalism destroys the environment, they probably won’t think twice about reblogging it. Maybe they, or one of their followers, might even go to the Captain-Planet-Official blog, which (based on the screencaps I’ve seen) had a lot of clever shitposts and a charming and active moderator. If they decide to follow this blog, they’ll probably realize soon enough that there’s a disturbing current of white supremacy underlying the memes.

At this point, one of three things will happen. Some people will unfollow the blog and resolve to be more careful in the future. Some people will get a taste of the Kool-Aid (or red pill, or whatever), decide that they like it, and pursue it further into some of the more overtly right-wing blogs that regularly interact with Captain-Planet-Official. Most people, however, will decide that it’s just Tumblr or Twitter or Reddit or Imgur and therefore doesn’t really matter. This last group of people is willing to put up with occasional messaging about “invasive species,” which they might not understand or even notice if they’re not familiar with that specific type of coded dog-whistle political language or just don’t spend that much time on social media.

The people who are interested in the white supremacist messaging will probably be only a tiny minority, but those people are out there, and I’m willing to bet that a lot of them are in the left-wing spaces occupied by their peers. They feel increasingly alienated, but they’re also like, “I’m not a literal Nazi,” so they won’t enter clearly marked right-wing spaces directly. For people like this, something like Captain-Planet-Official is a gateway; and, the wider the gateway – the more people who promote it by reblogging its more inoffensive posts – the more people will end up passing through it.

(By the way, this is a post with screencaps of a good collection of tweets about how these gateways work.)

Obviously there’s something unpleasant going on with left-wing spaces that alienate certain subsets of people so much that they’re driven to white supremacy or men’s rights activism or whatever Breitbart is going on about these days, but that’s an unpopular conversation to have for many (extremely valid) reasons. It’s also highly likely that a baby cryptofascist would have found their way into the alt-right anyway, even if they didn’t encounter Captain-Planet-Official on Tumblr. So why bother doing or saying anything about Captain-Planet-Official? They’re not aggressively hurting anyone, and they’re probably doing more good than harm by spreading awareness of environmental issues and helping vocalize resistance to capitalism. Right?

So this is the problem: How do you explain that what a blog like Captain-Planet-Official is doing is a completely different type and level of “problematic” than a blog that celebrates an imagined romantic relationship between two fictional characters? And are you going to send a message to everyone you follow who reblogs a popular post from that blog? And if one of these people tells you that they don’t care, because it’s just a stupid meme on Tumblr and doesn’t matter, are you going to unfollow them? And if you unfollow them, what are you going to say to the other people in your fandom who still follow them? That they shouldn’t interact with them because they don’t take white supremacy seriously? Because they reblogged a Captain Planet meme about protecting the environment by fighting capitalism?

I should mention that this goes for radical left-wing messaging as well, especially when it comes to TERFs (trans-exclusionary radical feminists) who use catchy feminist slogans to promote transphobic and exclusionary rhetoric and ideology. “How dare you say that protecting women’s rights and reblogging pride flags is bad” works in exactly the same way as “How dare you say that protecting the environment and reblogging cute animal pictures is bad.”

In any case, I don’t think the answer is necessarily to appeal to the powers that be to shut down a particular blog. As the Tumblr “flagged posts” debacle of December 2018 proved, there’s a lot of potential for abuse and basic ignorance when distant authorities are invoked, so it’s in the best interests of a community to figure out how to handle problems like Captain-Planet-Official on their own. Deciding where the line is between the normal stupid bullshit that happens on social media and something that’s genuinely scary isn’t always as easy as it is with Captain-Planet-Official, nor is it always easy to tell when someone has crossed that line and isn’t coming back. This is why I wish left-leaning communities would stop devoting so much energy to asinine ship wars (Will Rey and Kylo Ren kiss? Who cares??) and start using the principles of social justice to figure out what to do about the promotion of dangerous ideologies that’s happening in the real world right in front of them.