The Internet Conspiracy Machine

About a month ago, a post that felt weird to me started circulating within my small circle of Tumblr mutuals. To make a long story short, there was a smart post by a popular Tumblr artist that someone had reblogged with an inflammatory addition. The inflammatory addition was from 2018, so I was curious why it had started circulating again in December 2020.

I asked my mutuals if they were reblogging the post because something specific had happened recently, but they couldn’t give me any background. It seemed that the reblog was nothing more than clickbait making the rounds while riding on the back of the original post. Tumblr being Tumblr, this happens all the time.

But this reblogged addition still felt strange to me. The user who created the reblog had deactivated their account, so I searched for their username to try to figure out who they were. I wanted to figure out whether the inflammatory addition was referring to something specific or whether it was just someone venting on Tumblr – which is fair. I honestly didn’t expect to find anything, but I was working on an academic essay on the general topic of the original post and thought it might be interesting to follow up on this lead.

What I found was that the inflammatory addition had originated in 2018 and spread within a circle of blogs dedicated to video games whose users openly identified as male. All of these blogs were only briefly active and hadn’t been updated since 2018. Their reblogs alternated between memes, game release announcements, and incendiary “social justice” posts.

I’m not sure how to explain the particular flavor of circa-2018 “social justice” posts on Tumblr, save to say that they are totalizing, polarizing, and aggressive to an absurd degree. In aggregate, these posts engender a sense that there is an elite group of enlightened people who all share the same position and values, and who must foster their anger in order to stand against their enemies, who are presumed to be an equally monolithic group. Let me be clear that these posts are not about any specific real-world issues or political groups; they are more along the lines of general ideological programming spread through discourse surrounding fictional characters and entertainment media. Such posts have nothing to do with critical readings or cultural critique, but instead take the form of brief and easily digestible “this thing is bad” slogans with jingoistic “people who don’t agree are also bad” insinuations.

In any case, what I found regarding the circle of video game blogs on Tumblr seemed suspicious, so I tried to figure out who these users were and where they’d gone. (I was no longer doing research for my essay, by the way; now I was just morbidly curious.) Tumblr has an optional function that allows users to crosspost to Twitter, so I ended up tracking down a few of these blogs via reposts on Twitter, where I ran across a surprising number of deactivated accounts. Between one thing and another – and this was a very deep rabbit hole, so I’m afraid I didn’t document my process as well as I could have – I ended up on Parler, a social media platform for the sort of alt-right people who tend to get kicked off Twitter.

Along with 8kun, Parler is one of the main seeds of the QAnon material that makes its way to Facebook and YouTube, and the conversations I saw on the site were completely divorced from consensus reality. There’s an excellent article about this in The Atlantic (here); but, to summarize, “the QAnon conspiracy” holds that the American government is rotten to its core, and even conservative politicians are almost literal comic book villains. Donald Trump, as someone coming from outside these evil political circles, is the only person that “real” Americans can rely on, and he must therefore be defended from Democrats and Republicans alike.

At the time I encountered Parler in mid-December 2020, it was filled with people talking about contesting the election results, by force if necessary. Many of the hashtags, like #HoldTheLine, were military in tone, and people were sharing state-specific resources for obtaining firearms. There were a lot of links to videos associated with the Dorr Brothers, who oversee various regional organizations devoted to “no compromise” “Second Amendment rights.” (NPR has a limited-run podcast about this, if you’re curious.) There was also an extraordinary amount of antisemitism, with coded references ranging from “global capitalists” to “lizard people.”

I did not stay there long. I got super creeped out, to be honest.

The worst thing was that, amid all the “Take Back America” rhetoric, links to QAnon videos on YouTube, and announcements for the Facebook Live events of reactionary political groups, people were sharing memes and joke posts about video games… and a lot of them were really good. To my shame, that’s why I stayed on the site for as long as I did, even after it had become painfully clear what I was looking at.

The appeal of QAnon conspiracies is that they speak to the marginalized in their own language, whether that language is video game memes, “traditional feminist” slogans, or decontextualized Bible verses. These conspiracies provide both an “it’s not your fault” justification for why individuals don’t succeed in neoliberal capitalism and a concrete path of action that elevates a normal person sitting at a computer to the status of a righteous crusader.

This sort of messaging is designed to appeal to anyone who feels as if they’re under attack from forces they don’t understand, which is perhaps why it has appealed so strongly to Donald Trump himself. Once I started picking up on QAnon codewords and hashtags, some of Trump’s more bizarre tweets from 2020 (such as “Nothing can stop what is coming”) started to make much more sense.

When Trump posted a video telling the rioters who stormed the Capitol building on January 6 that they were “special” and that he loved them, this also made sense to me. Trump seemed to genuinely believe, as the rioters did, that they were on the righteous side of a holy war to protect the rights of the marginalized and prevent the fall of civilization at the hands of a nebulous and unspeakable evil.

Given my actual research interests, which have very little to do with American politics, you can probably guess that this whole thing started with The Legend of Zelda. There may be some people reading this essay who might feel tempted to jump to the conclusion that the Zelda series is to blame for fostering an apocalyptic mindset because [insert racist generalization about Japanese people here]. I’m not saying that the Zelda games – or gaming culture and video games in general – are without their problems, but please don’t let that be your take-away point.

I’m also not suggesting that the people on Tumblr who reblogged a post I found upsetting are ignorant. After all, most people on the platform are fully aware of how misinformation spreads, and we rely on a carefully curated grassroots social vetting system that serves as something of a Geiger counter to make sure we’re not getting close to anything radioactive. We’re all doing the best we can, and a few isolated posts from malicious actors aren’t going to hurt anyone.

Rather, what has struck me about this whole mess is how the tendency toward authoritarian thinking transcends political lines. I can’t say whether the Tumblr blogs that were active in spreading inflammatory “social justice” posts in 2018 were real people who ended up gravitating to the far right or the sock puppets of people already involved with far-right groups, and I don’t know who started circulating their posts again in December.

What I do know is that “us vs. them” essentialism is just as appealing to online communities in favor of progressive social justice as it is to online communities that propagate QAnon theories. Because of the way social media algorithms privilege content that evokes “engagement,” this type of thinking can spread far beyond these communities and become normalized even for people who don’t know anything about Wojak memes or video games or Tumblr or 4chan, whether they’re financially precarious retirees or recent college graduates who have just started to understand that they will never be able to pay off their student loans.

The key word here is “normalization,” because this is what makes extremists feel as though they have broad support for what they’re doing. For every one person who creates a social media account solely for the purpose of telling an artist or showrunner that she should kill herself because her content is “problematic,” or for every one person who showed up to the riot in DC on January 6, there are thousands of people in each of their extended communities who are directly supporting their actions online.

I think that, if both young people and older people could envision an actual future for themselves as valued members of society, then perhaps they wouldn’t be so invested in fantasies about destroying society. I know this makes me sound like a moderate apologist, but I’m not advocating for “compromise” or “seeing both sides.” What I’m trying to say is this: If there are multiple generations of people who are unemployed, underemployed, deeply in debt, and one random accident away from complete financial ruin, of course they’re going to be upset and looking for guidance, especially while they’re stuck at home or trapped in “essential worker” jobs during an ongoing pandemic. This is not a controversial statement to make.

Neoliberal capitalism is irreparably broken. So many people wouldn’t be in such a precarious position if it weren’t. Something needs to happen, because people need to be able to live without feeling as though they have to fight each other to survive.

In the meantime, social media corporations need to change their algorithms. It’s unnecessary, undesirable, and impossible to destroy the platforms on which authoritarian and apocalyptic discourses are created and disseminated. That being said, these fringe beliefs should not be as accessible, widespread, and normative as they are.

The individual and social formation of identity and ideology that happens online is real, and it has real consequences. I think it’s high time to start taking this seriously.