When Seminar Classes No Longer Work

I’d like to say that the Fall 2019 semester was wonderful and that all my students were brilliant. It feels good when everything is going well, after all, and I like to brag about how smart my students are.

The truth is, however, that this semester was miserable, and I couldn’t for the life of me figure out why. What was going on? What was I doing wrong? What could I do better?

Now that the semester is behind me, I’ve come to the conclusion that something wasn’t working with the students. I hate to shift blame onto someone who isn’t me, but I think that what happened with my classes this semester is indicative of a larger trend in higher education.

Students are no longer capable of engaging with course material that requires reading or watching something for more than a few minutes. This is fine in large lecture classes, but it makes smaller and more discussion-focused seminar classes very awkward and uncomfortable.

There’s a lot to unpack here, and it would be helpful to give some background information first.

There are generally three types of classes in a university: lectures, seminars, and labs.

Lab classes are above my pay grade, so I’m not going to talk about them.

Lecture classes are larger classes with minimal participation. The professor stands on stage presenting information, and evaluations are structured so as to measure the students’ retention of this information. If an individual student performs poorly, they fail. Unless a specific professor has an innate knack for entertaining an audience, most people don’t particularly enjoy lecture classes. These classes are necessary and vital to what a university does, of course, but students tend not to think that they’re fun or engaging.

Seminar classes are smaller and focus more on individual student participation. Although it’s still necessary to evaluate students based on their retention of information, these classes allow much more room for individual student expression. A lot of Social Science and Humanities courses with more specialized topics, such as the ones I teach, are seminar classes. If a seminar class goes well, both the students and the instructor can get a lot out of the experience. This is the type of class where students have the sorts of transformative experiences that the institution of higher education likes to celebrate.

In lecture classes, the instructor basically has to show up, present material, perform evaluations, and make sure these evaluations are assessed properly. This is far from easy, and there’s a lot of planning and preparation involved. Still, it’s the responsibility of the students to do the work or risk failing the class. If a student gets a bad grade, it’s the fault of the student, not the instructor.

In seminar classes, the instructor still has to show up, present material, perform evaluations, and make sure these evaluations are assessed properly. At the same time, because the classes are smaller, there’s more of an opportunity for student engagement, and most seminar classes are designed to take advantage of this. While an introductory lecture class in Art History might expect students to memorize lists of names and dates and manufacture short and shallow essays about, for example, the symbolism of fruit in Renaissance painting, a seminar class would encourage students to dig deeply into a more specialized topic while discussing their own thoughts and intellectual interests in a structured manner.

Because the instructor of a seminar class has more leeway in evaluating each individual student’s performance, they also have much more responsibility for what each student gets out of the class. This involves a lot of extra work for the instructor, but the trade-off is that they can more or less teach what they want.

Most seminar classes are upper-level – meaning that they’re intended for students in their third year or above – and many also have prerequisites. The idea is that you can’t just throw someone with no prior knowledge, training, or skills in the discipline into a small class and expect them to succeed. Essentially, by the time someone becomes eligible to take a seminar class, they should be able to contribute to it at the requisite level. In other words, it wouldn’t make much sense to allow someone with no knowledge of Art History into an advanced seminar for specialists, even if they really enjoy art.

A major problem in American universities, however, is that many colleges now require courses to have a minimum enrollment in order to avoid being cancelled. People who teach seminars are therefore under pressure to open registration to everyone. In addition, instructors are also pressured into allowing upper-level seminars to count for university general education requirements, especially if the class is offered by a “program” instead of a “department.” (A department can offer a major and thus grant a degree, while a program is generally younger and smaller and can generally only offer a minor or a secondary major.) What this means is that, practically speaking, you might have a bunch of sophomore accounting majors taking an upper-level seminar (offered by the Gender Studies program instead of the Art History department) specifically about the queer symbolism of fruit in Renaissance painting.

Meanwhile, students don’t want to risk their GPA on a seminar with a strict instructor and a qualitative assessment structure, so people who teach seminar classes are under a lot of pressure to make sure everyone gets a decent grade. This means that there’s no way to warn students if they’re not doing the work and not performing at the required level.

This is not an ideal situation. As you might imagine, it creates problems.

Many experienced instructors can handle these problems with a range of strategies that help to make the best of a less-than-ideal situation. Unfortunately, a collection of ill-prepared students sitting in an advanced seminar can result in some truly awkward situations that nothing can be done about.

In order for a seminar to work, there need to be at least two good students. If there’s only one good student, the rest of the class will resent them, and that student will grow to resent the class in turn. Two students can get a discussion going, however, and all it takes is the enthusiasm of two people to encourage the other students to contribute as well. Having multiple good students also enables each individual student to slack off sometimes, meaning that the classroom space feels more collaborative.

What I mean by a “good” student is a student who can and will do the work and contribute to the class. A student who can do the work but doesn’t isn’t a good student. A student who has done all the work but sits quietly in the back of the room isn’t a good student. A student who hasn’t done any of the work but “contributes” anyway isn’t a good student either. Meanwhile, the students who don’t do the work, don’t pay attention, and have to be “reminded” to put away their phones and laptops in the middle of class are aggressively bad students. For a seminar to be successful, the instructor needs to work with the good students to create a critical mass of goodness (attention, engagement, contribution, and collaboration) that overwhelms the badness (the attitude of students who clearly don’t want to be in the room) before it becomes pervasive.

An experienced instructor can set boundaries and encourage a productive classroom environment by rewarding goodness and punishing badness. If there’s no genuine student engagement, however, there’s nothing to reward. Meanwhile, in the hope of at least maintaining a neutral status quo, it can be tempting for an instructor to let disengagement go unpunished rather than, for example, calling out a student who is clearly spending the entire class scrolling through social media.

It goes without saying that it’s easier for men to set these boundaries. If you’re not a cisgender man, you’re already starting at a disadvantage, and every other minority positionality you occupy makes it even harder to maintain an atmosphere of civility and respect in the classroom. So much work (here’s an annotated selection) has demonstrated how sexism functions in a university setting that this observation has almost become a truism, but it’s still worth commenting on.

To return to the point of this essay, what happened this semester is that I didn’t have a single good student in either of my seminar classes. I know that sounds awful and judgmental, but this is what I mean:

(1) Not a single student prepared for class by doing the required reading or viewing.
(2) Even if one or more of them did, those students did not engage with the class.
(3) Multiple students were actively disengaged and disruptive.

What this meant was that…

(1) It was not possible to have a sustained discussion about the material.
(2) It was not possible for students to otherwise engage with the material.
(3) Students were not capable of retaining or intellectually processing the material.
(4) Students grew to resent being asked to engage with material they couldn’t understand.
(5) Students who were not prepared still insisted on speaking, which was awkward and awful.

In other words, everyone was uncomfortable and no one learned anything.

I tried to mitigate this as best I could by offering praise and encouragement on evaluations, devising in-class groupwork projects and other activities meant to stimulate engagement, and having students watch and discuss short videos during class in lieu of doing any preparation outside of class. I learned the students’ names, I memorized their interests and hobbies in order to help bring them into class discussion, and I played trivia games about tangentially related material to help wake them up and get them in a positive frame of mind. I gave them all sorts of snacks with sugar and caffeine, and I even brought my dog to class several times for stress relief.

But nothing worked. Even students who showed promise at the beginning of the semester were performing poorly by the end, and I felt awful.

What I ended up doing was relying more and more on my presentations, thereby transitioning my classes away from seminar discussions and more toward lectures. Unfortunately, the course material I chose for my seminars was ill-suited to become the subject of a series of lectures, by which I mean a set of discrete topics that could be broken down into smaller units of information suitable for evaluations intended to test basic retention.

Writing these lectures and evaluations felt boring and empty to me, and I hated it. To give an example, imagine having a class about Sailor Moon in which, instead of discussing what makes the show so fun and interesting and culturally meaningful, you have to present the text as something along the lines of, say, “Please list three visual elements of Sailor Moon intended to appeal to its target demographic.” As a result, class sessions that could have been really special and magical became tedious and soulless.

The main problem was that none of the students did any of the assigned reading. Let me emphasize this: None of the students did any of the assigned reading. Moreover, most of them had no intention of doing the reading. The course hosting platform my university uses, Blackboard, allows the instructor to track who has accessed the course material, and almost no one even clicked on the links for the assigned readings and videos. Because none of the students were willing to do any work for my classes, I had nothing to work with myself, and every single one of my attempts to engage the students with the material (and with each other) ended in an awkward failure.

As I wrote earlier, I ended up turning my seminar classes into lecture classes, and any discussions we had were very broad and not too terribly productive. I did my best to smile and laugh through the entire semester, but I have no idea what my students got out of my classes.

I’ve been noticing a trend that’s become more pronounced with each passing year, and I think it’s finally time to acknowledge what’s going on – the undergraduates at my university are unable to read a text or watch a video for more than a few uninterrupted minutes. Even when presented with short and accessible material, they cannot engage with it. I’ve always had a few students in every course who were capable of doing enough work to contribute to a productive classroom environment, but their numbers have been shrinking, and this semester I finally hit zero.

I have theories about how we got to this point, but that’s immaterial. What’s more important is figuring out where to go from here.

Specifically, what sort of material can students engage with? Moreover, is the specific information gained through college classes what’s important, or should the emphasis of seminar classes be on the process of developing textual and media literacy and critical thinking skills? If the goal is the retention of information, is there a better way to deliver this information? Something like a podcast, perhaps? If the process is important – and I really think it is – what can be done to encourage it? What needs to change so that students can do the work they need to do?

The only conclusion I have to offer is that this issue requires more research. Surely I’m not the first person to have made this set of observations, and there have to be strategies for addressing these problems in order to create classes that are more useful, beneficial, and interesting to students.

Writers Have to Be Supported to Survive

I’ve recently seen several posts with tens of thousands of notes circulating around Tumblr that are extremely critical of the idea of fanfic writers accepting donations to support their activities. Many of them, such as the one excerpted above, refer to the guidelines of AO3, which are meant to defend the right of the site to exist on the basis that the content it hosts is purely transformative and not intended for profit. The undertone of these posts, however, is a strong pushback against the idea that fanfic writers might aspire to the same levels of professional success and support as other creators in fandom.

I would like to argue that it’s reasonable for fan writers to have a choice about whether to receive compensation for their work, especially since many highly visible fan artists, YouTubers, and Twitch streamers can receive hundreds, thousands, and even tens of thousands of dollars every month through donation sites like Patreon and Ko-fi.

Yes, intellectual property is protected by law and legal precedent, and it’s important to understand fandom history and to respect the ongoing battle AO3 has to fight. And yes, fan writers use copyright-protected names and scenarios. At the same time, fan artists use protected names, scenarios, and images, while YouTubers and streamers use protected sound and video – and sometimes the entirety of the protected work. If the “transformative work” and “added value” and “critical commentary” and “performance” arguments of fair use laws apply to visual artists, video creators, and streamers, why don’t they work for writers?

There are three things going on here.

The first is that AO3 is an independent NPO, not a giant media conglomerate. Even if YouTube is forced to take down certain videos that violate intellectual property laws, YouTube itself is not in danger of being taken offline. AO3 is in a much more precarious situation and therefore has to be extra cautious. This is an issue specific to AO3, however, and it’s not universally applicable to other hosting and sharing sites.

The second is that many media corporations in the United States consider digital images to be ephemeral, meaning that they have a short shelf life in the popular consciousness. Fan art and video streams shared on social media will help to promote a piece of media while it’s still trendy, but they also tend to be quickly consumed and discarded and thus aren’t perceived as being in danger of becoming long-term competition for the original media property. Because it used to be published in the form of physical books and magazines, fanfic was considered to be competition, but this perception has changed, partially due to the support fanfic has received from commercially successful writers like J.K. Rowling and George R.R. Martin.

The third is sexism. This is complicated; but, to make a long story short, fanfic has been treated differently because, unlike illustration and video editing, it is primarily associated with communities of women.

Media industries overwhelmingly dominated by men, such as comics and movies, have always provided ways for younger male fans to enter the industry as professionals. There is a long history of commercial studios actively scouting emerging talent from popular fan artists and amateur video producers, so media corporations have a vested interest in not completely shutting down spaces in which these creators can develop and exhibit their talents. For example, an aspiring comic artist can take his portfolio of X-Men character illustrations to a comics convention to show to an industry representative, and Marvel will hire him if they like his fan art. Because these industries have traditionally been male-dominated, however, the work of women was seen as derivative and embarrassing. A male artist who drew a fan comic would get a job, and a woman who wrote fanfic of the same media property would get a cease and desist letter.

Moreover, women have historically been expected to be the keepers of public morality. For instance, a male professor who writes mediocre novels about cheating on his wife with underage female students can easily be promoted to the head of a prestigious creative writing program, while a woman in any profession can be in danger of losing her job for writing any novel at all. Because of this, many female writers have had to hide their creative careers in a way that male artists and video producers have not. Even though these prejudices are fading, many fic writers are still very serious about protecting their real names and identities. At the same time, many fan artists and other creators use their fanwork to promote themselves while using their professional names – and, thanks to social media, we can now see that not all of these creators are male.

Because a new generation of female and nonbinary fan artists, animators, video producers, and streamers are now comfortable pursuing their creative careers while using their professional names and accepting donations while they establish themselves, it only makes sense that fan writers would want to do the same thing. After all, if people like Rebecca Sugar and Noelle Stevenson can go from posting popular fan art and fan comics on Tumblr to becoming mainstream showrunners, why couldn’t a female or nonbinary fanfic writer go on to become an actual scriptwriter for the next, say, Star Wars or Pokémon movie? If illustrators, comic artists, YouTubers, and Twitch streamers can receive donations to support their fanwork while they establish their careers, what arbitrary rule says that writers can’t do the same thing while still respecting AO3’s legal guidelines?

There is an entire generation of younger writers who have come into fandom with ambitions of professional success and no understanding of why they should feel pressured to separate their fandom identity from their professional identity or why they shouldn’t have the choice to receive the same support as creators working in other mediums. Instead of mocking younger writers for not knowing fandom history – and instead of shaming older writers for holding onto outdated prejudices – I think it’s worth it to support them and hopefully change the culture.

Most people in fandom don’t want donations and only think of it as a fun escapist hobby, but writers should still be able to access the same choices as other creatives. I’ve already shared my thoughts about the issues I personally have with Patreon, but that doesn’t mean I don’t want other people to explore that option for themselves. After all, writers have to be supported for fanfic to survive.

I feel like I could write an entire book about this – and I have! I’m keeping my fingers crossed that the publisher can stick to the May 2020 release date, because I’d really like to talk more about fandom and cultural change, as well as what the achievements of artists might suggest about the future of fiction.

How Tenure Works (and Doesn’t Work)

There are three broad types of teaching faculty in an American university: tenured, tenure-track, and everyone else. Tenured and tenure-track professors are essentially white-collar workers on multi-year contracts who receive full benefits and are eligible for paid research leave.

“Everyone else” varies from university to university, but the majority of people who aren’t tenured or on the tenure track have short-term contracts and receive no benefits. Although “everyone else” used to be the exception, they now make up roughly 75% of all teaching faculty in higher education in the United States. This is obviously a huge problem, and I’ll return to it later.

Tenured faculty enjoy the full privileges of employment at a university, including the ability to participate in the committees that decide department and university policy. They are also eligible to rise to high-level administrative positions. Tenured faculty also have a bit more power when it comes to “quality of life” issues like being able to schedule their classes at their preferred times and not having to teach large first-year classes. Their salaries are higher, the length of research leave they can take is longer, and they’re more likely to receive institutional funding. Tenured faculty can also teach graduate-level seminars – sometimes exclusively – and take on grad students.

The main privilege that tenured professors enjoy is that, short of sexually assaulting someone, it’s very difficult for the university to fire them (or to force them to retire, which is actually a major issue right now). This means that they can take longer to complete more ambitious projects, and they can start publishing with commercial presses and become public intellectuals if they like. There’s also no need for them to receive high student course evaluations, which gives them the freedom to develop more experimental classes and teaching methods while not having to put up with stupid undergraduate bullshit (like worrying about whether a kid will give your course a low score if you have a class session about race or LGBTQ+ material, for instance). Because they don’t have to worry about teaching and publishing so much anymore, tenured professors also have more time to become active in university service and administration.

A tenure-track professor, usually referred to as an “assistant professor,” has been hired by the university at an entry-level position. In order to be promoted through tenure, an assistant professor has to jump through burning hoops of fire. I know that’s an abstract description, but I don’t know how else to put it. Because of the extremely competitive academic job market, the only people who are hired for tenure-track positions tend to be already functioning at the level of a tenured professor when they walk in the door. Regardless, receiving tenure isn’t a foregone conclusion, even at second- and third-tier schools with very few institutional resources and a nonexistent level of faculty support.

Unless a tenure-track professor is a serial molester or a complete academic fuck-up (or both at the same time), it’s actually in a university’s best interests to grant them tenure, usually after they’ve spent four or five years in the position. During this time, this person will have published research with the university’s name on it and otherwise promoted the university’s brand through their work, and they will have established a set of classes they can reliably teach. They will also have grown accustomed to the university’s culture while making connections with other faculty and staff members. In other words, the university has already put a lot of investment into someone by the time they go up for tenure, and that person has already become associated with their university in their broader field. Both as an institution and as a brand, a university wants to show that they have a lot of tenured faculty members, as faculty retention demonstrates not only the university’s wealth but also its prestige.

Still, assistant professors are required to demonstrate professional excellence in order to be granted tenure. The details of how this works differ from school to school; but, generally speaking, applicants are required to submit a portfolio of various materials that often runs more than a thousand pages in length. This portfolio will contain letters of support not only from people within the university but also from leading members of the applicant’s field – none of whom the applicant can choose or otherwise designate.

A tenure-track professor therefore has to publish as much as they can while establishing a strong professional reputation within four years, all while developing new classes, teaching a full course load, and getting high scores on student course evaluations. Although their service to their own university is limited by their rank, tenure-track professors need to “serve the field” by doing things like editing, translation, peer review, public lectures, media appearances, and so on. It’s a lot of work, obviously, but we wouldn’t be in the profession in the first place if this sort of thing didn’t give us a sense of satisfaction. This is one of the main reasons why the attrition rate for PhD programs is so high – at some point a lot of people realize that this isn’t what they want to spend their lives doing, which is valid.

In any case, someone going up for tenure first submits their portfolio to a special committee made up of members of their department, as well as one or more members of other departments who are qualified to judge their competence. The committee then makes a recommendation to the applicant’s home department, which takes a vote. The department chair will write a letter of support (or caution) based on that vote, and the applicant’s tenure case will be assigned to an impartial liaison who will present the case to the university.

In the end, it’s the university that decides whether or not to grant tenure. Even if the department votes against someone’s case, and even if their department chair hates them, the university can still decide to give them tenure. Because professors have a well-known tendency to be petty and resentful toward each other, it’s often the case that the university will grant tenure to someone a department has voted against. The reverse is also true – a person can be admired and respected by everyone they work with, but the university can still decide not to grant tenure for whatever reason it chooses. A decision against tenure may have nothing to do with the applicant at all; the university may have decided to discontinue funding for that particular tenure line in order to open a tenure line in another department, for example.

It goes without saying – and there is a towering tsunami of evidence that supports this – that the tenure process is biased against women, people of color, and other minorities. Women especially are held to higher standards, and any other minority identity that might apply to them only makes them more vulnerable to being perceived as inadequate and expendable. During the past ten years, I have seen one female colleague after another fail to get tenure, and it’s terrifying. In fact, the person who held my position before I did, a woman of color, apparently felt so alienated by the inherent prejudice of this system that she didn’t even submit her tenure portfolio even though (in my personal opinion) she would have had a strong case and benefited my department immensely in the long run.

If you don’t get tenure, you have one year to make an appeal. After that, if the appeal isn’t granted, you have to leave when your contract ends. The appeal process is a nightmare and requires the complete revision and re-submission of a tenure portfolio. Most appeals aren’t granted (even if lawyers get involved), so many people don’t even try. After all, if you’re going to go through all that trouble, it makes more sense to apply to other jobs than to stay at a school that has already made it clear that it doesn’t value the work you’ve done.

Unfortunately, as the number of tenure-track positions that open every year continues to shrink, it’s highly unlikely that someone who is denied tenure will find another tenure-track job. In addition, tenured professors in their seventies and eighties will not retire, thereby denying opportunities for younger people to enter their departments.

This is where we return to the problem of “everyone else” that I mentioned at the beginning of this essay. Although there are both abstract and tangible benefits to having tenured faculty, many universities have begun to privilege their short-term interests. According to this mindset, why would you pay a tenured professor an actual salary when it’s much more cost-effective to pay an average of $3,000 per class to a short-term worker who often has the exact same (or even better) educational qualifications?

Because of the state of the American economy since around 2008 or so, more people have been completing graduate degrees. Meanwhile, universities are relying more on short-term contracts, which means that there is a horrifying scarcity of tenure-track jobs. My field is one of the fastest growing fields in higher education, yet it’s a very good year when fifteen tenure-track jobs open in English-speaking countries. University departments tend not to hire across fields – for example, someone who wrote a dissertation about queer literature for a Gender Studies department will probably not be considered as a viable applicant to an English department – which places additional limits on the number of jobs that even highly qualified people can apply to.

Competition is fierce, even for temporary positions that don’t provide benefits or a remotely livable wage, so why should a university have to settle for a tenure-track professor who isn’t perfect? It doesn’t help that both tenure-track job searches and the process of reviewing a tenure case necessitate a staggering amount of unpaid labor from everyone involved. And what department would want to hire someone who already has a tenure-track job but didn’t appreciate it enough to go through the tenure process? I mean, given how much institutional investment goes into an assistant professor, why would a university want to hire someone who’s clearly interested in job hopping? And, if someone went up for tenure but didn’t get it, why would a university want to hire someone they see as another university’s discarded trash?

What I’m saying is that, because of the tenure system, there is either too much mobility as early-career academics are uprooted from their communities and forced to move to a different university every year (and sometimes every semester) as they apply to tenure-track jobs, or zero mobility for people who actually get a tenure-track job but can’t leave without effectively ending their career.

I’m not yet sure what suggestions I would offer to help restructure the tenure system in American universities, but I think acknowledging that it looks good on paper but has major disadvantages in practice is probably a good start.

Decontextualizing Harry Potter

From the beginning of the 2016 American election cycle, a popular way to signal social belonging on Tumblr has been to reblog angry posts about J.K. Rowling like the one above.

J.K. Rowling isn’t perfect. No human being on this earth is perfect, and Rowling is no exception. Rowling’s books are far from perfect, and I have to admit that I personally don’t particularly like or enjoy them. It’s important to critique popular media, and it’s reasonable to hold public figures to basic standards of decency. Still, I’m concerned about posts like this, which promote decontextualization as a performance of progressive political ideology.

It’s difficult to make generalizations, so I want to refer to the post above to demonstrate what I mean.

To begin with, most of these posts about the Harry Potter books are coming from an American perspective that doesn’t attempt to address the cultural context of the original books. For example, while Americans tend to think everything is about race, British people tend to be much more sensitive about class. Class intersects with race, because of course it does, but class is widely perceived to be the basic framework of social hierarchy in the United Kingdom, and it’s coded in complicated ways that may be unfamiliar to many Americans.

What’s going on with the “house elves” in the Harry Potter books is that the author is taking the well-known figure of the brownie from Celtic folklore and using it to make a statement about class, specifically the class of people whose labor has always enabled the “great institutions” of the United Kingdom to function properly. Without bothering to talk to them or to listen to what they have to say, Hermione sees this class of people as “slaves,” which the house elves themselves find extremely insulting.

This plot line is resolved as Hermione gradually learns that it’s offensive and counterproductive to claim to speak for an entire group of people whom she believes, as an outsider to that group, to be marginalized. Meanwhile, actual members of the group take up activist work based on their own experiences and achieve real change; but, in the end, the “group” is a collection of diverse individuals who have different opinions regarding their “oppression,” and many of them subtly or actively challenge the notion that a privileged group should be allowed to ahistorically define their entire existence as “oppressed.”

Ron tells Hermione that she’s crazy for caring and that nothing should change because this is the way things have always been, but his traditionalism and intellectual laziness are shown to be just as misguided as Hermione’s naive activism. Harry (who is still a teenager, after all) admits that he can see both sides but doesn’t care about the discourse. Nevertheless, when someone close to him is clearly a victim of discrimination, Harry will stand up to protect them, even if he doesn’t like that person.

I don’t agree with the position that ideology doesn’t matter as long as you treat other people decently, which I think is simplistic and reductive, but I can understand how it works as a thematic element in a series of books written for ten-year-olds.

Rowling herself doesn’t entirely agree with this position either, and she addresses the very real and practical problems of the “I see people as individuals” mentality directly in her work for adult readers, including the book she wrote immediately after concluding the Harry Potter series, The Casual Vacancy. The people writing and reblogging posts like the one above don’t acknowledge this, however, perhaps because their only encounter with contemporary British fiction is a series of kids’ books about teenage wizards written during a decade in which a lot of the conversations we’re currently having about social justice were still evolving.

I should add that these books only got as popular as they did because of their cinematic adaptations. These movies are gorgeous and artistically well-crafted, but they tend to flatten and even erase the nuances of the novels. The posts on Tumblr that are critical of Rowling don’t hold the directors and producers accountable for failing to emphasize the progressive themes of the books in order to achieve a broader commercial appeal, nor do they challenge the systems of privilege that have limited the contributions of minority voices to the cinema industry. Instead, these posts pin all faults of the franchise, both real and imagined, on an individual female writer who was very poor for most of her life (thus her various explorations of the theme of class) while decontextualizing what she wrote decades ago in fantasy novels meant for young readers.

Again, it’s vitally important to think critically about popular culture, and I strongly believe that public figures should be held to basic standards of decency. I am all for critiquing the Harry Potter series and Rowling’s creative decisions. That being said, the trend of posts on Tumblr that hold one progressive female artist or activist responsible for everything that’s wrong in the world by means of aggressive decontextualizations of what she’s actually doing and saying is frightening, especially since these posts are starting to recirculate within left-leaning spaces in advance of another election cycle.

In the end, who does it benefit to say that books about respecting difference and resisting authoritarian violence even when not everyone on your side is perfect are “problematic” and are only read by bad and stupid people? Moreover, given that the Harry Potter series is the primary gateway a lot of younger kids have into enjoying books, who does it benefit to say that reading itself is something that’s only done by bad and stupid people?

What Makes Something “Interesting”

I’ve been using the Tumblr Top tool to look at some of the blogs I follow in an attempt to figure out what makes a post interesting to other people. As far as I can tell, viral posts have three things in common. They are…

(1) Specific
(2) Relatable
(3) Nonjudgmental

To give an example, “Nintendo please let us pet the dogs in BotW2!!” is (1) about a specific feature in a specific game, (2) relatable because people like petting dogs, and (3) nonjudgmental because Nintendo isn’t being overtly criticized for not including the feature in the first game.

To give another example, this bizarrely popular post of mine is (1) about a super-niche manga, (2) understandable to anyone who’s familiar with internet culture, but (3) not mocking the manga, furries, or the sort of people who are REALLY into horses.

I have many more examples that fit this model, but I’m not sure what to do with this knowledge. It’s a worthwhile observation, but I have no desire to artificially engineer viral shitposts on Tumblr. If I have ever done or said anything interesting in my life, it has been entirely by accident.

Vetting and Sharing on Social Media

I used to think that, the more followers a blog has, the more popular its posts will be. It only stands to reason, right? I also had this idea that artists have a lot of influence on Tumblr partially because of how the platform privileges images but mainly because of their relatively high follower counts.

I’ve since figured out that what’s actually going on is that a post needs to be “vetted” in order to spread. In other words, a post needs to be reblogged by someone whose taste other people trust. Or, well, “taste” is a strong word, as is “trust.” What I mean is that people are far more likely to reblog a post if someone they’re following reblogs it, even if they’ve already seen it posted on the original blog. If that “someone else” is associated with the same fandom as the post, then it will spread farther. In this case, “fandom” can be very broad; like, say, the “intellectual shitpost” fandom.

At this point I have far more followers than my small blog on Tumblr deserves, but it’s not my follower count alone that enables any given one of my posts to spread. By itself, one of my fandom-related posts might get forty to ninety notes, and it’s only when someone associated with the fandom reblogs it that it will get more than a hundred.

I’ve seen this happen on posts I’ve reblogged as well. Sometimes I’ll reblog something from a few months (or even years) ago, and it will go from having about twenty to thirty notes to having several hundred almost overnight.

Once a post reaches a certain level of critical mass, the number of notes alone will indicate that it’s already been vetted, and it will also be picked up by the site’s promotional algorithms. Before it can go viral, however, a post first needs to have community support.

I feel like the same applies to Twitter – albeit to a lesser extent, as Twitter’s septic open wound of an algorithm aggressively prioritizes a handful of tweets while hiding most of the rest, even if you turn off the “best tweets first” feature. As far as I can tell, Twitter doesn’t have the same “recommended for you” algorithm that Tumblr has, in which the posts liked by your mutuals – and the posts posted by people followed by your mutuals – will sometimes appear at the top of your feed. Rather, Twitter has figured out what types of tweets are most likely to provoke a reaction (generally negative) from you, and it shows those tweets to you over and over until you either like them, hide them, or blacklist whatever keyword or hashtag they’re using.

Regardless, I’ve noticed that there’s still something of an influencer culture on Twitter, whereby people are more likely to respond to or retweet something if it’s already been vetted by someone they trust, even if they already follow the OP.

Meanwhile, Instagram is testing a feature that will hide the number of likes a post has received specifically for the purpose of protecting the mental health of its users, and I for one could not be more relieved.

Every Seven Days

This comic was drawn by Elizabeth D. (@mushroomys on Twitter) and written by me, Kathryn Hemmann (@kathrynthehuman on Twitter).

I used to think the Japanese horror film Ringu was super scary, and the Hollywood version creeped me out as well. As I’ve gotten older, however, I’ve begun to find both movies silly and charming, especially since I would love to have a ghost friend come to visit through my television screen.