to be a better amateur

At the beginning of this chat I had with Noah Millman, you’ll note my caveat: I speak as a dedicated but decidedly amateur student of artificial intelligence. Noah makes a similar announcement. I was thrilled to be invited by him to discuss issues of the philosophy and theory of knowledge of AI, and I had a great time chatting with him. I announced my amateur status because I felt compelled to: whenever I write about more quantitatively oriented issues, people try to check my card — they make some sort of aggressive statement about my lack of expertise. Sometimes these statements are accurate, sometimes inaccurate, but the essential message is always the same: numbers-based ways to understand the world are meant to be discussed by a certain credentialed minority. I think that’s a terrible mistake, and in fact that’s why I was eager to have this discussion with Noah. I believe it’s essential for people with non-STEM backgrounds to be conversant in these topics, as they’re so important for the future. I think an informed conversation on AI between a guy with a background in writing and the humanities, and a guy with a background in history, finance, and the arts can be fruitful and useful.

To lay out my (beginner, amateur, but informed and passionate) understanding of AI, I’ve read Doug Hofstadter’s Gödel, Escher, Bach twice; I’ve read a significant majority of Artificial Intelligence: A Modern Approach by Stuart Russell and Peter Norvig; assorted books in cognitive science, ranging from the popular, like Steven Pinker’s The Stuff of Thought, to the academic, like Randy Gallistel’s Memory and the Computational Brain; a couple hundred articles, popular and academic; and a good deal of natural language processing that I utilize in my own academic research. Let me say straight up: there are large chunks of essentially all of this that I don’t understand, because I can’t follow the algorithms or the code, and the actual computer science that real understanding would require eludes me. Like I said, I’m an amateur. Just one who wants to learn as much as his amateur brain will allow.

There are some topics on which I am not an amateur. I’m not the type to act as though expertise is all a lie, which seems to me to create a tyranny of the ignorant. I will claim some expertise in the fields of writing assessment, particularly quantitative approaches; second language writing specifically and applied linguistics generally; writing program administration; and standardized tests of higher education, a topic which has occupied most of my attention for the past two years. There’s no bright line between things that I define as matters of my professional competence and those that I see as the interests of a beginner, but I maintain the distinction all the same.

I have, for the past six years of graduate education, gradually brought myself to a fitful and inconsistent understanding of statistics and research methods appropriate to my research interests. As I’m sure is common, this didn’t really come from some grand scheme to get quantitative. I just found that I had certain questions that I couldn’t answer without using numbers, and as time went on, I needed to know more and more. Which means that I know how to do a few things that are quite sophisticated and don’t know how to do some very basic things. When I want to do a simple confidence interval, for example, I often find myself reaching for a textbook. There’s something embarrassing about that, I guess, but I don’t mind too much; the point’s not to pass someone else’s test. It’s just to know how to ask certain questions, or how to find out the answers. It’s like anything else: you study and you think you know something, and then you learn more and you look back on your old understanding and you say, boy, I didn’t get it back then, but now….

Which is not to say that I am not subject to the insecurity that comes with attempts to develop quantitative skills. There’s so much ingrained disrespect for the liberal arts, and such a schizophrenic set of attitudes about quantification within them. Oftentimes, it feels like you just can’t win: your work isn’t serious if it doesn’t involve numbers, but if you incorporate numbers into a subject they see as unworthy, or in a way they see as unworthy, that’s ridiculous, too. In such a context, it doesn’t surprise me that many humanities people simply wash their hands of the whole thing, and say “they aren’t going to respect me anyway, so why not just do my own thing?”

I want to stress that a majority of the STEM-oriented people I’ve worked with (and in the last couple years in particular, I’ve worked with many) have been friendly, approachable, and generous. For the past two years I’ve worked with a series of international graduate students, almost all of them in the STEM fields, and the collaboration has been among the most meaningful and satisfying elements of my recent life. My statistics professor and those I’ve worked with in the statistical consulting service here, as well as my private R tutor, have been patient and kind with me. There have been a few people in the STEM disciplines here, and many more online who claim such credentials, who have been… less generous. That’s life, I guess. People have weird ownership issues over this stuff.

One thing I’ve gained: I am much less subject to mathematical intimidation than I once was. A lot of people (and it’s far from just in academia) will just try to wing claims by you by throwing in some numbers and statistical terminology. What I’ve developed is a lack of fear of really interpreting those claims, thinking them over, and performing a critical review. I won’t always know if they’re right or wrong, but I won’t fear looking deeper and saying if I’ve found problems. I’ve gained the confidence to inquire more deeply, and the framework for understanding how to learn more.

Sorry this is so scattered. I’ve tried and failed to write a post here a thousand times about the humanities, numbers, and the future, but I am apparently incapable of writing coherently on the subject. I guess I will just say this: I write about statistics and research methods here not because I know everything but because I so certainly don’t. I am feeling my way through, thinking my way through, and day by day getting a little better. That has never meant that I have left the liberal arts behind, or that I have come to embrace a purely quantitative or positivist way of knowing. I’ve just had these questions, and have wanted to find answers to them, and I think that I can do a service by talking through some of these issues from the standpoint of someone who has been growing and has more growing to do.

Neil deGrasse Tyson sometimes says that his shelves are filled with books from history and literature and the arts, but that professors in history and literature and the arts usually don’t have books on the sciences. I’m not sure that’s as true as he thinks; he’s obviously a brilliant man and a great science communicator, but he sometimes seems incurious about people. But either way: I am determined not to be one of those types, and would be moved by curiosity to learn more even without a philosophical commitment to doing so. I am convinced that there must be a way to pursue these interests without denigrating or sidelining the traditional values and methods of the humanities, and without suggesting that only numbers can tell us more about our world. My goal is simply to learn more, to gather more expertise where I can, and in those fields where I am sure to remain an amateur, to be a better amateur.

book review: Astra Taylor’s The People’s Platform

The internet, we are all expected to believe, is revolutionary, in several different meanings of the term. In the span of a decade or two, the digitally-connected technologies we refer to as the internet expanded from being populated by a few thousand academics, government officials, and cultish amateurs to a ubiquitous part of contemporary life. No industry has been untouched by this rapid explosion of infrastructure and attention, and none could afford not to engage its customers online. For researchers and teachers, the internet has completely transformed the way we investigate problems and solve them, and has opened up even the most remote classroom to more information than the greatest libraries in the world could once hold. But it’s not merely the communicative, economic, or academic changes that give the internet its outsized reputation. That reputation also depends on the revolutionary potential of these technologies, their ability to act as agents of change that can empower the little guy against entrenched authority, enable grass roots organizing, even spark revolutions that overthrow dictators. This portrayal of the online world, as a force not just for greater communication or commerce but for emancipation, has spilled out from the cheerleading technology press such as Wired magazine and into general interest publications like The New York Times, The New Yorker, and The Atlantic.

But quietly, a counter-narrative has begun. As the online world has matured, and the initial rush of the potency of these technologies has subsided, critics of the digital utopian narrative have begun to emerge. Astra Taylor’s 2014 book The People’s Platform is a clarion call in that new tradition, a book that could be to digital skepticism what Rachel Carson’s Silent Spring was to environmentalism. And it offers us a new perspective on the digital tools that we use in our day-to-day lives, a way to appreciate their power while recognizing that they are neither intrinsically good nor bad, but rather can be used creatively or destructively by individuals and society.

Taylor is not the first to throw cold water on the revolutionary potential of the internet and its subsidiary technologies. Nicholas Carr’s The Shallows (2010) argued that, when access to knowledge is ubiquitous and nearly instantaneous, we lose some of our independence and ability to think critically. Evgeny Morozov, a longtime critic of digital-era utopianism, published The Net Delusion (2011) in reaction to claims that social media had created the conditions for the Arab Spring and similar political movements, arguing that these simplistic takes distorted history and failed to recognize how authoritarian governments, as well as populist movements, can take advantage of new technologies. Jaron Lanier, an early pioneer of virtual reality turned techno-skeptic, wrote You Are Not a Gadget (2010) and Who Owns the Future? (2013) to document the ways that the new digital era has undermined individuality and the ability of artists and thinkers to make a living, respectively.

All of these writers explored themes similar to Taylor’s, and yet none of them, in my estimation, produced as powerful a critique. Without being a demagogue or dismissing the very real gains that the internet has brought, Taylor systematically, rigorously undermines the triumphalist narrative. She compares the hype about how the internet has changed the world to the reality, and finds the reality far less positive and more disturbing—in journalism and the media, in the arts, in the economy, and in politics. Throughout, she looks at the outsized claims made by digital utopians, academics and writers and Silicon Valley gurus who have claimed that the internet is a great leveler that will bring power and fame (if not fortune) to the masses. The predictions of prominent commentators like Chris Anderson, former editor of Wired, the academics Clay Shirky and Lawrence Lessig, and tech-industry powerhouses like Larry Page of Google are subject to meticulous review, and found wanting. Rather than being a tool of egalitarianism and liberation, the internet is revealed as a feudal system, one which has actually entrenched old power structures and cut the financial legs out from under entire industries. “In fact,” writes Taylor, “wealth and power are shifting to those who control the platforms on which all of us create, consume, and connect…. they pose a whole new set of challenges to the health of our culture” (9).

Take journalism. Journalism is a topic of obvious and unique importance in democratic societies, which require reporters and pundits to act as watchdogs over the government and to help alert the public to risks and problems. The internet has surely increased the ability of the average citizen to engage in this kind of activity, as Taylor acknowledges. Everyone with broadband access can potentially become an amateur journalist, sharing photos, starting a blog, and holding politicians accountable. Taylor summarizes the popular conception of this revolution, the belief that “the Internet has freed us from the stifling grip of the old, top-down mass media model, transforming consumers into producers and putting citizens on par with the powerful” (69). It’s a pleasing idea, and as someone who has engaged politically online myself, on blogs and social media, a seductive narrative. But as Taylor demonstrates, the truth is far less reassuring. Although it’s certainly true that it’s easier than ever for regular people to share their opinions, that doesn’t mean that reporting is any easier. Old school journalism—the brick-and-mortar work of asking questions and finding out facts—remains a very expensive proposition. With plummeting revenues from classified advertisements (thanks to free online alternatives like Craigslist), traditional newspapers have dramatically scaled back their investigative reporting units, including closing extensive foreign bureaus that once provided essential information on overseas events. Even the most dedicated of amateurs could not hope to replicate the type of reporting that led, for example, to the Watergate scandal, reporting that took time, money, and access. “Hit by a double whammy of technological change and a global recession,” writes Taylor, newspapers the country over “cut staff, slashed sections, or closed shop” (80). This has left many major areas of the country without any real professional journalism, meaning that there is no one to keep government and industry accountable.

The internet is often described as an attention economy, and yet as many aspiring online journalists will tell you, getting attention is not the same thing as making a living. Consider the story of Baltimore-area journalist Stephen Janis and his now-defunct website The Investigative Voice. In many respects, Janis would seem to have done everything right. After having been laid off by the Baltimore Examiner, one of many once-celebrated newspapers that have been economically devastated in the internet era, Janis started his site out of a perceived need for what Taylor calls the “nitty-gritty beat coverage that no one wants to invest in anymore” (84). The site was an immediate success, breaking big stories, drawing a modest but enthusiastic audience, and finding a comfortable niche for itself locally. But despite the quality of its content, The Investigative Voice struggled in terms of access and in terms of sustainability. Lacking the immediate potential to go viral and attract lots of clicks, it was very hard for the site to generate ad revenue. And though he was a well-known reporter with real connections in the city, Janis found it much harder to get access to important people and information without the imprimatur of a paper like the Examiner. Taylor extensively quotes figures like Lawrence Lessig who claim that institutions like newspapers are dinosaurs in the new era, and yet when it comes to having the kind of clout necessary to open doors, it seems that institutions still matter. Working long hours for almost no money, Janis and his collaborators eventually called it quits, and The Investigative Voice was shuttered. It’s a sadly common reality on the internet: success in terms of praise, recognition, and attention in no way ensures financial stability.

Nowhere is Taylor more convincing than in her discussion of how the internet has (and has not) changed life for artists, musicians, writers, and filmmakers. Herself an established, respected documentary film director, Taylor has great credibility to demonstrate how the reality of professional arts and media has not matched up with utopian predictions. Taylor talks openly about the problems with the older models of media production and show business, describing them as closed, hierarchical systems that typically rewarded only those who were already connected and successful. One of the strengths of Taylor’s book is her refusal to romanticize the old systems as she critiques the new. As she writes, “in discussions of digital culture, complex dynamics are reduced to stark, binary terms” (169). Her own habits are far more nuanced. But even with that nuance, her portrayal of the basic economics of the culture industry is stark and frightening. Widespread digital piracy has severely reduced revenues in the music industry, for example, and digital streaming sites like Pandora and Spotify have not come close to replacing them. A similar dynamic has begun to play out in the world of filmmaking, with technologies like BitTorrent and faster broadband access making movie downloads easier and easier. And despite the fact that piracy is often described as an attack only on rich corporations, Taylor demonstrates how independent, low-budget artists like herself are feeling the squeeze. She interviews Jem Cohen, himself an acclaimed independent filmmaker, who reflects on his disappointment at finding that a low-budget documentary he directed had appeared on file sharing sites before it had even debuted publicly. “Sometimes we need to remind ourselves,” Taylor quotes Cohen as writing, “that the relationship between those who make creative work and those who receive it should be one of mutual support” (167). But with illegal file sharing simple and ubiquitous, and the online cultural attitude largely laissez faire, few on the consumer side hold up their own end of the bargain. The result is not the level playing field we were promised but instead an online economy where the Googles and Facebooks rake in millions and artists are increasingly unable to make ends meet.

Taylor’s analysis expands into politics, education, and the labor market, demonstrating how internet technologies have concentrated power in the hands of a few large entities and undercut the ability of individuals to make a living. But there is some hope. Taylor does not believe that the genie of technological change can be put back into the bottle, and would not try even if she did. Rather, she advocates for seeing technological change in terms of potential—the potential for both good and bad. “A more open, egalitarian, participatory, and sustainable culture is profoundly worth championing,” she writes, “but technology alone cannot bring it into being” (232). Instead, we must recognize that internet culture will be what we make it. Taylor advocates for a new spirit of conscientious consumption of online media, and she sees a new spirit of digital skepticism as emancipatory rather than pessimistic. If the people who read, learn, debate, shop, create, and engage online work together to make the internet into the accessible, egalitarian space we were promised, “only then,” writes Taylor, “will a revolution worth cheering be upon us” (232).

I read books I enjoyed more in 2014 than The People’s Platform. But I didn’t read any books that were more important. It’s hard to imagine a more timely argument, or one more worth making.


cautionary tales: too many aphorisms

As I and others have written about endlessly (and as you’re likely bored of hearing), the curious economics of online politics and culture writing leads to too much and too little at the same time. We produce a huge amount of content because hitting click targets requires endless churn, particularly given the need to stay on top of the stream. Yet we also have a much-discussed numbing sameness; rather than producing endless variety, the huge volume of content being created produces homogeneity. Too much and too little: too much getting written, too little difference.

The scramble to differentiate your bit of content, especially in a world where websites mean nothing, has led to the headline tricks everybody hates and similar annoyances of promotion and sharing. But it also results in many writers adopting exaggerated styles. I like a lot of oversized, inflected styles, and think the preference for flat “plain language” writing style is boring aesthetic conservatism. But you’ve got to really commit to it and you’ve got to be able to pull it off. Too often, people trying to stand out from the #CONTENT pack pepper their writing with grand statements, would-be aphorisms that stick out awkwardly and distract. So consider this from Grantland’s series of team-written essays organized around a theme, a series I generally enjoy. In order:

Alex Pappademas: To be a critic is to have a dysfunctional romance with a thing you love. [with bonus lampshading]

Sean Fennessey: Anger is a hammer. It blunts and flattens. It doesn’t allow for nuance or daintiness. It hits and it can hurt.

Amos Barshad: The Score was so big that it destroyed lives.

Danny Chau: Some days all you need to achieve a fullness in life is a fullness in stomach.

Alex Schulz: I have no doubt that “Hey Mama” is the best mother-centric song in existence.

Chris Ryan: it was like the moment at a party when you’re trading bullshit anecdotes with someone you don’t know very well and they suddenly tell you about serving time in Lompoc or something. Shit gets dark.

I find reading this to be just like being stuck at a party with someone who’s done too much cocaine. Everything is a pronouncement, wisdom from the mountain. It’s a kind of sweaty desperation to make what you say worth listening to by tying everything to What It All Means. Who talks like that? Well, a lot of writers, at a lot of publications, these days.

And just so you know I’m not just picking on Grantland here, check out Steve Hyden’s response. Hyden and I do not see eye to eye on music, to put it lightly, but his missive satisfies the purpose of the group writing exercise without resorting to fortune cookie pronouncements. His first line has an aphoristic quality, but it’s firmly restricted to the actual topic at hand, and his section concisely conveys his meaning through a direct and conversational expression of his ideas. Not flashy, but so effective.

Like all writing advice, this can be discarded at will and at whim. Some of my favorite writers can do the aphoristic thing well. But you gotta have real chops. Not everybody gets to be Buckminster Fuller.

moving funding drive

Hey guys, ever since my graduation a few weeks ago, people have asked about setting up a tip jar or similar. So I’ve set up another GoFundMe, which will fund a U-Haul and some various moving expenses for my upcoming move to parts unknown at the end of June.

I’ve been a bit reluctant to do another funding drive. First, because whenever I do, it really brings the boo birds out of the woodwork, and they’re not exactly shy to begin with. Asking people for help is always a type of vulnerability. But second, because I’m really not sure how much longer I’ll be doing this, and I don’t want to solicit donations and then stop and have people feel tricked. This hasn’t been fun for some time. Well, I guess it’s never really been fun. But it seems less productive than ever, and to be frank with you, I just don’t think the online world is set up for stuff like this to be useful anymore. It’s not like I’ve ever mistaken my blogging for Doing Something. I’m not that naive. It’s just that I can’t fulfill the basic function of “the world’s messed up and here’s why and what do you think about that?” I like arguing and always have, but I like arguing about what I’m actually saying, and right now you spend tons more time insisting that you aren’t saying what others claim you are, and that’s just no fun and no good for anybody. I hear this basic complaint from all kinds of writers I know.

This might sound like another of my complaints about left-wing communicative practice, but it isn’t; this is a non-ideological phenomenon. I try never to look back on the good old days (mostly because they weren’t that good) but as someone who’s done this for too long, I’m not afraid to say that things have never been less charitable out there. I don’t mean that in the sense of people having sympathy for your actual argument, but in the sense of people just trying to understand what your argument is in the first place. My default assumption now is that many people will look for the most unflattering, damning reading of anything I write, so I write everything as this tedious lockbox of doubling back and self-protective asides and constant explicit statements of what I’m not saying. It’s just so ugly, aesthetically, stylistically. But the alternative is being harangued for days about a position I don’t hold. Not worth it anymore. The deal right now seems to be: OK, you can put your words out here, but the crowd is going to sift through them for the absolute least charitable reading of those words, and that’ll become the default way they’re read– so do you still want to do it?

I don’t mean to complain so much. This has always been a privilege. Really. My great privilege. And maybe this is just a classic case of an old guy who’s been left behind in an always-changing culture! I wouldn’t bet against that. Either way.

I never, ever say never. I stopped at my old blog and started again and people thought that was really goofy. And as you know I am moved to write only by compulsion. It wouldn’t surprise me if I kept going, and this won’t be the last post. But I can’t tell you there’s much more to come. So I’m just saying that if you’re interested in donating, you have to do so with the understanding that it’s an attaboy for what I’ve done in the past, and not payment for future work. I don’t want anybody to donate under a false pretense.

I won’t stop writing. I do have a bunch of freelance work in the pipeline. It’s great, but I gotta tell you, I don’t know how freelancers do it. In particular, the need to develop and redevelop pieces without knowing if they’ll ever be published, and thus paid for, is tough. It’s particularly difficult because it often takes a long time to get paid after your work is published. It’s taken me a long time to realize that from agreement to payment is a matter of months rather than weeks. The conditions are rough out there in an online world full of people who just want to write and who are constantly undercutting each other. I have a couple of pieces going in big-deal print publications, and the difference in labor conditions is so stark. It’s not just how much better the money is. It’s the fact that I’ve got contracts, which include kill fees that obligate them to pay me something should they decide they don’t want to run what I’ve written. Extensive edits of writing that you’re getting paid for are part of the deal, it’s natural and perfectly fair. The problem is that with online stuff now you’re stacking your original work on top of revisions and edits and yet you have no guarantees that it’s going to turn into something at all. I’ve been working on a bunch of stuff lately, but I’m still just a part-timer. I’m genuinely in awe of people who can pull it off as full-time freelancers.

Anyway: if you can donate and want to, I would really appreciate it. Here’s the link. If you can’t or don’t want to, I understand that, too. And if you want to throw a little shade, I’m open to that too. I have always seen this as a vaguely ridiculous enterprise that I take very, very seriously. If I wasn’t open to being made fun of, I would be some kind of hypocrite. It’s all love, you guys, it’s all love.

If you want an endorsement to motivate your donation, well, let me share this wonderful comment from longtime reader Dillards Homecoming Dress:

“Hiya, I am really glad I have found this information. Today bloggers publish just about gossips and internet and this is really annoying. A good web site with interesting content, that’s what I need. Thanks for keeping this site, I will be visiting it. Do you do newsletters? Cant find it.”

I couldn’t have said it better myself!

“our situation is unique”

I generally find radical transparency kind of creepy, but I think Gawker Media opening up its internal discussion about a union effort is useful, so I encourage you to check it out. [edit: useful to me. Maybe not so useful to individual employees who feel intimidated about it, it occurs to me.] The particulars of Gawker Media’s union drive are not my business. But whether or not a workplace like Gawker Media can be unionized is everyone’s business. Especially in an industry like online writing. Because the lives of employees in that industry, if you ask me and others, are about to get markedly worse.

I’ll just say this. Stef Schrader says “our situation is unique.” I don’t blame her really, as that’s a trope in union organizing. Nothing about the situation as described in the comments is unique, most certainly including people claiming that the situation is unique. In fact, all of the complaints I’m seeing from those who oppose the union are absolutely commonplace to organizing efforts, again including the notion that those complaints are unique to that context. In particular, meta-issues of fairness, communication, and transparency are as common to union organizing as you could imagine. So too are the claims that people voting no are themselves deeply committed to unions and organizing in general even as they oppose the current effort in specific. That’s something longtime union organizers hear all the time. None of this means that the complaints are wrong, or that the current union push is a good idea. Like I said, it’s not my business. It also doesn’t mean that people are insincere when they say that they support unions. It just means that frequently a big impediment to a real union is the hypothetical union, the next union, the future union.

I guess the union efforts of the 21st century are going to look a lot like those of the 20th.

the problem with experimentation

I thought that this piece by J.K. Trotter on Tom Hardy’s past partial admission, then sort-of-denial of having sexual encounters with other men was interesting. It simultaneously made me feel a little sad that Hardy (or more likely, his people) would feel the need to be aggressive in defining the story, while also making me understand why they would react that way. After all, people made a really big deal about it, as Trotter notes. It demonstrates that, while gay people have faced and continue to face a unique level of discrimination and threat of bodily harm over their sexual identity, there remains something uniquely controversial about same-sex attraction or sex between people who don’t identify as gay. (Or queer, or similar.)

Two points: one, I think it’s a shame that so many of the pieces that discuss such issues use the terminology of experimentation. It’s natural, I suppose, but it really limits the world of same-sex activity by people who don’t identify as gay. What if a straight-identifying person wants to have sex with someone of their gender without viewing it as an experiment? What if someone wants to avoid self-identification in that way entirely, but is not in any sense experimenting?

Second, the way in which same-sex sexual activity is inherently newsworthy (whether on a large-scale for celebrities or small scale for one’s personal social circle) stems from some progressive attitudes as well as regressive ones. The latter, traditionally, view same-sex sexual activity as sinful, or dirty, or shameful, or otherwise pathological. But though they view same-sex sexual experiences positively, some progressives (I stress: some) mimic one aspect of regressive attitudes — namely, that same-sex attraction or activity are inherently a really big deal. In other words, a lot of good, decent, well-meaning people have replaced the notion that same-sex sexual activity is a terrible shame that is unmentionable and worthy of condemnation with the notion that same-sex sexual activity is a wonderful act of discovering one’s identity that should be celebrated. Obviously, the latter is worlds better than the former. But both treat such acts as existential — as defining the person in one particular way. And self-definition is, well, terrifying, in a lot of ways, and difficult and personal and private. What we should pursue is the right of the individual to determine how big of a deal it is for him or herself — for some to be defined through their sexual identity and some not to be.

Under those conditions, the language of experimentation makes sense; it’s a way to linguistically preempt the notion that one’s identity should be inextricably bound to who one has sex with. It’s useful, in other words, even if it’s somewhat distorting. But I think better alternatives are possible.

don’t lampshade me, bro

In what I truly hope is the nadir of pop fans whining about the mere existence of people who don’t like what they like, Rob Harvilla deploys a tactic I’m seeing more and more of lately: preemptively acknowledging a broader controversy as a way to avoid having to comment on it, even when the subject of your piece demands engagement. Harvilla is mad that a single indie rock guy dared to publicly express his distaste for a popular artist. Because pop hegemony is now so complete, and the social pressure to like pop music so intense, Harvilla has to trot out every cliche and produce some obligatory, exhausted pro-pop shaming. This is, undoubtedly, a part of the great Poptimism vs. Rockism “debate” — a debate as real and evenly matched as the Harlem Globetrotters vs. the Washington Generals — even if Harvilla would prefer not to cast it in those terms.

Rather than confronting that facet of his argument, though, he’d prefer to avoid it. So he lampshades that debate: he nods briefly in its direction as a way to placate criticism for avoiding it, but doesn’t actually do anything to satisfy his need to talk about it. It’s a very neat trick: I don’t have a way to respond to this kind of criticism, or else I just don’t want to have to be bothered to respond to it, so I’m going to throw in a few words that wink at the fact that it exists and carry on with my day. I see this all the time from professional opinion writers now, and it’s so, so lazy. “I know that this criticism exists, now let’s move on” is not cutting it, you guys. What’s your response to that criticism?

As for the debate itself, well, I think it’s as tired as Harvilla claims it is — and yet he still finds it necessary to embrace its most tedious cliche, which is the poor lamented downtrodden millionaire pop star. Taylor Swift has millions of dollars; she’s an idol to millions of people; her records receive not just critical respect but critical acclaim; she is literally broadcast into all of our homes. There are very few laurels our species gives out that we have not already awarded to Taylor Swift. And Bejar goes so far as to try to ward off criticism like Harvilla’s in advance; his is as mild and unassuming as criticism gets. But in today’s world of total pop hegemony, even that’s too much for Harvilla: someone else out there doesn’t like something he likes, so it’s time to take to the battlements and punish the apostate.

This is just true in my own life: when people tell me they don’t like what I like, I say “It’s not for everyone.” If it’s a friend or someone who I think is on the fence and could find an explanation of what I like helpful, sure, I’ll tell them why I think the stuff I like is good. And I’m not going to change my mind and say “you’re right, it’s bad, only pop is good.” But always, I’m willing to say: it’s not for everyone. Like its inverse, “it’s not for me,” saying “it’s not for everyone” is a way to acknowledge the wonderful diversity of legitimate tastes. That’s what makes art great: difference, difference of opinion and of method and of style and of genre and of goals. When I tell people online that I don’t like Taylor Swift? They tell me that I’m out-of-touch, snobby, elitist — “you like things other than the things I like, so you are a bad person” — or even worse, they tell me that I’m lying, and that I don’t really like the things I say I like — “no one can possibly like things other than the things I like.”

I will never in a million years understand it. Why is the existence of differing opinions about music so immensely threatening to people?

it’s all in there

My discussion with Jay Caspian Kang, about online activism and the future of the left, has drawn to a close. I really want to thank Jay for the invitation and for the spirited pushback. It’s been a great opportunity to flesh out my ideas, and to do so with direct and muscular disagreement, which is always useful as a writer. I also want to say that I think I am ready to move on from this line of criticism. I have one last piece on these issues that I’ve been working on, for some time, for another venue, but it’s more a personal story of my own evolution and much less a critique of current practices. (And honestly I’m not sure if I’ll ever be able to get it into a form they’ll want.) At this point, I’ve made and remade my argument, everyone remotely interested knows how I feel, and I don’t want to pile on or have it become my only shtick. (I have many other shticks I’m more than happy to flog.) So, in the future, I think that this conversation with Jay is a good way to understand what I mean with this line of thinking — if people are, in fact, interested in knowing what I’m thinking, which isn’t always the case.

Ultimately, I have only tried to stress that there is a difference between sharing a vision of a better world with people and agreeing with them about how best to achieve it, and to make a case for what today seems like a radical departure: that being good is not an instrument of doing good. I’ve made the case, and it will stand or fall as wisdom or folly in time.

Mad Max’s moderate feminism and radical egalitarianism


Spoilers ahoy.

At the end of writer and director George Miller’s Mad Max: Fury Road, a rebooted continuation of his classic post-apocalyptic series, a quote appears: “Where must we go, we who wander this wasteland, in search of our better selves?” The quote is attributed to “The First History Man,” a nod to the time before the apocalypse in question. The sentiment, then, comes not from the world of Tom Hardy’s Max Rockatansky and Charlize Theron’s Imperator Furiosa, the warrior woman at the center of the film’s plot and themes and the indelible figure of the movie. It comes from our world, from the pre-apocalyptic world, which means it confronts us in our figurative wasteland as thoroughly as it does those characters in their literal one. What do we have to do in a world that, though lush and bountiful in comparison to the starved world of Mad Max, is still filled with injustice?

We might see some of that challenge in the negative response to the film from a particular, particularly troubling perspective. The film has earned, and thoroughly deserves, a reputation as a modern action masterpiece, a hyperkinetic journey that proves the continuing relevance of practical effects and character-driven storytelling. Rapturous reviews have implored audiences to go see the film, in order to reward the faith of Miller and his team in the ability to create a summer spectacle that has heart, vision, and integrity. But dissent has bubbled up from a noxious source: the Men’s Rights Activists, or MRAs. The MRA movement believes that feminism has corrupted contemporary gender relations, relegating men to the status of second-class citizens and upsetting a natural order where men are born leaders, warriors, and workers, and where women are better served in roles of domestic nourishment. MRAs have made news lately for loudly decrying the plot of Fury Road, in which Theron’s noble warrior and a cast of powerful women drive the action and make the most noble sacrifices. To MRAs, this constitutes an inherent degradation of the character of Max and, through him, of men writ large.

(Some have complained that the MRA rage over the film is largely a media exaggeration, thinly-sourced and replicated endlessly. Maybe so! But, I mean, this guy exists. It’s not a wholesale invention.)

Some of the film’s champions have played into this narrative, with many reviews calling the film an inversion of the traditional action film trope of heroic men rescuing at-risk women. Deadspin’s Will Leitch, for example, writes that “Max himself is oddly passive and unimportant to the plot: It’s the women, particularly Theron’s Furiosa, who drive the action and make all the difference,” standing in contrast to “idiotic men and their overcompensating toys, killing each other and everyone else, just as they’ve done since the beginning of time.” That seems to confirm the MRAs’ take on the plot, though hardly their political stance towards it. Certainly, such a movie could be made and made well, a radical tale in which men are revealed as inherently incapable of reform. I’d watch that movie with interest.

But that isn’t the movie Miller made. It’s just inaccurate, for example, to call the men passive characters. Max takes many crucial actions in the film, as does Nicholas Hoult’s renegade “Warboy” Nux. Without either of them, the caravan of heroes would never have survived. Indeed, the film’s screenplay is as comprehensively egalitarian as I can imagine: every single character within the group of protagonists plays some essential role in the conflict. Yes, Furiosa is the linchpin of it all, the one whose courageous decision sets the plot in motion, and the most effective combatant and driver in a world where fighting and driving are everything. And it’s indeed great to see a blockbuster action film that is so unambiguous and direct in its portrayal of heroic, competent women. But it seems to me to be a misreading to say that the many potent women characters in the movie succeed by replacing the men. The hero of Mad Max is really a family of heroes. The movie’s commitment to a truly communal vision of heroism is perhaps its most radical, most affecting stance.

Watch this scene.

God, I love this movie.

Yes, in this scene, a man in a group of women advocates for the eventual course of action. But he’s been brought to that place by the decisions of a woman, acting on behalf of other women. And the decision is not his alone. Multiple women join in the dialogue, and the person they are trying to convince, the closest thing the group has to a leader, is a woman. People make their appeal; they state their point of view. The group comes to a decision. This isn’t some Amazonian warrior woman leading by imperious decree. It is, instead, a story of a family of spontaneous heroes who, in a world begging them to focus only on their own survival, find within themselves the courage to sacrifice for the good of others. Watching the film a second time, I felt a kick of aggravation at the endless “Chosen One” narratives that are heaped on us again and again in modern movies. Max Rockatansky is the opposite of a Chosen One. He is a guy who wants to care only about survival, and yet finds within himself angels enough to put his life in danger in the defense of others. I think of Ratatouille’s claim that a great cook could come from anywhere, and realize that the claim here is the same: heroism emerges from the flux of life in the hearts of those who are brave enough to choose it in the face of adversity.

These themes are explored in a brilliant essay by Maria Bustillos. Bustillos has, in a low-key and patient way, explored the relationship between feminism and reconciliation for years now. See, for example, her review of Hanna Rosin’s The End of Men, in which she writes: “I believe that each of us — all human beings who share the same seemingly limitless abilities, and the same unfathomable doom — should be able to develop his or her potential and live freely and on equal terms in a condition of mutual respect and support.” This statement is remarkable in that it is simultaneously natural and unobjectionable, on its face, and yet in context risky, as Bustillos is pointedly contrasting this with the zero-sum school of feminism that she accuses Rosin of. (Accurately.) In the context of contemporary dialogues, such a stance could be easily misrepresented. Some could take Bustillos’s claim as the equivalent of #AllLivesMatter or similar weaksauce derailing, attempts to neuter passionate political rhetoric with gestures toward vague universal claims as a replacement for the specific demands of outraged people. That isn’t Bustillos’s project, as I understand it. Her goal seems simple and radical, uncomplicated yet challenging: to find within contemporary culture the blueprints for the better society that we must build in order to survive. And she recognizes that we can only make that world together. “Max leaves her at the end of the movie, still the quiet loner who shows no emotions,” she writes. “But I think he’ll be back.” I hope to god George Miller proves her right.

No, men aren’t sidelined in Mad Max. They aren’t considered irredeemable, either. Redemption is in fact the movie’s strongest theme. Max is plagued by visions of the people he has failed to save in his life, a series of hallucinations that strike him at the worst times and contribute to his stance of proud hopelessness. He is granted at least a small reprieve in the course of a film where he helps many women, even though these women are perfectly capable of helping themselves. Nux, meanwhile, is a character who should be as hard to rehabilitate as possible, an angry young man constantly hopped up on chemicals who endured a lifetime of brainwashing and was raised only to be a killer. Yet he is judged and, ultimately, redeemed. When Furiosa leads her caravan to her old clan, a pack of keen-eyed elder warrior women, they initially distrust the two men traveling with her. But Furiosa makes her case, telling them that the men she travels with have helped her and her friends, that they are worthy. So the wise warrior women accept them into their band.

The moment is crucial: yes, men are capable of being redeemed, even in a world ruined by men. But first they must be evaluated. There has to be a reckoning of their individual characters. After all, redemption requires judgment. In order to be redeemed, one must wrestle with one’s past. When Furiosa presents her companions to her clan, she is required to make her case, to assuage their worries, by telling them about the specific actions and character of the men in question. In a similar way, we as thinking, progressive people must be willing to grapple with the past and present of gender relations before we can feel like integrated and valued members of an equal society. None of us are required to answer for the crimes of our gender, and despite MRA rhetoric, essentially no men ever are. But all of us must take stock of the continuing horrors of patriarchy if we are to be part of a feminist, equitable world, and we must be willing to be interrogated on our contribution to the building of that world. Redemption is possible, but only with a willingness to be judged and a commitment to being our better selves.

Mad Max: Fury Road refutes the MRA worldview, then, in two ways at once. It refuses to play to the zero-sum gender narrative that they’ve imagined, where women acting as leaders and warriors must necessarily leave men in the (figurative and literal) dust. But it is unflinching in its portrayal of a world destroyed by men and their violent, rapacious acts. A modern masterpiece, Fury Road doesn’t compel us to hate its titular character or men in general. The film embraces equality, but it’s a hard-won, brutally honest, and adult kind of equality, not the greeting card variety. Without ever falling into moral didacticism or the stereotype both critics and supporters have made of it, the new Mad Max film shows us how rich, entertaining, and challenging blockbuster films can truly be.

all things go

PhD Daddy and Me

Well folks, grad school has come and gone.

It’s beautiful on campus. I grew up on a college campus; my earliest memories are of playing under my father’s desk while he met with students. My parents met on campus. My maternal grandfather ran the post office and general store at a college; my paternal grandfather was a professor. I have spent a majority of my adulthood in school. While it’s a bit embarrassing to say, I really don’t function well anywhere else. Campus is really the only place I’ve ever fit in or made sense.

Now I’m in a spot where it’s unclear if I’ll get to return. I am a good academic. I love to teach and I can really do it. I really can. I really come alive in the classroom. And I’m built to be a researcher, as I am a compulsive writer (in the old school sense of it being out of my control) and habitual reader. I think my stuff is good. But I hardly need to tell you that many talented academics are left out in the cold in current labor conditions. I am hardly giving up. I have a very strong CV, and I’m still applying now, and there are some jobs I’ve interviewed for lately that would be awesome. I will also give it another go next year if it doesn’t break for me this spring or summer. If it doesn’t work out for me on my second go-around, then we’ll see. Obviously, my compulsions to write, and my style in writing, do not always help. Like the lady sang, “I need to tame this wild tongue if I’m to touch these white streets,” but all these years later I’m still learning, still regretting. I left my last class a few weeks ago hoping that it won’t be the last I ever teach. I am not complaining. I am in a really rare spot of freedom. I can go anywhere and do anything. I just may not get to stay on campus any longer. If not, I’ll figure it out. You don’t always get what you want in life!

I’ve made a lot of mistakes. I’ve made a lot of mistakes.

I have more regrets than I can count, but the last six years of MA and PhD, the decision to commit myself in this way, aren’t among them. I am a far healthier and more functional human being than when I started, although there too the work is ongoing. Going to grad school was the best decision I ever made. Looking back, I think of the work. The endless hours of reading books, of chewing through terrible old facsimile PDFs, of drafting and redrafting papers, of staring at a paragraph of text from some 19th century Scottish rhetorician and trying to make it make sense, of laboring over a graphing calculator as I agonizingly dragged myself to competence in stats, of grading huge stacks of papers, of mildewy lecture halls, of the stacks in the library, and always of the paper, the paper and the ink. The work is what I was in for. I know that sounds impossibly pious, and I promise that I’m not trying to prove my virtue. I’m only saying that, as the person I am, the work is ultimately the only satisfaction, the only tool to quiet my unquiet mind. I only regret that those who built this smokeless fire inside of me and left too soon were not there to see any of it. If I get to stay, I will give thanks every day. If I leave, I will be OK, and I will leave behind a folder on my computer that houses within it 2000 nights of grinding, grinding, grinding away, for my own self, 2000 nights of purpose. And I did it all for you, dear father, nurturing mother, for you.