Aaron Bady and Mike Konczal
- fdeboer AT purdue DOT com
“In 1975, Derek Bok, president of Harvard, asked Dean K. Whitla, director of the Office of Students, to verify the widespread belief that undergraduates were leaving Harvard-Radcliffe as writers no better than when they entered. Whitla ran a meticulous study of first-year and fourth-year students at five institutions and concluded that the ability of Harvard-Radcliffe students ‘to present an organized, logical, forceful argument increased dramatically over the college years’ (35). Whitla’s unexpected finding was followed by what I will call the Bok maneuver. Forced to report to Harvard’s Board of Overseers the unpopular news that their undergraduates really were developing their writing skills, President Bok said the gains were not ‘substantial’ enough, and that ‘many students showed no improvement’ (13–14). Bok’s maneuver has remained common in attacks on US education. The USS Academia is off course, the argument goes, and any evidence to the contrary is belittled, or just jettisoned.
The authors of Academically Adrift: Limited Learning on College Campuses
execute the Bok maneuver skillfully, with an additional twist. The evidence they
dismiss they gathered themselves. They compared the performance of 2,322
students at twenty-four institutions during the first and fourth semesters on
one Collegiate Learning Assessment task. Surprisingly, their group as a whole
recorded statistically significant gain. More surprisingly, every one of their
twenty-seven subgroups recorded gain. Faced with this undeniable improvement,
the authors resort to the Bok maneuver and conclude that the gain
was ‘modest’ and ‘limited,’ that learning in college is ‘adrift.’” — Richard Haswell, review of Academically Adrift, February 2012 issue of College Composition and Communication (63:3).
I don’t like bloggers much.
This might seem odd to you, given that I am a blogger. But that’s not really odd when you consider that I don’t like myself much. Still, category “blogger”: what exactly is that supposed to be? Why should I allow myself to dislike such a vague, meaningless category? And fair enough. (This is one of the reasons I don’t much care for myself.) But if you’ll permit me the irrationality… I don’t like bloggers much. And I can articulate different reasons for different varietals of bloggers. DC politico types, AV Club culture bunny types, Manhattanite try-not-to-look-like-you’re-trying-hard-while-you-desperately-try types…. It’s a virtual bestiary. Like I said, this is not my best feature.
One thing that seems common to bloggers of many stripes: the idea that the world is full of only lucky duckies and chumps. If you’re perceived to have too good of a gig (like, say, tenured professor) you must necessarily be a lucky ducky who is living off of largess and must, in the name of all that is good, be taken down a peg or three. On the other side of the coin, if you have made a choice that seems difficult to understand from the outside (like, say, to attend graduate school) you are a chump, who deserves only to be mocked for making such an obviously stupid mistake. As a doctoral student, I am very used to this argument. You’ll note that they can change places; for a very long time, going to law school was the purely practical ideal, a downright mercenary act in comparison to the airy, pretentious romanticism of going to graduate school. Now, going to law school earns mockery that transcends even what’s reserved for people getting their PhD in French poetry.
What animates both of these is the same impulse: people who, when you get down past their loud derision for others, don’t feel very good about themselves or what they’re doing. That’s true of these writers and it’s true of their readership. That’s the dynamic that has made Gawker such an economic powerhouse for so long: the use of other people’s perceived unhappiness to distract you from your own. Trust me: no one who writes for Gawker, or any of these sites which act as arenas for the endless cultural competition of overeducated white people, wants that to be their final gig. Nobody yearns for that job; nobody grows up dreaming of being particularly bitchy on the Internet. Hard to say what Gawkerites or similar actually think of as the right life. Sometimes I think that they only respect other people who are professional bloggers. Sometimes I think that they are happy only with people who live quietly miserable existences in dispiriting cubicle hell. After all, that level of quiet sadness is something people understand, something they can compartmentalize.
How to respond to that is something of a challenge to me, the only kind of challenge that really matters, the challenge of greater empathy. Things are hard out there. They’re not equally hard for everybody but they’re hard in many places and people are doing their level best in spite of that hardness. What makes me more sorry than anything is that so many have responded to that difficulty by lashing out at other people who are trying for something more. We have such a cannibalistic culture now. The idea of wanting something, particularly if that something is beautiful and bigger than yourself, is considered pathetic. And the only thing worse than wanting and failing is wanting and succeeding.
So it’s nice to read that Will Wilkinson is going to school to get his MFA, and it’s nice to read an announcement that is so free of irony, defensiveness, or guile. I find it quite gratifying in the current climate to read someone say “I want more out of life, I want to make something beautiful, I recognize that the odds are long but it’s worth trying, and I’m willing to take my swings despite the long odds and poor economics of the decision.” Good for him, and shame on all of the people who would mock that impulse or that directness and purpose.
Some people are getting overly worked up about this study, which showed a high correlation between machine scoring and human scoring of certain writing tasks. Some of it is glee from people in the university-hating media set; there’s also some rending of garments by those in the humanities who love nothing more than the excuse to rend their garments. As is typically the case, the reaction is out of proportion with the evidence.
First: this is not really news. For the types of writing tasks tested in this study, computer scoring has long been highly correlative with human scoring. You occasionally hear that writing can’t be reliably assessed quantitatively, again sometimes by people who want to squash writing and the humanities as respected disciplines and sometimes by people who are afraid of quantitative assessment in the humanities. The truth is that, for that kind of test and with careful construction of a grading rubric, inter-rater reliability between human scorers can be extremely high, with high .8s/low .9s fairly common. For short order essays on particular prompts, without research requirements and oriented towards five paragraph essay formats, organizations like ETS have got reliability down pat. It’s thus little wonder that computers can achieve similar correlation. (Such computerized assessments are notoriously susceptible to awarding high scores to deliberately nonsensical essays, but some have reasonably responded that you have to have a pretty strong grasp on the elements of sentence construction and paragraphing to create that kind of false positive anyway.)
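To make those reliability figures concrete, here is a minimal sketch of how a correlation between two scorers is computed. The scores below are invented for illustration; they are not data from the study or from ETS.

```python
# Illustrative only: invented scores for ten essays on a 1-6 scale,
# as a human rater and a machine scorer might assign them.
human   = [4, 5, 3, 6, 2, 4, 5, 3, 4, 6]
machine = [4, 5, 3, 5, 2, 4, 4, 3, 5, 6]

def pearson(xs, ys):
    """Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(human, machine)
print(f"inter-rater correlation: r = {r:.2f}")  # r of about 0.90 for these invented scores
```

For these made-up numbers the two scorers disagree on four essays, each by a single point, and the correlation still lands around 0.90, the “high .8s/low .9s” range described above. That is the sense in which agreement on tightly rubric-bound tasks is unremarkable.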
I’m excited to say that I’ll be teaching a learning community of first generation college students next semester. (That is, students whose parents did not attend college.) Most people don’t associate that demographic with educational disadvantage the way they do with low socioeconomic standing and racial minorities. In fact, first generation college students suffer on a variety of metrics, including dropout rates, grades, time to graduate, employment post-graduation, and obtaining graduate degrees.
At Purdue, learning communities are small groups of students who share some relevant characteristic. In addition to first generation college students, there are learning communities for student veterans, engineering majors, students from a particular language or ethnic background, etc. Students take three classes together, usually live in the same residence halls, and get additional out-of-classroom academic support services. I’ll be taking them on activities outside of the classroom myself, typically with some sort of academic purpose, but often enough just for social cohesion and fun.
The learning communities program has a great track record of getting better results for students. Additionally, participating fits my research interests perfectly. I worked with a somewhat similar program at my master’s institution, and I really enjoyed it. I’m excited for next fall.
I could, if I felt like it, take this latest anti-university screed from the Atlantic apart. Laura McKenna’s piece is one of the most tired, cliched articles I’ve read in years, a collection of the classic warmed-over complaints about ivory towers and uninterested professors. It works purely on argument by assertion, speaks only in generalizations and stereotypes, and seems almost offended by the idea that arguments have to be defended by evidence.
I could, for example, point out that teaching demonstrations are a big part of the hiring process in many or most fields. I could point out that there are dozens of journals, hundreds of conferences, and thousands of articles dedicated to practical pedagogy. I could point out that organizations like the Modern Language Association have been agitating for more focus on teaching in hiring and promotion decisions for years. I could bring up that it’s precisely the professors and the organizations that represent them that have pressed most vocally, ardently, and consistently against the rise of the all-adjunct faculty, fighting the fight that McKenna pretends she is interested in fighting. Or I could simply resort to anecdote and point out that I have personally met hundreds of professors who are absolutely dedicated to the educational mission, and who don’t deserve to be painted with a vast generalization that is shameless in its lack of evidence.
But it wouldn’t matter. It wouldn’t matter, because the Atlantic has declared war on the university, and nothing as tired as facts, evidence, or counterargument is going to slow it down. It’s now almost a daily event: the Atlantic publishes a piece excoriating the academy and the people who inhabit it. I don’t exaggerate when I say that it seems to happen several times a week. The complaints are usually the same, the arguments are usually advanced by people with a clear bias (like Megan McArdle, who hates academics with such an intense passion I’m afraid she might take a swing at me the next time I see her), and there’s no attempt at all to hide the agenda behind them. Search their archives; there are literally hundreds of posts just like this one from the last several years alone.
I have to believe that this is a coordinated effort. The leadership at the Atlantic is advancing a specific agenda. It’s a matter of basic logic and common sense to assume that one organization that consistently advances the same complaints and the same conclusions is working in a coordinated fashion. I can only guess why the magazine’s leadership has decided to make the website a repository of anti-university propaganda. Certainly, the Atlantic is staffed by the kind of (faux) high-brow journalists who believe that only they should be granted the laurel of creating knowledge; each of these pieces speaks of nothing so much as the author’s bitterness that other people make knowledge in other ways. The kind of lukewarm neoliberalism that animates the magazine’s staff is also likely a culprit. Neoliberals, after all, have always had warm regard for conservatives and hatred for leftists, who they assume populate the academy. Or perhaps it’s the magazine’s owner, an avowed neoconservative who has never been above using the magazine to advance his own ends.
Whatever the case, it’s happening. Limp denials can’t overwhelm the preponderance of the evidence. The question is, does the staff of the Atlantic care? Do they ever ask themselves why the publication has become a propaganda outfit in this way? Whether it’s appropriate for the magazine to be so showily, resolutely biased? I doubt it. Usually, with a publication like that, everyone is so busy getting high off their own self-regard and gravitas that self-criticism simply doesn’t exist.
So you can file this one away as random, picayune, pointless, of interest to almost no one, etc.
Years ago, when it came out, I read the book The Gatekeepers by Jacques Steinberg. The book was an (at the time, anyway) unprecedented look inside the admissions process of an elite college. I read it for a variety of reasons. The college in question is Wesleyan University, the college where I grew up. My father was a professor there for decades, my mother worked in the general store and post office, which was managed by my maternal grandfather, etc. etc. I knew some of the figures in the book; Ralph Figueroa, the book’s central figure, went to my church. Greg Pyke, the dean of admissions at the time, was the father of a couple of my friends from high school. More, it came out at a time when college was very much on my mind. I was going through some things. I couldn’t get into college, and I was angry about that, and envious towards those who could. The people profiled in the book were just about my age.
Anyway, I found the book fairly interesting. But one thing really stuck in my craw. And now, as I have picked it up and started to reread it (for no good reason), I’m again kind of annoyed.
“’I ended up with a dream job,’ he says from behind a desk decorated with a massive grizzly skull and a glass statue of a bear. But the last few months had been more like a nightmare.” – Jessica Grose, in a cliche so pure I almost wonder if it’s some sort of code
Pop culture has become inescapable.
I have no historical context with which to compare the current dominance of pop culture in our media, so I will restrict my consideration to its status today. I can only say that, if you consume digital media at all, you read about pop culture. Sites and publications devoted specifically to pop culture are innumerable. Pop culture has completely saturated general interest and news publications, even those which doggedly cling to a highbrow ethos. The Atlantic, The New York Times, The New Yorker, NPR and network television…. I cannot name a prominent part of the media establishment that does not devote considerable time and resources to the analysis and celebration of popular art. Indeed, though the Times is one of the few remaining pieces of big media to cover theater, ballet, opera, orchestral music, and other aspects of traditional “high” culture, its coverage of those is dwarfed by its coverage of popular movies, TV, music, and video games. You cannot be a consumer of media and opt out of pop culture.
It’s not just any pop culture, either. What has traditionally been thought of as “geek culture,” and what I will generally refer to as “fandom culture,” is now at the absolute pinnacle of prominence, economic power, and attention. Sci-fi, comic books, video games, cartoons and anime and manga, all of it is examined in minute detail, and celebrated, in the most respected and well-read magazines, newspapers, and websites. I can say without exaggeration that fandom culture is now the single most powerful force in entertainment and media. The movie, television, and video games industries ruthlessly compete for the fandom dollar. Websites devoted to those cultures are endlessly analyzed by the big media companies. Not only can and will any comic book character be made into a movie, it is becoming exceedingly hard to get a major movie made if its source material isn’t already treasured by the fandom community. Major fan conventions, like San Diego’s Comic Con, are treated like religious revival meetings, where some of the most powerful figures in Hollywood show up to demonstrate proper deference to the chosen people.
And yet, somehow, a curious lie (and that is all it can be called) persists: the idea that pop culture generally and fandom culture specifically are somehow denigrated or disrespected, and that their fans are somehow an oppressed group.
So a Wired post in “Geek’s Guide to the Galaxy,” I think by David Barr Kirtley, and a post on SyFy Channel’s official blog by Marc Bernardin, both quote Michael Chabon insisting that writing professors are biased against genre fiction in general and science fiction in particular.
“I had a lot of shameful, cowardly answers for that question. Like, I had been taught early on in college and graduate school that I wouldn’t be taken seriously if I wrote genre fiction, and not only would I not be taken seriously, but people just really didn’t want to read it, like, my workshop mates and my workshop leaders. I had workshop leaders who just out-and-out said, ‘Please do not turn science fiction in to this workshop.’ That was discouraging, obviously, and if I had had more courage and more integrity, I might have stood up to it more than I did, but I wanted to be read, and I wanted to receive whatever benefits there were to be received from the people I was in workshop with, and the teachers I was studying from.”
Bernardin, in particular, grouses on about how this is bigotry, ending his post by saying “stupid professors.” I’m used to academic bashing. And Chabon and the bloggers and commenters would be right if the broader field of composition were discouraging genre fiction. But they’re not right; in fact they’re completely wrong. Incorporating pop culture, genre fiction, and video games is an absolute obsession in composition studies right now. Trust me; as someone who is more of a traditionalist, I and others like me actually feel a bit of pressure to introduce those things into our pedagogy. There are thousands of classes on science fiction being taught in the academy. There are dozens of journal articles on fan fiction and online fandom communities. There are conferences just on Joss Whedon and MMORPGs. This stuff has penetrated our field on the highest level. What Chabon is saying simply is not an accurate depiction of the field anymore.
I’ll leave aside the continuing issue of the strange contention that genre fiction and its fans get no respect, when they are the single most powerful force in the entertainment industry. Here’s the larger question: why did nobody perform a reality check? The Wired blog is a professional blog. The SyFy blog is a professional blog. People are getting paid for this. Why didn’t they do fifteen minutes of Googling and find out if this was still an accurate portrayal? Blogs are over a decade old now. They have saturated our media and are now among our most powerful and well-read media institutions. And yet there are no consistent standards of evidence on blogs. I remember several years ago, reading the car blog Jalopnik. One of their writers, again a professional, said that electric cars couldn’t be that good for the environment because the electricity has to be generated somewhere. As commenters swiftly pointed out, it’s far more efficient to generate electricity at a power plant than it is to run an internal combustion engine. But they shouldn’t have had to. Why on earth would someone getting paid to blog not do even minimal research on a contention he was making? That lack of consistent standards and failure to research is still all too present in the blogging form, and I see no evidence that it will soon improve.
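The commenters’ point survives even a back-of-envelope check. The efficiency figures below are rough, commonly cited ballpark assumptions, not measurements, and they deliberately lean conservative against the electric car:

```python
# Rough well-to-wheel comparison. All figures are illustrative
# ballpark assumptions, not measured values.

# Electric path: fossil power plant -> grid -> battery and motor
plant_efficiency = 0.40   # typical thermal power plant
grid_efficiency  = 0.93   # transmission and distribution losses
ev_drivetrain    = 0.80   # charging, battery, and motor losses

electric_path = plant_efficiency * grid_efficiency * ev_drivetrain

# Internal combustion path: most fuel energy is lost as heat
ice_efficiency = 0.25     # typical gasoline engine in real driving

print(f"electric path: {electric_path:.0%}")   # about 30%
print(f"combustion:    {ice_efficiency:.0%}")  # 25%
```

Even with pessimistic numbers for every link in the electric chain, the electric path comes out ahead, and a modern combined-cycle plant or any non-fossil generation widens the gap considerably. Fifteen minutes of arithmetic was all it would have taken.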
via Flickr user Amalia D
I’m with those people who feel that certain protectors of the essential term and idea “irony” often go too far. Certainly more things are ironic than the most cautious of its policemen are willing to entertain. But still, there’s a whole subset of bad writing that is plagued by an almost defiantly poor handle on what the term means. Such as:
“It was ironic that a movie about a man who could leap so high would land with such a thud with moviegoers.”
I could take this apart a bit, but I’m suddenly filled with the need to go lie down.
Update: As a bonus, the writer of that piece says that the movie website Badass Digest is “aptly titled,” but gives us no indication whatsoever why that is an apt title in this context. Because John Carter is a badass? Because Andrew Stanton is a badass? Help me, Obi-Wan Kenobi….
Serial pedant and curmudgeon Jonathan Franzen has come out against Twitter, and engendered the typical reaction. I don’t agree with Franzen on almost anything, despite our shared anti-Twitter stance, and would not define Twitter’s problems in the same way as Franzen. As is so often the case, it strikes me that the counter-arguments are more telling than the argument itself.
One common defense of Twitter that I find unsatisfactory is the one offered at Text Patterns by Alan Jacobs, an enthusiastic Twitter partisan. In essence: Twitter is a medium and is therefore neither good nor bad.
Twitter is a platform and a medium, not an organized and coherent body — it’s not like a book, for instance, which can be said to have a single overall character. Imagine what you would think if someone said, “Email is all about fitting in.” Or “The telephone functions as banally as a school hierarchy.” Or “The telegraph relies on people’s desire to be the same.” Media platforms are what you make of them, and the history of each reveals that its makers expected it to have a relatively narrow set of uses and were surprised when people exercised their creativity to find remarkably varied uses.
To a degree, hey, I’m on board there. Various media have different strengths and weaknesses, and should not be judged by the standards of other media. But surely if Twitter, as a medium and platform, is not an appropriate receptacle for hatred, it can’t responsibly be a receptacle for love. And yet Dr. Jacobs claims to love Twitter all the time. This strikes me as straightforward “have your cake and eat it too” stuff. (I’m reminded of Bill Watterson, in the Calvin and Hobbes Tenth Anniversary Book, arguing passionately that comic strips are a medium, and that individual works within a medium shouldn’t be judged by perceptions of the medium as a whole– and then later in the same text saying “comic books will always be deeply stupid.”)
More, I don’t think this defense withstands much scrutiny. Surely, it is fair and appropriate for people to judge what they perceive to be the usual practice within a medium. Sure, Twitter is a medium and could be used for all kinds of things. That doesn’t preclude me from looking around and noticing that it’s mostly links, spam, one-liners, and people agreeing with each other. Which is fine. But the fact that writing on bathroom stalls could be a medium that produces any kind of content doesn’t preclude me from making a (necessarily limited) appraisal of what bathroom stall writing actually is in practice.
Here’s what I’m most adamant about. The other classic defense of Twitter is a related one, and not so much self-defeating as undone by reference to the real world. Pointing to Twitter’s 140-character form, people say in regard to complaints like Franzen’s, “of course, people aren’t writing political manifestos and long form prose and philosophical treatises on Twitter! That’s not what it’s for.” “That’s not what Twitter is for” is the most common defense, I think. Which would be valid, except for the fact that nobody seems to have told most Tweeters.
What I mean is that there’s a large gulf between the way people defend Twitter when making that argument and the way most people actually use Twitter. People certainly seem to think that they can settle political, scientific, and philosophical questions on Twitter. People make sweeping judgments about movies, books, political candidates, ideologies…. I can’t tell you how many several-hundred-word blog posts I’ve written that have been dismissed on Twitter, briefly and profanely. (Followed by the usual “right ons” and retweets.) It happens with blog posts from just about everybody, and with 600-page books, and on and on. If you want to defend Twitter by saying that it isn’t for certain kinds of arguments, you might try telling your friends on Twitter that. They don’t seem aware.
Of course, some may say that the world is not so complicated that it can’t be reduced to 140-character chunks. As for me, I think that quantum mechanics, sociolinguistics, economics, Ulysses and many other subjects need a bit more room to discuss intelligently. But maybe that’s just me.
Update: For balance, here’s a post where Dr. Jacobs evinces some skepticism about Twitter hagiography.
“I have to side with those who believe that emotions are indeed complex enough to merit 600-page novels, and cannot be fully conveyed in an emoticon. I don’t think emoticons and 600-page novels are mutually exclusive; it appears that the universe is capacious enough to include both these phenomena, and I don’t intend to choose sides. But if people start making teams, I know which one I would rather be on.” –David Haglund
Via this post on Splitsider, I read this interesting account of the State’s doomed move to network television. Written by David Lipsky, it’s a well-researched piece of immersive journalism, one made rather poignant with the benefit of 16 years of distance. (The State and its members have had a ton of success, and yet there’s always been a feeling with them of unrealized potential.) As a bonus, there’s a moment of the kind of matter-of-fact business racism that few people evince publicly but many people believe.
Unfortunately, the piece’s effect is considerably dulled for me by the endless em-dashes Lipsky employs. There are literally dozens. (Hyphens: between compounds, such as compound adjectives like “still-nascent technology.” En dashes: to show a range, such as “from 40–50 degrees.” Em dashes: to interrupt or separate a thought, such as “I thought– and frankly, I still do– that he was making a mistake.” Note that I’m too lazy to get the typographical difference between en and em dashes right.)
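For anyone less lazy than I am, the three marks really are distinct characters with their own Unicode code points, which a few lines of Python can confirm:

```python
# The three dash-like characters and their Unicode code points.
marks = {
    "hyphen":  "\u002D",  # "-"  compounds: "still-nascent technology"
    "en dash": "\u2013",  # "–"  ranges: "from 40–50 degrees"
    "em dash": "\u2014",  # "—"  interruptions within a sentence
}

for name, ch in marks.items():
    print(f"{name:8s} {ch}  U+{ord(ch):04X}")
```

The en dash (U+2013) and em dash (U+2014) sit side by side in the General Punctuation block, which is perhaps why so many writers, myself included, settle for the hyphen on the keyboard and call it a day.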
It’s a grating practice, one that ruins the kind of rhythm and momentum necessary for a long-form piece. To be fair, the link I’ve provided is to an at-times imprecise transcription of the original print article, but I doubt that’s a serious factor in the overuse of the construction. Em dashes bring writing to a halt– that’s their purpose– and so they have to be applied sparingly. They’re more cilantro than salt. Used too often, they make reading a piece of writing like watching an online video that never. stops. buffering. I’m surprised all the em dashes survived the editing process, but then I wonder if that’s really what’s going on here; perhaps Lipsky’s editors were averse to constructions that could have supplanted some of the dashes, like parentheticals or the unfairly maligned semicolon.
Perhaps this is an example of hating in others that which we see in ourselves. I’ve been accused, fairly, of overloading my own prose with boutique constructions. But all writers are hypocrites, after all.
Update: Oh, and: “Pete Dinklage, the dwarf.” Thanks for that, 1996.