Consider this fairly recent NYT piece, titled “A Simple Way to Help Low-Income Students: Make Everyone Take SAT or ACT.” The essay discusses research showing that making these tests mandatory in high schools raises the participation rate (duh) and in so doing identifies high-achieving students who ordinarily would not have taken the test. See, typically those who are screened out of taking a college entrance exam through self-selection are those who are less college ready and perform less well. But this is far from universal, and there are many potential high-scoring testers who are screened out through fees, lack of parental guidance, or a lack of information about when and how to sign up. The research discussed showed that low-income but high-performing students are less likely to take these tests than high-income, high-performing students, and that making the tests mandatory will thus lower the relative disadvantage of those students. Mandating the test is thus a tool for equality – it increases the opportunity for students who are typically systematically excluded from college.
Let’s think about things for a second. First, and to be clear, the research does not show that low-income students are more likely to perform well. The opposite is the case:
(Data’s a bit old, but this is a durable outcome.) So having more low-income students taking the SAT will likely mean finding that many low-income students are in fact not prepared for college, to go along with finding those high-performing kids whom we wouldn’t otherwise find. Still, obviously I think it’s a noble and necessary goal to help identify talented students from poor families. The point is that it’s odd to think of this as a project for increasing equality as such. We’re simply looking for more “diamonds in the rough,” and hopefully helping to pull them out from their peers – who are thus left even further behind.
Here’s a point to stress: the very purpose of educational testing is to identify inequality. That is, we develop and administer tests precisely to better understand how students are not the same. In fact, the more precise a test is, the more unequal we understand the tested population to be. A 10-question test likely has less discriminatory power than a 100-question test, and thus the 100-question test is more likely to differentiate between closely-grouped students – that is, to identify how they are unequal. Progress in educational testing stems from designing instruments that are more sensitive to underlying inequality. That’s the very name of the game.
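The relationship between test length and discriminatory power can be sketched with a toy simulation. This is illustrative only, not real psychometrics: the uniform ability range and the simple right/wrong item model are assumptions chosen for the sketch, but the qualitative result – longer tests track underlying ability more tightly – holds up.

```python
import random
import statistics

random.seed(0)

def simulate_scores(n_students, n_items):
    """Each student has a true ability (probability of answering any one
    item correctly); the observed score is the fraction of items correct."""
    abilities = [random.uniform(0.3, 0.9) for _ in range(n_students)]
    scores = [sum(random.random() < a for _ in range(n_items)) / n_items
              for a in abilities]
    return abilities, scores

def correlation(xs, ys):
    """Pearson correlation, stdlib only."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))

for n_items in (10, 100):
    abilities, scores = simulate_scores(2000, n_items)
    r = correlation(abilities, scores)
    print(f"{n_items:3d} items: score-ability correlation = {r:.2f}")
```

With ten items, measurement noise swamps a lot of the real differences between nearby students; with a hundred, the observed scores line up much more closely with true ability – which is exactly what “more sensitive to underlying inequality” means in practice.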
As I’ve said before, we talk about education as fulfilling two functions that are not just in tension with each other but directly contradictory: education is discussed as a tool for creating greater socioeconomic equality, and as a system for identifying excellence and rewarding it with status and economic opportunity. The problems here should be obvious. There’s a much larger conversation about summative equality and equality of opportunity here, which is too directly political to get into in this space. (I will say that I think equality of opportunity is not really a coherent idea when you pull at it a bit.)
But from the standpoint of educational policy, it’s not clear to me that we really know what we want to be doing. Some people tell me that our goal should just be to move everyone up in terms of absolute achievement, raising averages without necessarily changing relative performance. That might have lots of good effects, but it’s by definition not something that could help with inequality, as what’s rewarded by the labor market is relative educational achievement, not absolute. If everyone who ever went to an Ivy League school was sent to the moon, employers would simply look to the next rung down and hire accordingly. If the purpose is instead to shrink the variance, to narrow the range between the top and bottom of the achievement scale, we’d want to talk about limiting resources to the top-performing kids, and we’d still be looking for differences in what individuals can do. And we have no good reason to think that we can achieve either at scale, because while some interventions have helped different groups at different times, the general bell-shaped distribution of overall achievement on any identifiable quantitative metric of academic success has been persistent and unchanging over time.
As long as we use education as a system for sorting students into different tranches of ability, and as long as that sorting system is a key mechanism for placing people into different levels of income and joblessness, we can’t conceive of our system as being an engine of socioeconomic equality. We might sometimes use testing to identify areas where more resources are needed and distribute them accordingly. But 15 years since No Child Left Behind and the testing-heavy era it augured, we have seen almost nothing in the way of convincing proof that testing is a reliable tool for raising standards and increasing either equality or opportunity. Tests are powerful things, and modern test-development can produce exams of extraordinary precision. But they can’t be useful until we have a clear and coherent vision of what we’re testing for. To get that understanding, we have to begin to pull apart our basic assumptions about education and our economy, to ask ourselves if the system can do what we imagine it can do.
Not too long ago, I felt the need to change the stream of personalities and attitudes that were pouring into my head, and it’s been remarkable.
This was really the product of idiosyncratic personal conditions, but it’s ended up being a good intellectual exercise too. I had to rearrange a few things in my digital social life. And concurrently I had realized that my sense of the world was being distorted by the flow of information that was being deposited into my brain via the internet. I hadn’t really lost a sense of what the “other side” thinks politically; I’m still one of those geezers who forces himself to read Reason and the Wall Street Journal op/ed page and, god help me, National Review. But I had definitely lost a sense of the mental lives of people who did not occupy my various weird interests.
What were other people thinking about, at least as far as could be gleaned by what they shared online? What appeared to be a big deal to them and what didn’t? I had lost my sense of social proportion. I couldn’t tell if the things my friends were obsessing about were things that the rest of the world was obsessing about. Talking to IRL friends that don’t post much or at all online helped give me a sense that I was missing something. But I didn’t know what.
No, I had to use the tools available to me to dramatically change the opinions and ideas and attitudes that were flowing into my mental life. And it had become clear that, though I have an RSS feed and I peruse certain websites and publications regularly, though I still read lots of books and physical journals and magazines, the opinions I was receiving were coming overwhelmingly through social media. People shared things and commented on what they shared on Facebook and Twitter, they made clear what ideas were permissible and what weren’t on Facebook and Twitter, they defined the shared mental world on Facebook and Twitter. They created a language that, if you weren’t paying attention, looked like the lingua franca. I’m sure there are people out there who can take all of this in with the proper perspective and not allow it to subtly shape their perception of social attitudes writ large. But I can’t.
It’s all particularly disturbing because a lot of what you see and don’t online is the product of algorithms that are blunt instruments at best.
So I set about disconnecting, temporarily, from certain people, groups, publications, and conversations. I found voices that popped up in my feeds a lot and muted them. I unfollowed groups and pages. I looked out for certain markers of status and social belonging and used them as guides for what to avoid. I was less interested in avoiding certain subjects than I was in avoiding certain perspectives, the social frames that we all use to understand the world. The news cycle was what it was; I could not avoid Trump, as wonderful as that sounds. But I could avoid a certain way of looking at Trump, and at the broader world. In particular I wanted to look past what we once called ideology: I wanted to see the ways in which my internet-mediated intellectual life was dominated by assumptions that did not recognize themselves as assumptions, to understand how the perspective that did not understand itself to be a perspective had distorted my vision of the world. I wanted to better see the water in which my school of fish swims.
Now this can be touchy – mutually connecting with people on social media has become a loaded thing in IRL relationships, for better or worse. Luckily both Facebook and Twitter give you ways to not see someone’s posts without them knowing and without severing the connection. Just make a list of people, pages, and publications that you want to take a diet from, and after a month or two of seeing how different things look, go back to following them. (Alternatively: don’t.) Really do it! The tools are there, and you can always revert back. Just keep a record of what you’re doing.
I was prepared for this to result in a markedly different online experience for me, and for it to somewhat change my perception of what “everyone” thinks, of what people are reading, watching, and listening to, etc. But even so, I’ve been floored by how dramatically different the online world looks with a little manipulation of the feeds. A few subjects dropped out entirely; the Twin Peaks reboot went from being everywhere to being nowhere, for example. But what really changed was the affect through which the world was presenting itself to me.
You would not be surprised by what my lenses appear to have been (and still largely to be): very college educated, very left-leaning, very New York, very media-savvy, very middlebrow, and for lack of a better word, very “cool.” That is, the perspective that I had tried to wean myself off of was made up of people whose online self-presentation is ostentatiously ironic, in-joke heavy, filled with cultural references that are designed to hit just the right level of obscurity, and generally oriented towards impressing people through being performatively not impressed by anything. It was made up of people who are passionately invested in not appearing to be passionately invested in anything. It’s a sensibility that you can trace back to Gawker and Spy magazine and much, much further back than that, if you care to.
Perhaps most dramatic were the changes to what – and who – was perceived as a Big Deal. By cutting out a hundred voices or fewer, things and people that everybody talks about became things and people that nobody talks about. The internet is a technology for creating small ponds for us to all be big fish in. But you change your perspective just slightly, move over just an inch, and suddenly you get a sense of just how few people know about you or could possibly care. It’s oddly comforting, to be reminded that even if you enjoy a little internet notoriety, the average person on the street could not care less who you are or what you do. I recommend it.
Of course, there are profound limits to this. My feeds still come predominantly from a few overlapping social cultures. Trimming who I’m following hasn’t meant that I’m suddenly connected to more high school dropouts, orthodox Jews, senior citizens, or people who don’t speak English. I would never pretend that this little exercise has given me a truly broad perspective. The point has just been to see how dramatically a few changes to my digital life could alter my perception of “the conversation.” And it’s done that. More than ever, I worry that our sense of shared political assumptions and the perceived immorality of the status quo is the result of systems that exclude a large mass of people, whose opinions will surely matter in the political wars ahead.
I am now adding some of what I cut back in to my digital life. The point was never really to avoid particular publications or people. I like some of what and who I had cut out very much. The point is to remain alive to how arbitrary and idiosyncratic changes in the constant flow of information can alter our perception of the human race. It’s something I intend to do once a year or so, to jolt myself back into understanding how limiting my perspective really is.
Everyone knows, these days, that we’re living in digitally-enabled bubbles. The trouble is that our instinct is naturally to believe that everyone else is in a bubble, or at least that their bubbles are smaller and with thicker walls. But people like me – college educated, living in an urban enclave, at least socially liberal, tuned in to arts and culture news and criticism, possessed of the vocabulary of media and the academy, “savvy” – we face unique temptations in this regard. No, I don’t think that this kind of bubble is the same as someone who only gets their news from InfoWars and Breitbart. But the fact that so many people like me write the professional internet, the fact that the creators of the idioms and attitudes of our newsmedia and cultural industry almost universally come from a very thin slice of the American populace, is genuinely dangerous.
To regain perspective takes effort, and I encourage you all to expend that effort, particularly if you are an academic or journalist. Your world is small, and our world is big.
Hey gang, I’m back from vacation and excited to get back to work here on this project. Lots of cool stuff in the works for here, including hopefully some audio and video content soonish. (Not a podcast, don’t worry. The world has enough podcasts already.)
I wanted to take a moment and explain why I’m going to be moving away from freelance writing. I’ve had a pretty good run lately; I was in the print Los Angeles Times a couple weeks back and the print Washington Post last week. (You can always check out my published writing by clicking the My Work tab above.) I know we’re all supposed to be too cool to care about print these days but, well, I do care. And I have a couple of heavily-researched pieces coming out in some longer form journals in the next several months, and it looks like I might have a regular column-type thing to indulge my political side. But beyond that, I’m not really interested in freelancing anymore. The truth is that I just find the process so aggravating and dispiriting at this point, and the money so bad, that it’s simply not worth it to me.
I just find, at this point, that the process of pitching, composing, shepherding through edits, promoting, and trying to get paid sucks the life out of me. The commercial interests of publications require editors to ask for things that are tied to the news cycle in the most facile way imaginable. I get it, and I don’t blame them personally. But I’m opting out. And it’s increasingly hard for me to explain to editors what I want a piece to do and say without writing the piece. I’m just really not interested in the “beats” of a piece of nonfiction anymore; the argument, in the sense that people traditionally mean, is just about the least interesting aspect of nonfiction writing. So when asked to reduce my own prospective writing to a series of explicit moves, I’m forced to fixate on the parts that I find least interesting or valuable. What I want is to write in a way that is free of precisely the kind of paint-by-numbers literalism that editors require. Again, not a knock on them. It’s just not in my interests anymore.
Meanwhile, the money generally sucks. I am very grateful for the LAT publishing me in their print edition, for example, and I knew what the rate was going in. But writing and editing a thousand-plus word piece for one of the biggest newspapers in the country got me $200. So many younger writers I know think that the higher profile, more established places are where the money is, but often that’s not true. Not anymore. And if I don’t enjoy it and the money’s not good, what’s the point?
I also don’t have a lot of hills to climb anymore in terms of places I want to be published. At this point even my (many) dogged critics can’t really claim that I can’t get published in major magazines or newspapers. And it’s not like they changed their tune once I did, anyway. I started writing for big pubs in part as a way to prove to my detractors that, contrary to what they said, I could get published in respectable places. When I did, they didn’t retract their old insults. They just switched to new ones. So there’s little appeal there, at this point.
And, finally, I’m just exhausted by people not reading. I’m just exhausted. The WaPo piece is an expression of 100% straightforward left-wing values; it’s a critique of corporations and an endorsement of the idea that only the left can guarantee true freedom. I do write my fair share of left-on-left critiques, but this piece really is not that. It’s simply an articulation of basic left principles in a frame designed to make them more appealing to the unconvinced. But the piece has predictably attracted criticism from the left, people insisting that I’m a reactionary even though I’m making a standard left critique of corporate power. Some have claimed that it’s a defense of the Google memo writer, when in fact I explicitly justify Google’s actions in the very first paragraph. The great bulk of the piece was written six weeks ago, before that memo existed, and that situation is tangential to my larger point. Meanwhile, others saw the headline and immediately assumed that this was a defense of the Charlottesville protesters – which would have been remarkable, given that the piece had come out on Friday, before the event. Either of these misconceptions could have been cleared up simply by reading the piece. But this is, increasingly, a bar that many refuse to clear.
This is a long-winded way of saying that I’m happy to have this outlet, where my audience is small and sympathetic and where I can avoid so many of the headaches involved in professional freelancing. Never say never, obviously, and I’ll pop up here and there. But what was always a bad bet has only gotten worse since I started doing this and I just don’t really have it in me to continue the slog. I need to focus on academic writing, book projects, and this website. Thanks for coming along.
Group-level differences in cognitive ability are in the news again, thanks to the quickly-notorious Google memo on the company’s diversity efforts. You can find a lot of stuff written about these differences from qualified people already and I’m not gonna add to the pile, other than to say that from where I’m sitting, if there are any sex differences in intelligence or math ability, they seem not to exist in early childhood and there are plausible cultural and social reasons that they would appear by high school. The science on personality trait differences seems less clear to me but then those constructs are also less concrete. In all of this I’m pretty much in keeping with the liberal mainstream.
But I do want to voice a caution, here, because there’s a natural but unfortunate tendency to draw an unjustifiable corollary from arguments of this kind. Regular readers will know that I reject the idea of biological or genetic explanations for academic differences between races. Instead I follow most progressive people in thinking that the differences are socioeconomic and environmental in origin. There, too, I’ve often seen people make the same bad leap: they tend to reject the idea of innate or genetic differences in individual academic ability or intelligence as well. It’s not hard to understand why; talking about genetic differences in intelligence at all may seem like fruit from a poisoned tree, and why not just reject the whole idea altogether? But understanding the difference between group-level claims and individual-level claims is hugely important, both analytically and morally. It’s the difference between contributing to stereotypes that have fueled the marginalization and injustice of vulnerable groups, and accepting the reality that not all individual people are equally gifted in all areas.
And the data here is really, really clear: there are profound differences in individuals in academic or intellectual ability; these differences are generally quite durable over the course of one’s life, although of course there is some variability, as there is in any measurable psychological trait; and there is very strong evidence that a major portion of this difference comes from genetics. I don’t think that boys are smarter than girls or that black kids are less intelligent than white. I do think, and think both the empirical record and common sense shows, that not all people are equally talented in different intellectual domains, and that if you believe that the brain is the product of evolution, we should expect a significant amount of that difference to be genetic in origin, which is in fact what twin studies, adoption studies, and GWAS data show. I’ve written about all this in this space many times before.
You can think about this clearly if you just eliminate the comparison between groups that are supposedly different and look only at within-group distribution. So, for the purposes of this debate, look at women and their various metrics for intelligence and academic success, whether generally or in math/STEM/computer science. Forget about comparisons to men for a moment: within that group, on any properly validated intelligence metric, we find a normal distribution of ability. That is, there’s a mean, and there’s a distribution of about two thirds of the data points within a standard deviation from that mean, and about 95% of the data points within two standard deviations, and the distribution is just about symmetrical. Some women are better than other women on the SATs, IQ tests, quantitative reasoning tests, etc., and in predictable ways. The same exact condition applies when looking at distributions of black students, Asian students, students from Turkey, students who attend public schools, students who are left-handed, students who play Little League, etc. – real, persistent, and predictable differences of ability between individuals.
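That within-group shape – about two thirds of scores within one standard deviation of the mean, about 95% within two – is easy to verify with simulated data. A minimal sketch, assuming normally distributed scores; the mean of 500 and SD of 100 below are illustrative SAT-flavored placeholders, not real figures:

```python
import random
import statistics

random.seed(0)

# Simulated scores for a single group, drawn from a normal distribution.
# Mean 500 / SD 100 are illustrative, SAT-style numbers, not real data.
scores = [random.gauss(500, 100) for _ in range(100_000)]

mean = statistics.mean(scores)
sd = statistics.pstdev(scores)

def share_within(k):
    """Fraction of scores lying within k standard deviations of the mean."""
    return sum(abs(s - mean) <= k * sd for s in scores) / len(scores)

print(f"within 1 SD: {share_within(1):.3f}")  # close to 0.683
print(f"within 2 SD: {share_within(2):.3f}")  # close to 0.954
```

The point of the sketch is that this shape appears no matter which single group you condition on: restrict the sample to any of the groups named above and you still get a roughly symmetrical bell curve of individual differences.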
Now these individual differences don’t have much to tell us about diversity efforts like those at Google, which for the record I support, other than to say that Google is probably looking for those in the very top reaches of these distributions no matter what. But they say a hell of a lot about how we should approach education from a policy level. Policy has to reflect our empirical understandings of reality, and right now, ours doesn’t, as it is based on the false notion that all students can be brought to meet arbitrary performance standards, that there are no intrinsic limits to how well any individual student can perform, and that the purpose of schooling should be to train every student to be a Stanford-educated Silicon Valley superstar. That’s the kind of cheery, optimistic, utterly-unachievable policy goal that comes from thinking that, because there aren’t genetic differences in intelligence between men and women or between races, there are no such differences between individual people either. That’s wrong and destructive and we can’t allow our necessary efforts to oppose bigotry to lead us in that direction.
For a lot of great thoughts on how to ethically consider genetic influences on individual intelligence, I recommend the work of the brilliant Paige Harden.
If you found value in this blog post, please consider supporting this project financially by clicking on the Patreon or Paypal link in the sidebar to the right.
Hey gang, I am officially on vacation for the first time since I started my job last September. Posting this coming week will be light, though I expect to have at least a couple pieces up. Thank you for your continued support of the ANOVA. I’m having lots of fun and hope you all are too.
Matt Bruenig critiques the concept of the “Success Sequence” quite convincingly here. There are a lot of just-so stories in our culture about what it takes to be a success. Typically, these stories are confusing the lines of causation all over the place, failing to see that confounds and covariates are doing most of the explaining.
I sometimes get anxious emails from parents, wondering what they need to do to make sure their children are going to be OK academically. And because of networking effects and the nature of who reads this small-audience education blog, I can mostly tell them accurately that they don’t really have to do much of anything; they’ve already set up their children to succeed simply by virtue of having them. Here’s the real Academic Success Sequence:
- Be born to college-educated parents.1
- Be born to middle-class-or-above parents.
- Be born without a severe cognitive or developmental disability.
- Don’t be exposed to lead in infancy or early childhood.
- Don’t be born severely premature or at very low birth weight.
- Don’t be physically abused or neglected.
If you are one of those lucky enough to tick off these boxes, congratulations. You’ve got the vast majority of the accounted-for variance breaking in your favor. Is everything accounted for? No. We’ve got a lot of variance in cognitive and educational outcomes that never seems to be systematically explainable. I actually think that’s a good thing – perfect determinism is contrary to the fight for human meaning – but it’s important to say that this variance is not only not currently accounted for, it is likely never-to-be accounted for. This is what the behavioral geneticists call the “gloomy prospect”: the possibility that large portions of unaccounted-for variation in psychological traits like intelligence are the product of truly non-systematic events, like particular psychological traumas, getting a concussion, meeting the right person, having the right conversation at the right time….
Thus it’s the case that some people can “win” in all of the above categories and still suffer from real hardship in life, just as some can be on the wrong side in many or all of them and flourish. Still: if you’re an educated, employed parent raising a healthy child in a stable home environment, the odds are strongly in the favor of that child’s eventual academic success. Of course, none of this stuff is stuff that individuals can control, and much of it is not stuff that parents can control either – particularly given that the parents were once the children whose outcomes were similarly conditioned….
Now many people will say, well yeah, of course these things matter. But what do we do beyond that stuff? How do we set our kids up to succeed? I’m not going to say that nothing you do matters. But the quantitative indicators that people are, sadly, most fixated on are stubborn and hard to move. Some things appear to work – intensive one-on-one or small-group tutoring seems to me to have the most promising research literature – but we’re playing with small effect sizes here, particularly in comparison to the influence of the factors listed above. Of course you want to bend as much of the variance in a positive direction as you can. But the effects tend to be so small, and thus so subject to being offset by minor random fluctuations in uncontrolled variation, that it’s just not worth worrying about them. The best thing you can do for your kid is to be present and kind and supportive and then stop stressing out.
The great irony is that we’ve seen this growing culture of panic on the part of bourgie parents about their child rearing practices at the exact historical moment that we’ve learned conclusively that these practices just don’t mean very much.
In particular, the Baby Einstein stuff, trips to museums, violin lessons, edutainment software – my understanding is that there just is little to no rigorous research that shows that this stuff works to move the needle on SAT scores or GPA or similar, once you control for the kinds of confounds listed above. Does that mean that this stuff doesn’t matter, that you shouldn’t do them? Of course not. Children should all have the opportunity to lead intellectually enriched, challenging, and varied lives. I’m very grateful that I had that chance myself. But you need to appreciate them for their own sake and on their own terms, not as a means to goose test scores. And obsessing over getting your kid into the right preschool is pointless too, as is worrying over selective high schools. It may make you feel like the right kind of parent to fixate on this stuff; it may, more cynically, help you feel competitive with other parents. But extant evidence suggests it just doesn’t matter. What does matter is giving your child commitment, love, structure, and a moral education, because life is about so much more than where you go to college.
Of course, many people in our society are not lucky enough to have been born into the kind of advantaged position described above. Given that fact, you’d think that our system would be set up to minimize the impact of these unchosen factors. Instead we work to maximize their impact and call the resulting system “meritocracy.”
Louis Menand in The New Yorker:
The funny thing about the resistance all these writers put up to the idea that poems can change people’s lives is that every one of them had his life changed by a poem. I did, too. When I was fourteen or fifteen, I found a copy of “Immortal Poems of the English Language” in a book closet in my school. It was a mass-market paperback, and the editor, Oscar Williams, had judged several of his own poems sufficiently deathless to merit inclusion. But he was an excellent anthologist, and I wore that book out. It changed my life. It made me want to become a writer.
I had an almost identical experience, with an anthology put together by XJ Kennedy, a poet, essayist, translator, and all around man of letters. That’s my copy pictured here. In sophomore year of high school my old Latin teacher Mrs. Montgomery (gone, now, but never forgotten) had wanted to share a poem with me, and had dug around in her closet to find this old, little-loved and forgotten literature collection. It was divided into three sections: fiction, poetry, and drama. In time I would read the whole thing cover to cover, but at the time I obsessed over the poetry section. Growing up in an arts- and literature-obsessed home, I had gotten plenty of exposure to poetry, but this was the first time I really felt like I had the time and inclination to truly explore the form on my own. I got a real poetry education from that book, and learned not just Keats and Housman but Linda Pastan’s “Ethics” and Chesterton’s “The Donkey” and Amiri Baraka’s “Preface to a Twenty Volume Suicide Note.” I read it under my desk during algebra class and in the cafeteria and on the bus rides home from cross country meets, and today the cover is held on with masking tape, because I wore the damn thing out. When high school was over, I stole it.
I am, as you know, skeptical of the degree to which quantitative educational metrics like test scores can be changed by teachers and schools. But this carries with it the essential qualification: that test scores are not the measure of education’s value. Because I read and talk about quantitative research, and because I acknowledge that these tools are broadly predictive of all manner of eventual academic outcomes, I am often in agreement with those who view education in a reductive light. But my objections to that reductive thinking are as real and important as my objections to those who think that all individual students can be brought to the same levels of achievement on standardized tests. Indeed, precisely because differences in academic ability are real, we must take seriously all the things that education can do which are not expressible in a test score. I doubt that this book made the slightest difference to my SAT scores. Yet like Menand’s, my life was forever changed.
To the Muse
by XJ Kennedy
Give me leave, Muse, in plain view to array
Your shift and bodice by the light of day.
I would have brought an epic. Be not vexed
Instead to grace a niggling schoolroom text;
Let down your sanction, help me to oblige
Him who would leash fresh devots to your liege,
And at your altar, grant that in a flash
They, he, and I know incense from dead ash.
This past week, the Los Angeles Times was kind enough to run a revised version of an argument I had made here in the recent past – that Republican support of colleges and universities has collapsed, likely because of constant incidents on campus that create a widespread impression of anti-conservative bias, and that since our public universities are chartered and funded as non-partisan institutions, and because Republicans control enormous political power, our institutions are deeply threatened. I stand by that case.
I have gotten the usual grab bag of responses, most of them unmoored from specific principles about who should be able to say what on campus, and some of them directly contradictory with each other. As is typical, the number one rhetorical move has been to insist that student activists are only targeting the worst of the worst, Milo Yiannopoulos and Richard Spencer and the like. The idea is that people with mainstream views are entirely free to say whatever they want without issue because they don’t directly threaten marginalized people. That idea is factually incorrect, as anyone with the barest grasp on the facts should know.
- Student activists at Amherst College demanded that students who had criticized their protests be formally punished by the college and forced to attend sensitivity training.
- At Oberlin, students made a formal demand that specific professors and administrators be fired because the students did not like their politics.
- The Evergreen State College imbroglio involved students attempting to have a professor fired for criticizing one of their political actions.
- At Wesleyan, campus activists attempted to have the campus newspaper defunded for running a mainstream conservative editorial.
- A Dean at Claremont McKenna resigned following student backlash to an email she sent in response to complaints about the treatment of students of color.
- Students at Reed College attempted to shut down an appearance by Kimberly Peirce, the director of Boys Don’t Cry, removing posters advertising her talk and attempting to shout her down during her presentation.
- At Yale, students called for the resignation of Erika Christakis for an email she wrote about culturally insensitive Halloween costumes and for the resignation of her husband Nicholas Christakis for defending her.
- At the University of California Santa Barbara, the student government voted for mandatory trigger warnings, which would enable any student to skip class material that they decided was offensive.
- Laura Kipnis, a feminist professor at Northwestern, was the subject of a literal federal investigation because she published an essay students didn’t like.
- Mount Holyoke canceled the Vagina Monologues on campus under student pressure.
- American Sniper, a perfectly mainstream American blockbuster, was temporarily pushed off campus by student activists.
- Activists at Western Washington demanded the creation of a 15-person panel that would engage in surveillance of students, professors, and administrators in order to monitor everyone involved on campus for any expressions or actions that body deemed “racist, anti-black, transphobic, cissexist, misogynistic, ablest, homophobic, islamophobic, xenophobic, anti-semitism, and otherwise oppressive behavior.” That body would have the ability to discipline campus community members, including firing tenured faculty.
- A yoga class for disabled students at a Canadian university was canceled after students complained that yoga is a form of cultural appropriation.
There are more. You are free to support any or all of these student actions. But you are not free to pretend there is no trend here. Exactly how many of these incidents must pile up before people are willing to admit that many campus activists pursue censorship of ideas and expressions that they don’t like?
The obsession with Milo and Richard Spencer makes this conversation impossible in left circles. Those people are discussed endlessly because leftists believe that doing so makes it easy to argue – “what, you want Milo to be free to harass POC on campus?!?” But in fact because most conservatives on campus will simply be mainstream Republicans, this side conversation will be almost entirely pointless. What really matters is the way that perfectly mainstream positions are being run out of campus on a regular basis. And of course with a list like this we can be sure that there are many, many more cases that went unnoticed and unreported in the wider world.
You would think it would be easy for progressives and leftists simply to say “I support many actions that campus protesters take, but these censorship efforts are counterproductive and wrong.” But that almost never happens. That’s because in contemporary life, politics has almost nothing to do with principle, or even with political tactics. Instead it has to do with aligning yourself with the right broad social circles. To criticize specific actions of campus activists sounds to too many leftists like being “the wrong kind of person,” so they refuse to criticize students even when their actions are minimally helpful and maximally counterproductive. That in turn ensures that there’s no opportunity for the students to reflect, learn, and evolve.
There have, of course, been many leftist professors who have been the subject of censorship too. I have written about these cases and fought for those professors over and over again. They come not from student pressure but from administrative fecklessness, which is to be expected, as the administrators that sometimes accede to student censorship demands and those who silence leftist professors are working under the same philosophy: a corporate desire to avoid controversy and to protect the campus as a neoliberal institution. That students so often petition these same administrators to silence on their behalf speaks to the failure to truly grapple with the nature of administrative power.
A while back I laid out my frustrations with this conversation. In particular, almost no one who defends campus activists’ attempts to censor has ever articulated a coherent policy about who is and is not allowed to say what.
Whatever else, defenders of activists attempting to censor opinions they don’t like have to stop claiming that these censorship efforts only target the most extreme cases. Because that is simply, factually false. Stop obsessing about the most extreme cases and grapple with the clear and growing attempts to censor mainstream views on campus. It’s an important conversation to have. Or you can keep shouting “Milo!” over and over again because that’s easy and doesn’t force you into any difficult choices or conversations. That will ensure that we have no coherent defense against bias claims while the Republican party sets out to dismantle our institutions, brick by brick.
For a large academic project I’m working on, I’ve been trying to do something that is rather rare: discuss cultural studies and its practices in the academy in a nuanced and evenhanded way. Unfortunately, cultural studies and related fields have become the Battle of Verdun in our culture war, and typically any support is sorted by critics into “SJW bullshit” and any criticism into “reactionary proto-fascism” by supporters.
This is unfortunate because like all fields cultural studies has its strengths and its weaknesses. Has cultural studies been stereotyped and caricatured by its critics, reduced to a set of entirely unfair associations and impressions, forced constantly to defend the worst excesses of individual members, and in general been equated with its most controversial work while its most powerful and generative work goes largely undiscussed? Absolutely yes. Is there also a powerful culture of groupthink and political conformity in the field, a social system of mutual surveillance where everyone constantly monitors each other for the slightest possible offense, and a set of publishing incentives that actively encourage obscurity and indigestible prose? I think the answer is also yes. But as long as the field is a battlefront in a much larger political-culture war, very few people will feel comfortable nuancing these distinctions, sorting the good from the bad.
What’s hard for people outside of academia (and some within academia) to understand is that cultural studies has a habit of, if you’ll forgive the term, colonizing other fields in the humanities and social sciences. As bad as the reputation of these assorted fields has gotten outside of the academy, and as tenuous as funding is, they have been remarkably successful at insinuating their views and language into other fields – sometimes in good ways, sometimes bad.
I self-identify as an applied linguist, specializing in educational assessment, and I spend most of my time these days reading, researching, and writing work that many would identify with the field of education. But I came up through programs in writing studies/rhetoric and composition, and I retain an interest in that field. I left it, spiritually if nothing else, because I am interested in quantitative empirical approaches to understanding writing, language learning, and assessment, and it had become clear that there was no room for empirical approaches as commonly defined in the field, at least beyond case studies of a handful of students or texts. I don’t think my own path is particularly interesting, but I think it is interesting and relevant how composition changed over time.
See, a lot of the origin story of rhet/comp/writing had to do with its methodological diversity. In the 1960s and 1970s, scholars who valued teaching writing and wanted to do it better were stymied in their English departments, where most faculty considered the study of literature preeminent and pedagogical work unimportant, especially when it came to writing. (This is an origin story, remember, so exaggeration and generalization are to be expected.) At an extreme, some professors who were interested in researching writing pedagogy were told not to bother to put pedagogical articles into their tenure files. These scholars, concentrated particularly in large land grant public universities in the Midwest, decided that they could never be taken seriously within literature-dominant programs and set out to create their own disciplinary and institutional structures.
Core to their new scholarly identity was methodological diversity. Their work was empirical, because investigating what works and what doesn’t when teaching students to write is a necessarily empirical practice. Their work was theoretical, because much of writing pedagogy involves considerations of how students think as well as write, and because the basic tool of humanistic inquiry is abstraction. Their work was also often literary, as many of these professors were trained in literature, retained interest in that field, and saw literature as a key lens through which to teach students to write. Their work was historical, as they often used the ancient study of rhetoric as a set of principles to guide the teaching of writing, supplying a time-tested array of habits and ideas to the somewhat nebulous subject-domain of writing. I could go on.
So you have someone like my grandfather, who predated the field but was something of a proto-member at the University of Illinois, whose large published corpus includes pragmatic pedagogical advice for how to teach students to read and write, essays on poetry that would appear comfortably in a literature journal, research articles where he hooked students up to polygraph machines to better understand how anxiety impacted their writing habits, and political treatises about why the humanities teach us to oppose war in all of its forms. The ability to do so much as a researcher, and get published doing all of it, always seemed very attractive to me.
I had always envisioned a field of writing studies that was as methodologically and philosophically diverse as its lingering reputation. There would be an empirical wing and a cultural studies wing and a practical pedagogy wing and a digital wing, etc…. There’s no reason these things would be mutually exclusive. But as I found as I moved through my graduate programs, in practice cultural studies pretty much ate the field, or so is the case that I’ll be making in this ongoing project I referenced earlier. That’s a big case to make and it requires a healthy portion of a book-length project to make it fairly. I can tell you though that if you pull a random article from a random journal in writing studies you will likely find very little about writing as traditionally understood and a great deal about hegemony, intersectionality, and the gendered violence of discourse. Empirical work as traditionally conceived is almost entirely absent. Today I talk to people in other wings of the humanities who tell me, straight out, that they can’t understand how composition/writing studies is distinct from cultural studies at all.
Why? Well, academia is faddish, particularly as pertains to the job market, and the strange forms of mentorship and patronage that are inherent to its training models mean that there are network effects and path dependence that dictate subfields. But more, I think, the moral claims of cultural studies make it uncomfortable to study anything else. These critiques tend to make methodological differences not abstract matters of legitimate disagreement between academic points of view, but rather straightforwardly moralizing claims about the illegitimacy of given approaches to gathering and disseminating knowledge.
I want to preface this by saying that I know “cultural studies professors say it’s bigoted to do science” sounds like a conservative caricature of the humanities, but it is absolutely a position that is held straightforwardly and unapologetically by many real-world academics. I’m sorry if it seems to confirm ugly stereotypes about the humanities, but it is absolutely the case that there are prominent and influential arguments within the field that represent quantification as not just naive “scientism” but as part of a system of social control, a form of complicity with racism, sexism, and the like. I know this sounds like a story from some bad conservative novel, but it is not unheard of for rooms full of PhDs to applaud when someone says that, for example, witchcraft is just another way of knowledge and that disputing factual claims to its power is cultural hegemony.
The idea that conventional research and pedagogy are straightforwardly tools of power is abundant. Take Elizabeth Flynn:
…beliefs in the objectivity of the scientist and the neutrality of scientific investigation serve the interests of those in positions of authority and power, usually white males, and serve to exclude those in marginalized positions….
Feminist critiques of the sciences and the social sciences have also made evident the dangers inherent in identifications with fields that have traditionally been male-dominated and valorize epistemologies that endanger those in marginalized positions.
This might sound pretty anodyne, but in the context of academic writing, it’s extreme. In particular, the notion that empirical methodologies actually endanger marginalized people is a serious charge, and one that is now ubiquitous in fields that are social sciences-adjacent. There are those in academia who believe not just that empirical approaches to knowledge are naive or likely to serve the interests of power but actively, materially dangerous to marginalized people. And there are those who prosecute this case within our institutions and journals quite stridently and personally.
This results in some awkward tensions between pedagogical responsibility and political theory. Patricia Bizzell exemplified the perspective that the purpose of teaching is to inspire students to resist hegemony, rather than to learn, say, how to write a paper – and that professors have a vested interest in making sure they stay on that path:
…our dilemma is that we want to empower students to succeed in the dominant culture so that they can transform it from within; but we fear that if they do succeed, their thinking will be changed in such a way that they will no longer want to transform it.
This strange, self-contradictory attitude towards students – valorizing them as agents of political change who should rise up and resist authority while simultaneously condescending to them and assuming that it is the business of professors to dictate their political project – remains a common facet of the contemporary humanities.
The broad rejection of research as a process of learning more about a world outside our heads, and of pedagogy as an attempt to share what we’ve learned therein with students, is quite prevalent. Take the late James Berlin, offering up a critique of these supposedly-naive assumptions:
Certain structures of the material world, the mind, and language, and their correspondence with certain goals, problem-solving heuristics, and solutions in the economic, social, and political are regarded as inherent features of the universe, existing apart from human social intervention. The existent, the good, and the possible are inscribed in the very nature of things as indisputable scientific facts, rather than being seen as humanly devised social constructions always remaining open to discussion.
Well. I am a rather postmodern guy, actually, compared to many, but I confess that I do believe that certain structures of the material world are inherent features of the universe. Though I am always open to a good discussion.
There are many other critics of the pursuit of knowledge as commonly understood in the field’s history, such as Mary Lay, Nancy Blyler, and Carl Herndl, and some of them are quite adamant in their rejection of the inherent hegemonic impulses of conventional research. This post will already be quite long, so I don’t want to get off on a tangent about postmodernism and change. I’ll just quote this apt observation from Zygmunt Bauman:
behind the postmodern ethical paradox hides a genuine practical dilemma: acting on one’s moral convictions is naturally pregnant with a desire to win for such convictions an ever more universal acceptance; but every attempt to do so just smacks of the already discredited bid for domination.
In any event, by 2001 John Trimbur and Diana George would write “cultural studies has insinuated itself into the mainstream of composition.” By 2005, Richard Fulkerson would say plainly, “in point of fact, virtually no one in contemporary composition theory assumes any epistemology other than a vaguely interactionist constructivism. We have rejected quantification and any attempts to reach Truth about our business by scientific means.” And so a field that in my grandfather’s era enjoyed great epistemological and methodological diversity became a field that only told one kind of story.
I don’t mean to exaggerate the uniformity. There are of course critics of the “cultural turn,” whether from empiricists like Davida Charney and Richard Haswell or theorists like Richard Miller and Thomas Rickert. And there is a diversity of subjects in writing studies research. But to a remarkable degree, the epistemological assumptions of cultural studies rule the field, and indeed the way that diversity is achieved is through applying a cultural studies lens to different subjects – a library full of dissertations on the cultural studies approach to Dr. Who, the cultural studies approach to Overwatch, the cultural studies approach to the communicative practices of the EPA, the cultural studies approach to Andrew Pickering’s theoretical construct of “the mangle.” I don’t dismiss any of these projects as projects; I am generally committed to radical cosmopolitanism when it comes to other people’s research interests. But I maintain a belief that the field would be healthier and more capable of defending its disciplinary identity (and its funding) were it to include more straightforward pedagogical work, historical work, and empirical work. But there is genuine fear among graduate students and early-career academics over whether one can wander too far from the field’s contemporary obsessions.
In applied linguistics/second language studies, a fairly close sibling field, I have seen less of a field-wide colonizing and more of a split into two very different camps, which do not have conflict as much as mutual incomprehension. This may be largely an idiosyncratic reading for me, colored more by my personal perceptions than anything else. But I do know that there are people who share the SLS banner whose work cannot talk to each other in any meaningful way. In grad school there were a large number of students whose approach to second language research was entirely in keeping with the mania for critical pedagogy, with student after student writing papers about how second language students should be encouraged to resist the hegemony of first-language practices and recognize the equal value of their own English dialect. At an extreme, this leads you to the position of someone like Suresh Canagarajah, who has long argued that research that compares the linguistic habits of second language speakers to first language counterparts is inherently judgmental and thus inherently offensive.
Meanwhile, these grad students, whose work was almost entirely theoretical and political in its methods and typically eschewed quantification altogether, would attend seminars next to students in language testing, corpus linguistics, or phonology whose work was almost purely quantitative. One group would cite Freire and Foucault while the other would run regressions and hierarchical linear models. This never erupted into real interpersonal conflict; it just meant that you had people whose work was not compatible in any meaningful way. This was always my own frustration and fear in writing studies: when I spoke the language of effect sizes, ANOVAs, and p-values, I could make my work comprehensible to people from a large variety of fields. When I spoke to people outside the field about work I had read concerning, say, what Bourdieu could tell us about the rhetoric of play in Super Mario Bros, we both ended up at a loss. I know that sounds like a terribly cutting value judgment of that kind of work, but I don’t intend it to be. I mean simply that over time I became too frustrated by how incomprehensible the work I was reading for school appeared to anyone outside of a small handful of subfields. If I understand the field correctly as an outsider, this is a similar dynamic to that of anthropology, where evolutionary anthropologists engage in some of the “hardest” science possible while in the same departments many cultural anthropologists reject their work as inherently masculinist, naively positivist, and hegemonic.
From my completely anecdotal standpoint, the political and cultural side of second language studies is growing, the quantitative side shrinking or breaking off to join other broad disciplinary identities. I might be wrong about that. But either way, I am left to ponder whether these trends threaten the long-term existence of these fields – and by extension the humanities writ large, which are now dominated by a narrow set of political theories that insist on the inherent immorality of many conventional ways of looking at the world and thinking about it. As I have said, I value many things that have emerged from cultural studies. But those within the field sometimes seem eager to confirm every ugly stereotype the outside world has about hectoring, obscure, leftist academics, and there appears to me to be little in the way of professional or social incentives to compel professors to think and speak in a more pragmatically self-defensive way. One of my core beliefs about the academy is that how we talk about our research and teaching matters, that we can act as better or worse defenders of our fields and institutions if we pay attention to what the wider world values. But we are not currently doing a good job of that. At all.
Perhaps this is all just a long fad, and times will change for writing studies and the humanities writ large. There are trends like digital humanities which cut in the opposite direction, though they are ferociously contested in academic debates. I suppose time will tell. I worry that by the time some of these trends have worked themselves out, there will not be much left of the humanities to fight over.