cultural studies, ironically, is something of a colonizer

For a large academic project I’m working on, I’ve been trying to do something that is rather rare: discuss cultural studies and its practices in the academy in a nuanced and evenhanded way. Unfortunately, cultural studies and related fields have become the Battle of Verdun in our culture war, and typically any support is sorted into “SJW bullshit” by critics, and any criticism into “reactionary proto-fascism” by supporters.

This is unfortunate because like all fields cultural studies has its strengths and its weaknesses. Has cultural studies been stereotyped and caricatured by its critics, reduced to a set of entirely unfair associations and impressions, forced constantly to defend the worst excesses of individual members, and in general been equated with its most controversial work while its most powerful and generative work goes largely undiscussed? Absolutely yes. Is there also a powerful culture of groupthink and political conformity in the field, a social system of mutual surveillance where everyone constantly monitors each other for the slightest possible offense, and a set of publishing incentives that actively encourage obscurity and indigestible prose? I think the answer is also yes. But as long as the field is a battlefront in a much larger political-culture war, very few people will feel comfortable nuancing these distinctions, sorting the good from the bad.

What’s hard for people outside of academia (and some within academia) to understand is that cultural studies has a habit of, if you’ll forgive the term, colonizing other fields in the humanities and social sciences. As bad as the reputation of these assorted fields has gotten outside of the academy, and as tenuous as funding is, they have been remarkably successful at insinuating their views and language into other fields – sometimes in good ways, sometimes bad.

I self-identify as an applied linguist, specializing in educational assessment, and I spend most of my time these days reading, researching, and writing work that many would identify with the field of education. But I came up through programs in writing studies/rhetoric and composition, and I retain an interest in that field. I left it, spiritually if nothing else, because I am interested in quantitative empirical approaches to understanding writing, language learning, and assessment, and it had become clear that there was no room for empirical approaches as commonly defined in the field, at least beyond case studies of a handful of students or texts. I don’t think my own path is particularly interesting, but I think it is interesting and relevant how composition changed over time.

See, a lot of the origin story of rhet/comp/writing had to do with its methodological diversity. In the 1960s and 1970s, scholars who valued teaching writing and wanted to do it better were stymied in their English departments, where most faculty considered the study of literature preeminent and pedagogical work unimportant, especially when it came to writing. (This is an origin story, remember, so exaggeration and generalization are to be expected.) At an extreme, some professors who were interested in researching writing pedagogy were told not to bother to put pedagogical articles into their tenure files. These scholars, concentrated particularly in large land grant public universities in the Midwest, decided that they could never be taken seriously within literature-dominant programs and set out to create their own disciplinary and institutional structures.

Core to their new scholarly identity was methodological diversity. Their work was empirical, because investigating what works and what doesn’t when teaching students to write is a necessarily empirical practice. Their work was theoretical, because much of writing pedagogy involves considerations of how students think as well as write, and because the basic tool of humanistic inquiry is abstraction. Their work was also often literary, as many of these professors were trained in literature, retained interest in that field, and saw literature as a key lens through which to teach students to write. Their work was historical, as they often used the ancient study of rhetoric as a set of principles to guide the teaching of writing, supplying a time-tested array of habits and ideas to the somewhat nebulous subject-domain of writing. I could go on.

So you have someone like my grandfather, who predated the field but was something of a proto-member at the University of Illinois, whose large published corpus includes pragmatic pedagogical advice for how to teach students to read and write, essays on poetry that would appear comfortably in a literature journal, research articles where he hooked students up to polygraph machines to better understand how anxiety impacted their writing habits, and political treatises about why the humanities teach us to oppose war in all of its forms. The ability to do so much as a researcher, and get published doing all of it, always seemed very attractive to me.

I had always envisioned a field of writing studies that was as methodologically and philosophically diverse as its lingering reputation. There would be an empirical wing and a cultural studies wing and a practical pedagogy wing and a digital wing, and so on. There’s no reason these things would be mutually exclusive. But as I found as I moved through my graduate programs, in practice cultural studies pretty much ate the field – or so goes the case I’ll be making in this ongoing project I referenced earlier. That’s a big case to make and it requires a healthy portion of a book-length project to make it fairly. I can tell you though that if you pull a random article from a random journal in writing studies you will likely find very little about writing as traditionally understood and a great deal about hegemony, intersectionality, and the gendered violence of discourse. Empirical work as traditionally conceived is almost entirely absent. Today I talk to people in other wings of the humanities who tell me, straight out, that they can’t understand how composition/writing studies is distinct from cultural studies at all.

Why? Well, academia is faddish, particularly as pertains to the job market, and the strange forms of mentorship and patronage that are inherent to its training models means that there are network effects and path dependence that dictate subfields. But more, I think, the moral claims of cultural studies make it uncomfortable to study anything else. Because these critiques tend to make methodological differences not abstract matters of different legitimate points of academic view, but rather straightforwardly moralizing claims about the illegitimacy of given approaches to gathering and disseminating knowledge.

I want to preface this by saying that I know “cultural studies professors say it’s bigoted to do science” sounds like a conservative caricature of the humanities, but it is absolutely a position that is held straightforwardly and unapologetically by many real-world academics. I’m sorry if it seems to confirm ugly stereotypes about the humanities, but it is absolutely the case that there are prominent and influential arguments within the field that represent quantification as not just naive “scientism” but as part of a system of social control, a form of complicity with racism, sexism, and the like. I know this sounds like a story from some bad conservative novel, but it is not unheard of for rooms full of PhDs to applaud when someone says that, for example, witchcraft is just another way of knowledge and that disputing factual claims to its power is cultural hegemony.

The idea that conventional research and pedagogy are straightforwardly tools of power is abundant. Take Elizabeth Flynn:

…beliefs in the objectivity of the scientist and the neutrality of scientific investigation serve the interests of those in positions of authority and power, usually white males, and serve to exclude those in marginalized positions….

Feminist critiques of the sciences and the social sciences have also made evident the dangers inherent in identifications with fields that have traditionally been male-dominated and valorize epistemologies that endanger those in marginalized positions.

This might sound pretty anodyne, but in the context of academic writing, it’s extreme. In particular, the notion that empirical methodologies actually endanger marginalized people is a serious charge, and one that is now ubiquitous in fields that are social sciences-adjacent. There are those in academia who believe not just that empirical approaches to knowledge are naive or likely to serve the interests of power but actively, materially dangerous to marginalized people. And there are those who prosecute this case within our institutions and journals quite stridently and personally.

This results in some awkward tensions between pedagogical responsibility and political theory. Patricia Bizzell exemplified the perspective that the purpose of teaching is to inspire students to resist hegemony, rather than to learn, say, how to write a paper – and that professors have a vested interest in making sure they stay on that path:

…our dilemma is that we want to empower students to succeed in the dominant culture so that they can transform it from within; but we fear that if they do succeed, their thinking will be changed in such a way that they will no longer want to transform it.

This strange, self-contradictory attitude towards students – valorizing them as agents of political change who should rise up and resist authority while simultaneously condescending to them and assuming that it is the business of professors to dictate their political project – remains a common facet of the contemporary humanities.

The broad rejection of research as a process of learning more about a world outside our heads, and of pedagogy as an attempt to share what we’ve learned therein with students, is quite prevalent. Take the late James Berlin, offering up a critique of these supposedly-naive assumptions:

Certain structures of the material world, the mind, and language, and their correspondence with certain goals, problem-solving heuristics, and solutions in the economic, social, and political are regarded as inherent features of the universe, existing apart from human social intervention. The existent, the good, and the possible are inscribed in the very nature of things as indisputable scientific facts, rather than being seen as humanly devised social constructions always remaining open to discussion.

Well. I am a rather postmodern guy, actually, compared to many, but I confess that I do believe that certain structures of the material world are inherent features of the universe. Though I am always open to a good discussion.

There are many other critics of the pursuit of knowledge as commonly understood in the field’s history, such as Mary Lay, Nancy Blyler, and Carl Herndl, and some of them are quite adamant in their rejection of the inherent hegemonic impulses of conventional research. This post will already be quite long, so I don’t want to get off on a tangent about postmodernism and change. I’ll just quote this apt observation from Zygmunt Bauman:

behind the postmodern ethical paradox hides a genuine practical dilemma: acting on one’s moral convictions is naturally pregnant with a desire to win for such convictions an ever more universal acceptance; but every attempt to do so just smacks of the already discredited bid for domination.

In any event, by 2001 John Trimbur and Diana George would write “cultural studies has insinuated itself into the mainstream of composition.” By 2005, Richard Fulkerson would say plainly, “in point of fact, virtually no one in contemporary composition theory assumes any epistemology other than a vaguely interactionist constructivism. We have rejected quantification and any attempts to reach Truth about our business by scientific means.” And so a field that in my grandfather’s era enjoyed great epistemological and methodological diversity became a field that only told one kind of story.

I don’t mean to exaggerate the uniformity. There are of course critics of the “cultural turn,” whether from empiricists like Davida Charney and Richard Haswell or theorists like Richard Miller and Thomas Rickert. And there is a diversity of subjects in writing studies research. But to a remarkable degree, the epistemological assumptions of cultural studies rule the field, and indeed the way that diversity is achieved is through applying a cultural studies lens to different subjects – a library full of dissertations on the cultural studies approach to Dr. Who, the cultural studies approach to Overwatch, the cultural studies approach to the communicative practices of the EPA, the cultural studies approach to Andrew Pickering’s theoretical construct of “the mangle.” I don’t dismiss any of these projects as projects; I am generally committed to radical cosmopolitanism when it comes to other people’s research interests. But I maintain a belief that the field would be healthier and more capable of defending its disciplinary identity (and its funding) were it to include more straightforward pedagogical work, historical work, and empirical work. But there is genuine fear among graduate students and early-career academics over whether one can wander too far from the field’s contemporary obsessions.

In applied linguistics/second language studies, a fairly close sibling field, I have seen less of a field-wide colonizing and more of a split into two very different camps, marked less by conflict than by mutual incomprehension. This may be a largely idiosyncratic reading, colored more by my personal perceptions than anything else. But I do know that there are people who share the SLS banner whose work cannot talk to each other in any meaningful way. In grad school there were a large number of students whose approach to second language research was entirely in keeping with the mania for critical pedagogy, with student after student writing papers about how second language students should be encouraged to resist the hegemony of first-language practices and recognize the equal value of their own English dialect. At an extreme, this leads you to the position of someone like Suresh Canagarajah, who has long argued that research that compares the linguistic habits of second language speakers to first language counterparts is inherently judgmental and thus inherently offensive.

Meanwhile, these grad students, whose work was almost entirely theoretical and political in its methods and typically eschewed quantification altogether, would attend seminars next to students in language testing, corpus linguistics, or phonology whose work was almost purely quantitative. One group would cite Freire and Foucault while the other would run regressions and hierarchical linear models. This never erupted into real interpersonal conflict; it just meant that you had people whose work was not compatible in any meaningful way. This was always my frustration and fear in writing studies: when I spoke the language of effect sizes, ANOVAs, and p-values, I could make my work comprehensible to people from a large variety of fields. When I spoke to people outside the field about work I had read concerning, say, what Bourdieu could tell us about the rhetoric of play in Super Mario Bros, we both ended up at a loss. I know that sounds like a terribly cutting value judgment of that kind of work, but I don’t intend it to be. I mean simply that over time I became too frustrated by how incomprehensible the work I was reading for school appeared to anyone outside of a small handful of subfields. If I understand the field correctly as an outsider, this is a similar dynamic to that of anthropology, where evolutionary anthropologists engage in some of the “hardest” science possible while in the same departments many cultural anthropologists reject their work as inherently masculinist, naively positivist, and hegemonic.

From my completely anecdotal standpoint, the political and cultural side of second language studies is growing, the quantitative side shrinking or breaking off to join other broad disciplinary identities. I might be wrong about that. But either way, I am left to ponder whether these trends threaten the long-term existence of these fields – and by extension the humanities writ large, which are now dominated by a narrow set of political theories that insist on the inherent immorality of many conventional ways of looking at the world and thinking about it. As I have said, I value many things that have emerged from cultural studies. But those within the field sometimes seem eager to confirm every ugly stereotype the outside world has about hectoring, obscure, leftist academics, and there appears to me to be little in the way of professional or social incentives to compel professors to think and speak in a more pragmatically self-defensive way. One of my core beliefs about the academy is that how we talk about our research and teaching matters, that we can act as better or worse defenders of our fields and institutions if we pay attention to what the wider world values. But we are not currently doing a good job of that. At all.

Perhaps this is all just a long fad, and times will change for writing studies and the humanities writ large. There are trends like digital humanities which cut in the opposite direction, though they are ferociously contested in academic debates. I suppose time will tell. I worry that by the time some of these trends have worked themselves out, there will not be much left of the humanities to fight over.

Mechanism Agnostic Low Plasticity Educational Realism

I did a brief interview with someone who was writing a story about crowdfunded academic writing, which appears to have been killed by the prospective publication. In the interview the journalist asked me how I would define my basic philosophy on education, which I said was deeply out of fashion with most education writing. What I came up with off the cuff was “Mechanism Agnostic Low Plasticity Educational Realism,” which is I guess as good a gloss as any. This is my alternative to the Official Dogma of Education.

The basic idea is that both the overwhelming empirical evidence and common sense tell us that different people have different levels of academic ability, that they sort themselves into various achievement bands early in life, that this sorting is at scale and in general remarkably persistent over time and across a wide variety of educational contexts, and that our pedagogical and policy efforts will be most constructive and fruitful if we recognize this reality. This is not a claim that people can’t learn, or that they can’t be taught in better or worse ways. It is a claim that the portion of the variability in outcomes in any given educational metric that can be controlled by teachers or parents is dramatically lower than that which is commonly assumed.

I say low plasticity because the presumed degree to which any individual or group’s educational outcomes can be altered via schooling is usually assumed to be quite high – that is, the “no excuses” school of education philosophy, the “if you believe it you can achieve it” attitude that pervades our discourse, acts as though educational outcomes are highly plastic and subject to molding. And in contrast I suggest that the average level of plasticity in any given student’s outcomes is probably relatively low. Not zero, obviously – there are interventions that work better or worse, and we should work to maximize every student’s performance within ethical reason. And the degree of plasticity is probably variable as well; a child with a severe cognitive disability probably has more severe constraints on their outcomes than one without, just as a student who enjoys the benefits of extreme socioeconomic privilege and activist parents probably has a higher floor than the average. But across the system we should expect much less plasticity in outcomes than is commonly assumed.

I say mechanism agnostic because I am not entirely confident that we know why different people have consistently better academic outcomes than others, but we still know with great confidence that they do. Obviously, a lot of evidence suggests that differences in individual academic performance are genetic in origin. The degree and consistency of that genetic influence will need to continue to be investigated. But the details of how educational outcomes are shaped, while of immense importance, don’t change the remarkably consistent finding that different people have different levels of academic ability and that these tend not to change much over the course of life. In a policy context that has spawned efforts like No Child Left Behind, which assumes universal ability to hit arbitrary performance benchmarks, this is an essential insight.

The implied policy and philosophical changes for such a viewpoint are open to good-faith debate. As I have written in this space before, I think that recognizing that not all students have the same level of academic ability should agitate towards a) expanding the definition of what it means to be a good student and human being, b) not attempting to push students towards a particular idealized vision of achievement such as the mania for “every student should be prepared to code in Silicon Valley,” and c) a socialist economic system. Some people take this descriptive case and imagine that it implies a just-deserts, free market style of capitalism where differences in ability should be allowed to dictate differences in material wealth and security. I think it implies the opposite – a world of “natural,” unchosen inequalities in ability is a world with far more pressing need to achieve social and economic equality through communal action, as that which is uncontrolled by individuals cannot be morally used to justify their basic material conditions.

Before we get to those prescriptive conclusions, though, we need to get to the empirical observation – that the existence of a broad distribution of people into various tiers of academic ability, at certain predictable intervals and percentages, is not some error caused by the failure of modern schooling, but an inevitable facet of the nature of a world of variability. Until and unless we can have a frank discussion of the existence of persistent differences in academic ability within any identifiable subgroup of students, we can’t have real progress in our education policy.

the mass defunding of higher education that’s yet to come

This graphic represents a crisis.

For it to be a crisis does not depend on you having any conservative sympathies. For it to be a crisis does not even depend on you having an old-fashioned sense that college must be an arena of the battle of ideas, the kind of quaint notion that I grew up with and which was never seen in my academic household as at all contrary to socialist beliefs. No, for this to be a crisis requires only that you recognize that Republicans are one of two major political parties in American life, and that the structural realities of our system, and the cyclical nature of elections, ensure that there will be practical consequences of such a dire decline in popularity. Further, it helps if you recognize that in the present era, Republicans dominate American governance, with control of the House, Senate, Presidency, and crucially for our purposes, a significant majority of the country’s statehouses and governors’ mansions. They also have built a machine for state-level political elections that ensures that they will likely control many state legislatures for years to come.

I am increasingly convinced that a mass defunding of public higher education is coming to an unprecedented degree and at an unprecedented scale. People enjoy telling me that this has already occurred, as if I am not sufficiently informed about higher education to know that state support of our public universities has declined precipitously. But things can always get worse, much worse. And given the endless controversies on college campuses of conservative speakers getting shut out and conservative students feeling silenced, and given how little the average academic seems to care about appealing to the conservative half of this country, the PR work is being done for the enemies of public education by those within the institutions themselves. And the GOP has already shown a great knack for using claims of bias against academia, particularly given the American yen for austerity.

Meanwhile, in my very large network of professional academics, almost no one recognizes any threat at all. Many, I can say with great confidence, would reply to the poll above with glee. They would tell you that they don’t want the support of Republicans. There’s little attempt to grapple with the simple, pragmatic realities of political power and how they threaten vulnerable institutions whose funding is in doubt. That’s because there is no professional or social incentive in the academy to think strategically or to understand that there is a world beyond campus. Instead, all of the incentives point towards constantly affirming one’s position in the moral aristocracy that the academy has imagined itself as. The less attention one pays to how the university and its subsidiary departments function in our broader society, the greater one’s performed fealty to the presumed righteousness of the communal values. I cannot imagine a professional culture less equipped to deal with the current threats than that of academics in the humanities and social sciences. The Iron Law of Institutions defines the modern university, and what moves someone up the professional ranks within a given field is precisely the type of studied indifference to any concerns that originate outside of the campus walls.

Universities make up a powerful lobbying bloc, and they have proven to be durable institutions. I don’t think you’ll see many flagship institutions shuttered soon. But an acceleration of the already-terminal deprofessionalization of the university teaching corps? Shuttering departments like Women’s Studies or similar? Passing harsh restrictions on campus groups and how they can organize? That’s coming, and despite showy nihilism from people who will insist that I am naive to imagine there was any alternative, our own behavior will make it easier for reactionary power, every step of the way. I assure you: there are many things that they can do to us that will make life for all of us much worse, and your self-impressed indifference will not shelter you.

In 2010 I wrote of Michael Berube’s What’s Liberal About the Liberal Arts?, “the philosophy of non-coercion and intellectual pluralism that Berube describes and defends so well isn’t just an intellectual curiosity, but an actual ethos that he and other professors live by, and which defends conservative students.” I grew up believing that most professors lived by that ethos. I don’t, anymore. It really has changed. For years we fought tooth and nail to oppose the David Horowitzes of the world, insisting that their narratives of anti-conservative bias on campus were without proof. Now, when I try to sound the alarm bells to others within the academy that mainstream conservatism is being pushed out of our institutions, I get astonished reactions – you actually think conservatives should feel welcomed on campus? From arguments of denial to arguments of justification, overnight, with no one seeming to grapple with just how profound the consequences must be. We are handing ammunition to some very dangerous people.

David Brooks has a column out today. That means that social media is going through one of its most tired types of in-group performance, where everyone makes the same jokes and the same tired “analysis” of whatever his latest dumb argument is, over and over again. None of the jokes are funny, none of the analysis useful, but this ritual fulfills the very function that Brooks is talking about in his column: making fun of David Brooks is one of the ways that bourgie liberals signal to other bourgie liberals that they are The Right Kind of Person. Brooks, of course, is incapable of really understanding his own observations, given his addiction to just-so stories about character and gumption and national grit. He does not see, and can’t see, the economic structures that dictate so much of American life, nor is he constitutionally capable of understanding the depths of traditional injustices and inequality. If he did, he wouldn’t have the column.

But his critics can’t see something that, for all of his myopia, he always has: that our political divide is increasingly bound up in a set of class associations and signals that have little to do with conspicuous consumption and everything to do with a style of self-performance that few people ever talk about but everyone understands. It is the ability to give such a performance convincingly that, in part, people buy with their tuition dollars.

That this condition makes egalitarian politics a part of elite class formation has gone little discussed in my political home, the radical left. I have been excited to see a recent groundswell of young left-aligned people, and many of them are bright and committed. But almost none of them seem aware of the fact that their ironic Twitter accounts and cultural references and received opinions on all manner of political issues are as sure a sign of their class identity as a pair of wingtips and a blazer once was. And until and unless they understand how powerfully alienated the great mass of this country is from their social culture, we cannot hope to build a mass left-wing movement and with it do good things like defend public education. I agree: it’s the economy, stupid, and we must appeal to them by making the case that things like universal free college are good. But if recent political history tells us anything it’s that no economic policy, no matter how sensible, can win if its proponents refuse to grapple with the politics of resentment. The left, broadly, has not done a good job of that. The professoriate? My god.

I am unapologetically a part of several different elites myself. I would like to think I am at least aware of it, and at least capable of giving the devil his due by saying that David Brooks is not wrong about who sees class, how they see it, and why it matters.

I owe my life to our university system and I am a product of public schools. Public education is the core concern of my professional and intellectual life. Our public universities are under massive pressure and at immense risk. Their enemies are a powerful, well-funded, and relentless political movement that has one and only one remaining impulse, which is to destroy its perceived enemies. Those who should be defenders of public universities have created a culture that is not just indifferent to attempts to effectively defend our values in open debate but that now mocks the concept of public debate as a conservative shibboleth. I can see no short-term evolution in the culture of the academic left that will enable us to become effective champions of our own institutions, the monsters are coming, and I am afraid.

programming note

Please forgive the light posting this (holiday) week. We’ll be back to a regular schedule next Monday, and there’s a book review I’m very happy with coming to $5 Patreon patrons tomorrow. I wrote something on Medium here that explains some aspects of my life that have been relevant to my written production. Hope you’re enjoying yourselves and see you soon.

if you’re in school, try the curriculum

I mean that both literally and figuratively.

Back in graduate school, I knew a few people at a couple of different institutions who were kind of trapped, who had trapped themselves. They were people who were formally enrolled in graduate programs and seemingly taking all the necessary steps. But they could never mentally commit. In fact, they would frequently denigrate the idea of going to grad school while in grad school. They would make constant "grad students, am I right?" jokes. They would disdain dissertations and seminars. They would attend departmental functions but would never stop making self-consciously ironic comments. They were in these programs but never wanted people to forget that they found the whole thing laughable.

The psychological reasons are obvious: grad school is long, the work is hard, the lifestyle is often isolating and lonely, the employment prospects in many fields are dubious, and there's an ambient cultural disdain for grad students. Holding the experience at arm's length, to some small degree, inoculates you against those fears and indignities. The problem is that you're still there, doing it, and it still takes a hell of a lot of work whether you're ironizing the experience or not, and in the face of that kind of slog, a simple and uncomplicated belief in what you're doing is a far better emotional tool than irony and detachment. The people in grad school who were just enthusiastically engaged thrived. The people who were in-but-out suffered through the program; they had to contend not only with the toil and low pay but with their own constant insistence that it was all a joke.

They were, it seemed to me, in a truly miserable position entirely of their own making.

In life in general, I find, one of the easiest types of self-injury to inflict is refusing to mentally and emotionally commit to that to which you are formally or practically committed. It's a growing problem in a society that has fundamentally misunderstood what irony is and what it's for. You get people who don't know how to function without putting everything they're doing in scare quotes. And that just kills your ability to deal with the daily indignities of life. You're obligated to go through with what you've committed yourself to, but you can't really commit. It leaves you with the burden of the work but without the emotional support of genuine resolve. There has to be a space between living like you actually take inspirational Instagram memes seriously and living in a state of constantly mocking the conditions of your own existence.

If you look at Twitter, you’ll see a perfect example of an entire community defined by what I’m talking about. You have people who have tweeted literally hundreds of thousands of times who will then laugh off the importance of those tweets. They’ll say things like “imagine caring about online” when they are never not online. They will meticulously craft a persona that they then represent as meaningless to them. They’ll laugh at someone who brags about their follower numbers, but they’ll also laugh at someone who doesn’t get a lot of retweets or favs. They clearly care but are terrified of betraying that emotional commitment. And I think it plays a really big role in why that platform is such a font of abuse, unhappiness, and conflict. I really do.

I really messed up back in high school, and I’ve always regretted it, though I had some excuses. I wish I could go back and talk to myself. I’d say, look – I know you don’t want to be here. I know this all seems pointless. I know you’re enduring daily indignities. But look – you have to come to school. So why not try working the program a little? I can’t go back, but I can commit to just doing the things I’m doing, unironically and without apology.

Irony's a vital tool for life, particularly when we are trapped in such terrible systems of inequality and authority. And I get why people find Dave Eggers-style "new sincerity" stuff so obnoxious. But irony has been applied both too liberally and with too little regard for its traditional uses and meaning, and the results inevitably hurt the very people who use irony to avoid hurt. I would counsel people to ask themselves directly: what are you getting out of this refusal to simply do what you're doing when you're doing it? What has all of this irony done for you, beyond leaving you unable to directly communicate what you're feeling or to simply experience pride and satisfaction in the things you're doing? You're stuck here, with all of us, either way. Work the program, you guys. Try the curriculum.

Enjoy the holiday. See you Wednesday.

genetic behaviorism supports the influence of chance on life outcomes

I've been trying, in this space, to rehabilitate the modern science of genetic influence on individual variation in academic outcomes in the eyes of progressives. Many left-leaning people have perfectly reasonable fears about this line of inquiry, as in the past similar-sounding arguments have been used to justify eugenics, while in the present many racists make pseudoscientific arguments based on similar evidence to justify their bigotry. Like others, I am interested in showing that there are progressive ways to understand genetic behaviorism that reject racism and that support, rather than undermine, redistributive visions of social justice.

I can’t deny, though, that there are many regressive ways to make these arguments. That’s particularly true given that there’s a large overlap in the Venn diagram of IQ determinists and economic libertarians. I want to take a moment and demonstrate how conservatives misread and misuse genetic behaviorism to advance their ideological preferences for free market economics.

In this post, Ben Southwood of the conservative Adam Smith Institute uses evidence from genetic behaviorism and education research to argue that luck really doesn't play much of a role in life outcomes. To prove this point, he cites many high-quality studies showing that random assignment (or last in/last out models) to schools of supposedly differing quality has little impact on student academic outcomes. He argues that our understanding of genetic influence on intelligence should influence our perception of how much schools can really do to help struggling students. This is, in general, a line of thinking that fits with my own. But he then leaps to suggesting that what we call luck (let's say the uncontrolled vicissitudes of chance and circumstance that are beyond the control of the individual) has little or nothing to do with life outcomes. He does so because this presumably lends credence to libertarian economics, which is based on a just-deserts model – the notion that the market economy basically rewards and punishes people in line with their own merit. This leap is totally unsupportable and is undermined by the very evidence he points to.

To begin with, Southwood ignores a particularly inconvenient fact for his brand of conservative determinism: the large portion of variation in IQ and academic outcomes that remains unaccounted for even after accounting for genetics and the shared environment (code for the portion of the environment in a child's life controlled by parents and the family). There is famously (or notoriously) a portion of variation in measurable psychological outcomes that we can't explain – a large portion, as much as half of the variation, depending on what study you're looking at. And this portion seems unlikely to be explainable in systematic terms. Plomin and Daniels called this the "gloomy prospect," writing:

One gloomy prospect is that the salient environment might be unsystematic, idiosyncratic, or serendipitous events such as accidents, illnesses, or other traumas . . . . Such capricious events are likely to prove a dead end for research.

Turkheimer wrote recently:

scientific study of the nonshared environment and molecular aspects of the genome have proven much harder than anyone anticipated. But I still feel bad about harping on it, as though I am spoiling the good vibes of hardworking scientists, who are naturally optimistic about the work they are conducting. But ever since I was in graduate school, I have felt that biogenetic science has always oversold their contribution, tried to convince everyone that the next new method is going to be the one that finally turns psychology into a real natural science, drags our understanding of ourselves out of the humanistic muck. But it never actually happens.

The gloomy prospect, in other words, represents exactly the influence of what we usually refer to as luck. Southwood claims that genetics explains perhaps .90 of the variation at adulthood, but this represents an extreme upper-bound estimate of that influence. Most of the literature suggests significantly more modest heritability estimates than that. So we are left with this big uncontrolled portion, which as Turkheimer says has proven resistant to systematic understanding and which likely reflects truly idiosyncratic and individual impacts on the lives of individuals. Unfortunately for progressives who want to dramatically improve educational outcomes by changing the home environment of children, quality studies consistently find that the impact of changes to that environment is minor. Unfortunately for Southwood, the unexplained portion of academic outcomes (and subsequent economic outcomes) looks precisely like chance, or at least like that which is uncontrolled by either the individual or his or her parents. The last line of his post is thus totally unsupported by the evidence.
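To make the arithmetic concrete, here is a minimal sketch of the standard variance decomposition used in behavioral genetics (additive genetics A, shared environment C, nonshared environment E). All of the numbers below are hypothetical assumptions chosen for illustration, not estimates from any particular study.

```python
# Behavioral-genetic studies partition trait variance as A + C + E = 1:
# A = additive genetic influence, C = shared (family) environment,
# E = nonshared environment plus measurement error -- the residual
# that the "gloomy prospect" says resists systematic explanation.

def nonshared_share(heritability: float, shared_env: float) -> float:
    """Return E = 1 - A - C, the unexplained portion that looks like luck."""
    e = 1.0 - heritability - shared_env
    if e < 0:
        raise ValueError("A + C cannot exceed total variance")
    return e

# With a moderate heritability estimate, the residual is large:
print(round(nonshared_share(0.5, 0.1), 2))  # 0.4

# Even at an extreme upper-bound heritability, something is left over:
print(round(nonshared_share(0.9, 0.0), 2))  # 0.1
```

The point of the toy numbers is only that the residual term is what it is by construction: whatever genetics and the family environment jointly fail to explain lands in E, and no just-deserts story accounts for it.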

But there's an even bigger issue for Southwood here: no one is in control of their own genotype. It's bizarre when conservative-leaning people endorse genetic determinism as a justification for just-deserts economic theories. Genetic influence on human behavior stands directly in contrast to the notion that we control our own destinies. How then can Southwood advance a vision of free market economics as a system in which reward is parceled out fairly, given that the distribution of genetic material between individuals is entirely outside of their control? Which genetic code you happen to be born with is a lottery. I happened not to get a scratch-off ticket that would have let me become an NFL player or a research physicist. That's not a tragedy, because I am still able to secure my basic material needs and comforts. But not everyone is so lucky, and for many the free market will result only in suffering and hopelessness.

It is immoral, and irrational, to build a society in which conditions you do not choose dictate whether you live rich and prosperous or poor and hopeless. That is true if this inequality is caused by inheriting money from your rich parents or by inheriting their genes or by being deeply influenced by the vagaries of chance. The best, most rational system in a world of uncontrolled variation in outcomes is a system that guarantees a standard of living even under the worst of luck – that is, socialism.

Correction: Southwood has taken considerable umbrage at this post, which he expressed in a dozen-tweet missive and a Medium post. You should read that. I concede that I was uncharitable in my reading of how he talks about luck, and I recognize that he sees luck as impacting life events. I do not agree with his claim that path dependence and luck do not contribute to life outcomes, and it's weird that his post title alludes to Gregory Clark's The Son Also Rises, which demonstrates that the benefits of inherited wealth can persist for far longer than traditionally thought. But that's immaterial to the question of whether I accurately reflected Southwood's position on luck and redistribution. So consider this an apology. I should have spoken more carefully and read more charitably, and for that I'm sorry.

As for the 90% of variance figure, my wording “perhaps .90” is an accurate reading of a presentation of a range, and I don’t withdraw it. If anyone objects, I am happy to tutor them in reading, for a fee.

public services are not an ATM

Built into the rhetoric of school choice is a deeply misguided vision of how public investment works.

You sometimes hear people advocating for charters or voucher programs by saying that parents just want to take “their share” of public education funds and use it to get their child an education, whether by siphoning it from traditional public schools towards charters or by cutting checks to private schools. The “money should follow the child,” to use another euphemism. But this reflects a strange and deeply conservative vision of how public spending works. There is no “your share” of public funds. There is the money that we take via taxation from everyone which represents the pooled resources of civic society, and there is what civic society decides to spend it on via the democratic process. You might use that democratic process to create a system where some of the money goes to charter schools or private school vouchers or all manner of things I don’t approve of. But it’s not your money, no matter how much you paid into taxes. And the distinction matters.

To begin with, the constantly repeated claim that charter schools don't cost traditional public schools money has been proven wrong again and again. People lay out theoretical systems where they don't, as though you can simply subtract one student and all of the costs associated with that student and shift the kid and the money to another school. But this reflects a basic failure to understand pooled costs and economies of scale. And when we go looking, that's what we find: after years of promises that charters are not an effort to defund traditional public schools, our reality checks show they have exactly that effect. Take Chicago, where the charter school system has absolutely contributed to the fiscal crisis in the traditional public schools. Or Nashville. Or Los Angeles. I could go on.
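The pooled-costs point can be sketched with purely invented numbers: if per-pupil funding leaves with each departing student but fixed costs (buildings, administration) do not shrink, a school's budget deteriorates even though it serves fewer kids. Nothing below is drawn from any real district's finances.

```python
# Toy model of a school budget. All figures are hypothetical.
FIXED_COSTS = 500_000         # buildings, admin: don't leave with a student
VARIABLE_PER_STUDENT = 8_000  # costs that actually scale with enrollment
FUNDING_PER_STUDENT = 10_000  # the "money follows the child" amount

def annual_balance(enrollment: int) -> int:
    """Revenue minus total cost at a given enrollment."""
    revenue = enrollment * FUNDING_PER_STUDENT
    cost = FIXED_COSTS + enrollment * VARIABLE_PER_STUDENT
    return revenue - cost

print(annual_balance(300))  # 100000: solvent at full enrollment
print(annual_balance(200))  # -100000: a deficit after 100 students leave
```

Each departing student takes the full $10,000 but only relieves the school of $8,000 in costs; the other $2,000 was covering the fixed overhead that still has to be paid.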

But suppose we knew that we could extract exactly as much, dollar for dollar and student for student, from public education for each student who leaves. Would that be a wise thing to do? Not according to any conventional progressive philosophy towards government.

Do we let you take “your share” out of the public transportation system so that you can use it to defray the cost of buying your own car? Can you take “your share” out of the police budgets to hire your own private security? Can I extract my tax dollars from the public highway system I almost never use in order to build my own bike lanes? Of course not. In many cases this simply wouldn’t make sense; how can you extract your share from a building, or a bridge, or any other type of physical infrastructure? And besides: the basic progressive nature of public ownership means that we are pooling resources so that those who have the least ability to pay for their own services can benefit from the contributions of those with the most ability to pay. To advance the notion of people pulling “their” tax dollars out from public schools undermines the very conception of shared social spending. And governmental spending should require true democratic accountability; letting the Bill and Melinda Gates Foundation dictate public education policy, Mark Zuckerberg become the wholly unqualified education czar of Newark, or the Catholic church control public education dollars through voucher programs directly undermines that accountability.

So of course there's a deep and widening split opening up within the school reform coalition, which has always been filled with self-styled progressives. There's a major, existential disagreement at play about the basic concepts of social spending and the public good. These have been papered over for years by the missionary zeal of choice acolytes and their crisis narrative. But there was never a coherent progressive political philosophy underneath. The Donald Trump and Betsy DeVos education platform is a disaster in the making, but at least it has brought these basic conflicts into the light. These issues are not going away, nor should they, and the "progressive" ed reform movement is going to have to do a lot of soul searching.

diversifying the $5 reward tier

Hey gang, first I'm sorry content has been a bit light on the main site this week. Good things are coming in bunches soon. I have been releasing archival content to all subscribers on the Patreon page at a steady clip. I wanted to let you know that I've decided to diversify the $5 patron content a little. It's not so much that I'm not keeping up with the book reading – it's been a bit tough but not bad – but rather that I'm feeling a little constrained by the review format. So I'm going to alternate between book reviews and more general cultural writing, reading recommendations, considerations of contemporary criticism, etc. There will still not be any explicitly political content; that writing I host on Medium.

Book reviews return this weekend at last, though, and thanks for your patience. I’ve got a number of good ones coming up. Thank you for your continued support. If you aren’t yet a Patreon patron, please consider it. Also, thanks so much for the emails, and I apologize if I haven’t gotten back to you. I’ve taken some unexpected heat lately, and the support means more than I can say.

g-reliant skills seem most susceptible to automation

This post is 100% informed speculation.

As someone who is willing to acknowledge that IQ tests measure something real and largely persistent, I take some flak from people who are skeptical of such metrics. As someone who does not think that IQ (or g, the general intelligence factor that IQ tests purport to measure) is the be-all, end-all of human worth, I take some flak from the internet's many excitable champions of IQ. This is one of those things where I get accused of strawmanning – "nobody thinks IQ measures everything worthwhile!" – but please believe me that long experience shows that there are an awful lot of very vocal people online who are deeply insistent that IQ measures not just raw processing power but all manner of human value. Like so many other topics, IQ seems to be subject to a widespread binarism, with most people clustered at two extremes and very few staking out more nuanced positions. It's kind of exhausting.

I want to make a point that, though necessarily speculative, seems highly intuitive to me. If we really are facing an era where superintelligent AI is capable of automating a great many jobs out from under human workers, it seems to me that g-reliant jobs are precisely the ones most likely to be automated away. If the g factor represents the ability to do raw intellectual processing, then it seems likely that g will become less economically valuable as such processing is offloaded to software. IQ-dominant tasks in specific domains like chess have already been conquered by task-specific AI. It doesn't seem like a stretch to suggest that more obviously vocational skills will be colonized by new AI systems as well.

Meanwhile, contrast this with professions that are dependent on "soft" skills. Extreme IQ partisans are very dismissive of these things, often arguing that they aren't real or that they're just correlated with IQ anyway. But I believe that there are social, emotional, and therapeutic skills that are not validly measured by IQ tests, and these skills strike me as precisely those that AI will have the hardest time replicating. Human social interactions are incredibly complex and are barely understood by human observers who are steeped in them every day. And human beings need each other; we crave human contact and human interaction. It's part of why people pay for human instructors in all sorts of tasks that they could learn from free online videos, why we pay three times as much for a drink at a bar as we would pay to mix it at home, why we have set up these odd edifices like coworking spaces that simply permit us to do solo tasks surrounded by other human beings. I don't really know what's going to happen with automation and the labor market; no one does. But that so many self-identified smart people are placing large intellectual bets on the persistent value of attributes that computers are best able to replicate seems very strange to me.

You could of course go too far with this. I don't think that people at the very top of their games need to worry too much; research physicists, for example, probably combine high IQs and a creative/imaginative capacity we haven't yet really captured in research. But the thing about these extremely high performers is that they're so rare that they're not really relevant from a big picture perspective anyway. It's the larger tiers down, the people whose jobs are g-dependent but who aren't part of a truly small elite, that I think should worry – maybe not that group today, but its analog 50 or 100 years from now. I mean, despite all of the "teach a kid to code" rhetoric, computer science is probably a heavily IQ-screened field and it's silly to try and push everyone into it anyway. But even beyond that… someday it's code that will write code.

Predictions are hard, especially about the future. I could be completely wrong. But this seems like an intuitively persuasive case to me, and yet I never hear it discussed much. That’s the problem with the popular conversation on IQ being dominated by those who consider themselves to have high IQs; they might have too much skin in the game to think clearly.