g-reliant skills seem most susceptible to automation

This post is 100% informed speculation.

As someone who is willing to acknowledge that IQ tests measure something real, measurable, and largely persistent, I take some flak from people who are skeptical of such metrics. As someone who does not think that IQ (or g, the general intelligence factor that IQ tests purport to measure) is the be-all, end-all of human worth, I take some flak from the internet’s many excitable champions of IQ. This is one of those things where I get accused of strawmanning – “nobody thinks IQ measures everything worthwhile!” – but please believe me that long experience shows that there are an awful lot of very vocal people online who are deeply insistent that IQ measures not just raw processing power but all manner of human value. Like so many other topics, IQ seems to be subject to a widespread binarism, with most people clustered at two extremes and very few with more nuanced positions. It’s kind of exhausting.

I want to make a point that, though necessarily speculative, seems highly intuitive to me. If we really are facing an era where superintelligent AI is capable of automating a great many jobs out from under human workers, it seems to me that g-reliant jobs are precisely the ones most likely to be automated away. If the g factor represents the ability to do raw intellectual processing, then that factor seems likely to become less economically valuable as such processing is offloaded to software. IQ-dominant tasks in specific domains like chess have already been conquered by task-specific AI. It doesn’t seem like a stretch to me to suggest that more obviously vocational skills will be colonized by new AI systems.

Meanwhile, contrast this with professions that depend on “soft” skills. Extreme IQ partisans are very dismissive of these things, often arguing that they aren’t real or that they’re just correlated with IQ anyway. But I believe that there are social, emotional, and therapeutic skills that are not validly measured by IQ tests, and these skills strike me as precisely those that AI will have the hardest time replicating. Human social interactions are incredibly complex and are barely understood even by human observers who are steeped in them every day. And human beings need each other; we crave human contact and human interaction. It’s part of why people pay for human instructors in all sorts of tasks they could learn from free online videos, why we pay three times as much for a drink at a bar as we would to mix it at home, why we have set up odd edifices like coworking spaces that simply permit us to do solo tasks surrounded by other human beings. I don’t really know what’s going to happen with automation and the labor market; no one does. But that so many self-identified smart people are placing large intellectual bets on the persistent value of attributes that computers are best able to replicate seems very strange to me.

You could of course go too far with this. I don’t think that people at the very top of their games need to worry too much; research physicists, for example, probably combine high IQs with a creative, imaginative capacity we haven’t yet really captured in research. But the thing about these extremely high performers is that they’re so rare that they’re not really relevant from a big-picture perspective anyway. It’s the larger tiers down, the people whose jobs are g-dependent but who aren’t part of a truly small elite, that I think should worry – maybe not that group today, but its analog 50 or 100 years from now. I mean, despite all of the “teach a kid to code” rhetoric, computer science is probably a heavily IQ-screened field, and it’s silly to try to push everyone into it anyway. But even beyond that… someday it’s code that will write code.

Predictions are hard, especially about the future. I could be completely wrong. But this seems like an intuitively persuasive case to me, and yet I never hear it discussed much. That’s the problem with the popular conversation on IQ being dominated by those who consider themselves to have high IQs; they might have too much skin in the game to think clearly.

One thought on “g-reliant skills seem most susceptible to automation”

  1. […] G Reliant Skills Seem Most Susceptible To Automation by Freddie deBoer – Computers already outperform humans in g-loaded domains such as Go and Chess. Many g-loaded jobs might get automated. Jobs involving soft or people skills are resilient to automation. […]
