on Kevin Carey’s piece on the quality of American higher education

I have many issues with this piece on the quality of American colleges and universities by Kevin Carey of the ed reform shop the New America Foundation. Let’s dive deep.

Far from being complacent about higher education, America is the site of a perpetual crisis narrative about our colleges. Carey writes, “While policy wonks hotly debate K-12 reform ideas like vouchers and the Common Core state standards, higher education is largely left to its own devices.” It’s incredible that someone who works in higher education policy circles can write this: since tertiary education became a mass phenomenon in the United States, we’ve essentially never not been freaking out about it.

I’ve recently finished a chapter of my dissertation about the perpetual crisis narrative in higher education. Since the GI Bill produced a major expansion of access to college education, there has essentially never been a time when the federal government wasn’t getting involved in higher ed, and doing so under crisis rhetoric. The federal government commissioned major reviews of higher education under the Truman, Eisenhower, Kennedy, Reagan, George H.W. Bush, George W. Bush, and Obama administrations. The notion of a crisis in higher education is common to all of them. The reason changes — our colleges are unprepared for all of the WWII veterans who will go to school when they get home, we need better learning to fight the Cold War, we need to prepare students to win the space race, we need to compete economically with Germany/Japan/China, etc. — but the notion of a crisis is ever-present. Hell, A Nation at Risk is one of the most hysterical documents I’ve ever read.

And as Carey is surely aware, the last two presidential administrations have attempted to directly change the course of American higher education. The Spellings Commission was one of the most direct attempts by government to date at enforcing a particular vision of collegiate education, this one driven by calls for “accountability” and standardized assessment. The regional accreditation agencies have responded, as they did with A Nation at Risk, although not entirely as reformers would have hoped. Now the Obama administration is giving this movement teeth by tying access to federal funds to a series of rankings and making standardized assessments a large part of those rankings. Setting aside the wisdom or fairness of these proposals, they are the opposite of leaving colleges to their own devices.

Carey fails to account for socioeconomic and demographic differences — despite the fact that the PIAAC study provides exactly that information. The most important part of doing legitimate and responsible educational comparisons — literally the most important thing — is making sure that you’re comparing like with like. In education research, student performance needs to be placed in context according to the major demographic factors that dictate educational outcomes, in particular income level and parental education level. Anyone reading this, I assume, has already read me go on about the power of these demographic factors at length, but for a recap I’ll just quote Diane Ravitch in her book Reign of Error:

American students in schools with low poverty — the schools where less than 10 percent of the students were poor — had scores that were equal to those of Shanghai and significantly better than those of high-scoring Finland, the Republic of Korea, Canada, New Zealand, Japan, and Australia. In U.S. schools where less than a quarter of the students were eligible for free or reduced-price lunch (the federal definition of poverty), the reading scores were similar to those of students in high-performing nations. Technically, the comparison is not valid, because it involves comparing students in low-poverty schools in the United States with the average score for entire nations. But it is important to recognize that the scores of students in low-poverty schools in the United States are far higher than the international average, higher even than the average for top-performing nations, and the scores decline as poverty levels increase, as they do in all nations.

This is hardly revelatory stuff; anyone talking responsibly about education must acknowledge the determinative power of these factors from the get go. And we know that completing a college degree is itself highly dependent on these demographic factors:


[Chart: college completion rates by demographic factors — source: the College Board]

Now, the PIAAC looks at how college-educated adults perform on a test, not at whether they finish college. But we have strong evidence that GPA performance in college is influenced by demographic factors. For example, a 1999 study of more than 5,000 college students found that personal background factors, such as parental income and parental education level, had a strong influence on college GPA.
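To make the like-with-like point concrete, here’s a toy calculation — invented numbers, not real PIAAC data — showing how two systems whose students perform identically within each socioeconomic group can still post very different unadjusted national averages, purely because of demographic composition:

```python
# Toy illustration (invented numbers, not real PIAAC data): two systems
# whose students score identically within each parental-education group
# can still show different unadjusted national averages if the groups
# are differently sized.

# Mean score by parental-education group (identical in both "countries").
group_means = {"low": 240, "high": 290}

# Share of test-takers in each group differs by country.
composition = {
    "Country A": {"low": 0.50, "high": 0.50},
    "Country B": {"low": 0.20, "high": 0.80},
}

for country, shares in composition.items():
    raw_mean = sum(shares[g] * group_means[g] for g in group_means)
    print(f"{country}: unadjusted mean = {raw_mean:.0f}")

# Country A averages 265 and Country B averages 280 — a 15-point "gap"
# driven entirely by who sits for the test, not by anything the
# schools did differently.
```

This is why a comparison that doesn’t stratify by socioeconomic background can’t tell you anything about institutional quality.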

But perhaps Carey would speculate that, against all sense, these socioeconomic factors wouldn’t affect PIAAC scores the way they affect GPA and dropout rates. But Carey didn’t actually have to speculate; he only had to listen to PIAAC itself! Here, from page 104 of the PIAAC study:

Adults from socio-economically advantaged backgrounds have higher average proficiency in the three domains assessed in the survey, than those from disadvantaged backgrounds (socio-economic background is proxied by parents’ educational attainment). Score differences on the literacy scale related to socio-economic background are largest in Germany, Poland and the United States, while they are smallest in Estonia, Japan and Korea. After accounting for other characteristics, the differences in literacy proficiency associated with socio-economic background are substantially smaller. This is because an individual’s educational attainment often mirrors that of his or her parents.

Later, on page 113:

The largest difference in both literacy and numeracy proficiency between adults with at least one parent who had high levels of educational attainment (i.e. from socio-economically advantaged backgrounds) and those with both parents who had low levels of educational attainment (i.e. from socio-economically disadvantaged backgrounds) is observed in the United States and Germany (57 and 54 points, respectively). These are also the countries with the lowest average literacy score among adults with neither parent having attained upper secondary education. In contrast, Australia, Estonia, Japan and Sweden show the smallest difference (28-33 points) between these two groups of adults. These countries also feature relatively higher scores among adults with neither parent having completed upper secondary education.

In other words, in a piece criticizing the higher education system of a country of great socioeconomic inequality, based on data from a study, the author failed to mention that the study identifies that country as having an unusually large impact from socioeconomic factors. It’s incredible that Carey failed to disclose that; it borders on dishonesty. I could easily see the Upshot running a piece that leads with this information — American Literacy and Numeracy Highly Influenced by Socioeconomic Background, Study Shows — if that were how the data were being spun. To not point out the impact of socioeconomic differences on PIAAC scores when those differences are laid out by the study’s own authors is a major failing.

America is an unequal country, and our educational outcomes are unequal for this reason. That’s not disputed by this data, but rather supported by it.

Carey acknowledges that American students come in lower, then ignores that fact. Carey points out that our primary and secondary students lag far behind in PISA scores, meaning that they come to American colleges and universities further behind their international counterparts. Yet Carey continues to take American universities to task for not reaching parity with international peers. He’s looking at a comparison in which some systems have a large head start and then complaining that our system doesn’t win the race. He writes, “Instead, Piaac suggests that the wide disparities of knowledge and skill present among American schoolchildren are not ameliorated by higher education.” But how could they be ameliorated if other countries’ students are continuing to learn too? Again, what this data suggests is not that our schools are doing a poor job but rather that student-side factors are more determinative of educational outcomes like PIAAC scores than school-side factors.

Simplistic numeracy tests are very poor measures of college learning; better assessments, like the Collegiate Learning Assessment, show strong student growth. It’s essential to say: the PIAAC study is not intended for the purpose Carey uses it for. The study is not intended as an assessment of collegiate learning. For example, on page 118, the study says, “The formal education system is not the only setting in which the skills assessed in the Survey of Adult Skills are developed. Learning occurs in a range of other settings, including the family, the workplace and through self-directed individual activity.” And it’s entirely unclear to me that the kind of numeracy and literacy items the PIAAC uses are an appropriate mechanism to test college students. Do you remember taking a class in “numeracy” in college? You can’t fault schools for failing to teach students things they never intended to teach them.

In contrast, I think an assessment system like the Collegiate Learning Assessment+ is probably a better gauge, in part simply because it’s designed for the purpose of measuring college learning. There are deep problems with value-added modeling, and while I am a qualified supporter of the CLA/CLA+, there are some issues with it as well. (Unfortunately for my dissertation research, it seems that not being an unqualified supporter makes it harder to get access to information. But that’s a discussion for another time.) But it’s an instrument that’s far more suited to this kind of comparison than the PIAAC, and one that tries specifically to show how these different skills work in concert, rather than separated and deracinated. Take a look at this regression of CLA scores over SAT scores.

[Figure: CLA scores regressed on SAT scores, with separate freshman and senior regression lines]
What we see here is encouraging: the gap between the two regression lines is large, indicating that American college students are learning a great deal in school. Indeed, you can read about this learning at length in the Council for Aid to Education’s paper “Does College Matter?” However, we can also see that the gap between those who start out the lowest (on the left) and those who start out the highest (on the right) is not made up by college education; the gap is just too big. Again, this comports with my point above and with my broader take on education as a whole: schooling works, but it can’t close gaps caused by demographic differences, in large measure because those at the top keep learning as well. I don’t doubt that many other countries would have students who end up higher on the CLA+ than ours do — but I also don’t doubt that they’d start out higher as well. And I also don’t doubt that there are structural economic reasons for this advantage.
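For what it’s worth, the shape of that chart is easy to mimic with simulated data. This sketch uses invented parameters, not the actual CAE/CLA dataset, and assumes for illustration that seniors gain a constant amount at every SAT level; it fits separate freshman and senior regression lines and reads off the two quantities at issue — the vertical gap between the lines (growth during college) and the gap between low and high starters, which persists:

```python
# Simulated sketch (invented parameters, NOT the real CAE/CLA data):
# fit freshman and senior regressions of CLA-style scores on SAT scores.
import numpy as np

rng = np.random.default_rng(0)
sat = rng.uniform(800, 1600, size=500)

# Assumed data-generating process: seniors score ~120 points higher
# than freshmen at every SAT level, plus noise.
freshman = 0.8 * sat + 200 + rng.normal(0, 40, size=sat.size)
senior = 0.8 * sat + 320 + rng.normal(0, 40, size=sat.size)

# np.polyfit(x, y, 1) returns [slope, intercept].
fr_slope, fr_int = np.polyfit(sat, freshman, 1)
sr_slope, sr_int = np.polyfit(sat, senior, 1)

# Vertical gap between the two regression lines = learning in college.
mid = 1200
growth = (sr_slope * mid + sr_int) - (fr_slope * mid + fr_int)
print(f"vertical gap between lines at SAT {mid}: ~{growth:.0f} points")

# Gap between low and high starters among seniors: college narrows
# nothing here, because everyone gained the same amount.
low, high = 900, 1500
persistent = sr_slope * (high - low)
print(f"senior-year gap, SAT {low} vs. SAT {high}: ~{persistent:.0f} points")
```

Under these assumptions the fitted lines sit roughly 120 points apart (real growth), while the left-to-right spread among seniors remains several hundred points — exactly the pattern of “schooling works, but it doesn’t close demographic gaps.”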

The notion that economic competitiveness depends on performance on educational metrics is broadly assumed but relatively unproven. As is very common with reform types, Carey argues that we need to fix this supposed crisis because educational performance is strongly tied to economic performance. This is ed reform boilerplate, but it’s never been clear to me how strong the evidence is. I think having a highly educated workforce is probably a good thing economically, but we should be clear: American education has never looked good in international comparisons, even during boom economic times. Often, people assert that we’ve fallen from our perch as the world leader in education. But not only were we never the best, we’ve never been close to the best. In a comprehensive review of the evidence, David E. Drew demonstrates that as long as people have been making rigorous international comparisons of educational outcomes, the United States has done very poorly. This was true in the 1960s; it was true in both the boom and bust times of the 1980s; it was true in the go-go economy of the late 1990s. Whatever the relationship between student test scores and the economy, it’s not a simple one.

There is a conversation to be had about the quality of our higher education system. I think leaving research out of that conversation, as Carey does, is nuts, particularly given that he specifically invokes our economic well-being. And surely there is a difference between prestige and the quality of undergraduate education. But I don’t think using the PIAAC data is a responsible way to approach the question, and I think Carey has failed to address the central problem of comparing like with like in educational data. I worry that he began from a particular conclusion and looked around for evidence to reach it.


4 Responses to on Kevin Carey’s piece on the quality of American higher education

  1. Pingback: Sunday Links for the Sunday Reader | Gerry Canavan

  2. Dai Ellis says:

    Freddie — thanks for sharing the link to the CAE ‘Does College Matter’ piece. I hadn’t seen that yet and it’s interesting to see the stark difference in results between this study and Academically Adrift. I wonder if the CAE study has an apples-to-apples issue: since its freshman/senior testing methodology isn’t cohort-based — and since we know there is such high attrition at the average US college — how do they control for lower-performing students dropping out between freshman and senior year? Also, on a quick scan I wasn’t clear if they had a control group (just comparing 18 vs. 21 year old CLA performance for students not enrolled in higher ed)?

    • Freddie says:

      I’m on my phone so more later, but I want to say that Academically Adrift is also not longitudinal, I believe.

  3. Pingback: July Links | The Hyperarchival Parallax
