The quality of our colleges and universities – particularly for undergraduates – should be a topic we all care about as a country. College is crucial in educating and preparing young people to succeed in an increasingly competitive global economy.
We’ve seen for some time the disturbing data that America is falling behind other countries in the number of students who attend and complete post-secondary education. Now, new data suggests that many U.S. students who make it to college, and even succeed there, are actually learning very little.
The data comes from the book Academically Adrift, which raises some fundamental and surprising questions about the quality of U.S. undergraduate education. The authors, Richard Arum and Josipa Roksa, are sociologists who analyzed results from essay tests and surveys given to more than 2,000 students at the beginning of their freshman year and the end of their sophomore year. Between 2005 and 2007, data was collected from 24 four-year institutions, including state universities and liberal-arts colleges.
Two key findings have received a lot of attention:
- About 45 percent of the students showed no improvement in critical thinking, complex reasoning or written communication during their first two years in college. (On more recent tests, the students didn’t show much improvement in their junior or senior years, either.)
- Students said most of their courses required surprisingly little effort. They reported studying only slightly more than 12 hours per week on average. Few of their courses required 40 pages or more of reading per week or writing as much as 20 pages over the course of a semester.
Before reading this book, I took it for granted that colleges were doing a very good job. But there is really no measurement or feedback system that tracks results to help guide students and help institutions improve. Not overall, and not for individual courses of study. What do students in different programs learn? How many graduates get jobs in their field, and how much do they earn? The outputs of higher education are deeply understudied.
The dismal results presented in Academically Adrift are based on the Collegiate Learning Assessment, a standardized test in which students are asked to make a practical decision – such as, what kind of airplane a company should buy – and explain their choice based on a set of goals and facts about different options. One criticism of the book is that it doesn’t look at subject-matter learning. But I think most people would agree that skills like critical thinking, complex reasoning and writing – the things the test does measure – are pretty important.
Beyond the top-line results, the authors gathered thousands of data points on variables you would hope might explain why learning is so limited. Unfortunately, most of those variables don’t seem to make much difference. The book nevertheless analyzes many of them, making it a hard statistical slog at times.
Not too surprisingly, more learning takes place among students who take demanding courses and who say their professors have high expectations. Science students make better-than-average progress, even in their writing skills.
Overall, the book depicts a culture in academia where undergraduate learning is only a peripheral concern; where the professors don’t want to assign complicated papers because grading them is hard work; where the main feedback is course evaluations from students who dislike writing complicated papers; where there’s an attitude of, “Don’t mess with us and we won’t mess with you.” And there’s no accountability for any of it.
This may be a caricature. Many students still do very hard work in college. U.S. graduate education remains highly rigorous and leads the world. And many community colleges have made serious efforts to build programs around employers’ needs and to make sure students gain the skills to succeed in the workplace.
For example, Melinda and I were impressed a couple of years ago when we visited the Tennessee Technology Center in Nashville, an institution that provides young adults with technical training and certificates. Its graduation rates are significantly better than those of its peers because it focuses on teaching job skills that are in high demand, and it has adapted to meet the needs of students who are juggling school with work and family.
I’m also impressed by the results at places like Western Governors University. Its low-cost online programs rely on competency-based progression, not class time or credit hours. It uses external assessments to evaluate student proficiency. And because its students are a little older and possibly more focused in their goals, its graduation rates are high and the salaries its graduates earn are good.
Because of institutions like the Tennessee Technology Center and WGU, I’m optimistic about the potential of innovation to help solve many of the problems with our post-secondary system. But we need more and better information. I’m reminded of a point made by Andrew Rosen of Kaplan, the for-profit education company: colleges today know more about how many kids attend basketball games and which alumni give money than about how many students showed up for economics class during the week, or which alumni are having a hard time meeting their career goals because of shortcomings in their education.
That needs to change.