by Ben Mangrum
29 January 2015
At the beginning of first-year writing courses, I often assign an ungraded essay that prompts students to reflect on the purpose of higher education. For instance, I asked my most recent classes to respond to a 2014 article by Tom Keane, “Is College Worth It?” Keane’s argument is only one among the hundreds that turn up in a simple Google search about whether there’s still “worth” to a college education. Keane acknowledges that the preponderance of research suggests a strong correlation between a college degree and greater success on the job market. The Pew Research study prompting Keane’s article focuses on the millennial generation, that is, adults presently between 25 and 32 years old. The study finds that the median annual earnings for college graduates in this group are $17,500 more than those of their peers with only a high school diploma. While Keane acknowledges the clear benefits that such statistics express, he nonetheless argues that such a gap (on average, $752,500 across a working lifetime) shrinks when expressed as the internal rate of return (IRR) on a degree. That is to say, college graduates not only forgo at least four years of salary relative to their peers but also have either paid tuition during this time or incurred substantial debt. Both of these factors (non-income years and the cost of attendance) figure into the lifetime “return” on a degree. Given these additional factors, Keane says, “For a less expensive school such as UMass (about $69,000 for four years), the IRR is high—25 percent. But for pricier places costing $240,000 or more, the return is just under 7 percent.” In other words, the cheaper degree offers graduates a higher return on their investment, while the “pricier” ones…well, the benefits are less of a sell.
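Keane’s IRR framing can be made concrete with a quick sketch. The cash-flow model below is my own illustration, not Keane’s actual method: it assumes four years of outflows (a quarter of the total cost plus a hypothetical $28,000 per year in forgone wages) followed by 43 years of the $17,500 annual premium (43 × $17,500 = $752,500, the lifetime gap cited above). With these assumed inputs the absolute rates won’t reproduce Keane’s 25 and 7 percent, but the ordering does hold: the cheaper degree yields the higher return.

```python
def npv(rate, cash_flows):
    """Net present value of a series of annual cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=0.0, hi=1.0, tol=1e-6):
    """Internal rate of return via bisection.

    Assumes NPV is positive at `lo` and negative at `hi`, which holds
    for the conventional outflows-then-inflows pattern modeled here.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def degree_cash_flows(total_cost, premium=17_500, working_years=43):
    """Four years of costs, then `working_years` of the wage premium.

    The $28,000 forgone salary is a hypothetical stand-in for what a
    high-school graduate might earn during those four years.
    """
    forgone = 28_000
    outflow = -(total_cost / 4 + forgone)
    return [outflow] * 4 + [premium] * working_years

cheap = irr(degree_cash_flows(69_000))    # UMass-style total cost
pricey = irr(degree_cash_flows(240_000))  # "pricier places"
```

On these illustrative assumptions the cheaper degree comes out with the higher internal rate of return, which is the comparison Keane’s argument turns on, even though the specific percentages depend entirely on the inputs chosen.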
But before throwing out the “elite” college baby with its pricier bath water, it’s worth noting a significant flaw in Keane’s account. He takes the national median annual income for college graduates and applies it equally to “pricier places” and “less expensive schools” when calculating the IRR. Wouldn’t it be more accurate first to find the median annual income of UMass graduates and, say, Dartmouth graduates, and then compare the IRR yielded by each school’s cost of attendance in relation to its graduates’ lifetime earnings? If, for example, graduates from “pricier” schools have a higher median annual income, that gap would obviously affect the internal rate of return on those degrees. The point is that this argument’s blanket use of statistics isn’t particularly telling.
Keane’s argument, despite its flaws, is nonetheless notable within the public debate about the “worth” of a college education for the way it frames its case for “greater skills, but not necessarily a degree.” Keane insists that college is “hardly the only way to gain new skills and knowledge. Trade schools, work experience, online courses and just reading can do the trick as well.” In effect, Keane proposes that these alternative forms of education will close the employment gap with college graduates. What if, he asks us to consider, entering the workforce immediately after high school, gaining on-the-job skills, and taking online classes at night turned out to be a better financial investment? The “jobs of the future,” Keane says, will be predicated precisely upon these less expensive and more vocationally focused forms of education. College, it turns out, won’t always be worth it.
Keane’s argument—like so much of the criticism directed at college education—rejects causality in favor of correlation. A college education doesn’t cause a graduate to have better success on the job market; employers merely want the degree, which Keane describes as “a marker, a shorthand way of saying you’re smart and educated.” The degree is simply social capital, a correlate of preexisting conditions; it doesn’t cause success.
When I assign this article, students often (rightly, to my mind) criticize Keane for the speculative nature of his argument. College provides a conspicuous statistical advantage on the job market, but Keane implies that such a correlation isn’t permanent; it won’t always be the case. As the cost of attendance continues to rise, his argument goes, the IRR on a college degree will negate the market advantages it provides. But part of the problem here, as many of my students have noted, is that Keane speculates that one trend (rising tuition costs) will somehow overcome another (a widening gap in the skills, salaries, and employment of college graduates). Why should one trend cancel out the other? There’s simply no data showing that rising tuition costs will negate the employment advantages created by college. The trends are not competing for dominance; in fact, they’re interrelated, but that’s an issue for another post. If anything, the research on the relationship between employment “success” and American higher education suggests that college graduation trends will continue to widen the wage gap between college graduates and those with only a high school diploma.
But there’s something that this rebuttal fails to take into account (and this problem is why I assign the article to first-year students): What if Keane turned out to be right? His argument, after all, is speculative—but so is the reasonable rebuttal that current trends will continue and college graduates will have increasingly greater lifetime earnings. While that rebuttal troubles Keane’s argument, it leaves its essential structure and assumptions intact. There’s still the lurking problem: What if the long-term advantages on the job market were, say, equal between a college education and alternative forms of acquiring skills? What if a college degree no longer translated into substantial market benefits—would college still be worth it? By leaving the structure and assumptions of Keane’s argument intact, one would have to concede that a college education divested of its market advantages wouldn’t be worth it. The “employment-trends” rebuttal actually leaves higher education in a weaker position than its proponents realize.
Criticisms of higher education such as Keane’s are becoming increasingly common. They’re partly rooted in legitimate crises affecting our colleges and universities: the rate of tuition increases, spending wars between competing schools that leave both in weaker financial positions, and the adjunctification of core-curriculum instruction, among many other problems. But certain elements of this discourse critical of the American college system are rooted in dangerously mistaken assumptions about education.
First, many of higher education’s critics present life after college as a “market” for which students need preparation. Don’t get me wrong: employment matters. Analyzing the conditions that contribute to unemployment or that could improve family incomes ranks among the most important aspirations of higher education research. Vocational training matters, too, and not just for our politics but also for our educational philosophy. But even acknowledging the importance of “analyzing the conditions of employment” reinforces the fact that the democratic life of a nation depends in large part upon the capacity of its citizenry to question and conduct research, to construct arguments and evaluate large-scale social problems. The “market” construal of society squints at life through an extremely narrow straw, as if what matters is only our employment trends and not our ability to reflect upon its conditions and its discontents. In other words, the “market” view poses a false either/or: either “jobs” and “employment” skills or liberal arts classes. Instead, our society is more democratic—not more unequal, unfair, or unemployed—when its working citizenry is trained to think critically, to conduct research, and to evaluate not only a State of the Union address but also the product placement in a television comedy. By construing existence on the other side of education in market terms, the democratic contours of our shared life are worn away. In this view students become mere employees, not employable citizens.
Second, higher education’s critics often divorce causality from correlation in the employment data. According to this assumption, a college degree is just a piece of paper; the education gained through one’s time at a university or college could have been obtained through other means. Intelligence is not caused by a college education and is in fact formed outside the walls of academe. While there is some truth to this claim, it is often perpetuated alongside assumptions of privilege, for it presupposes that an individual apart from formal structures of education would know, say, what books to read, and how to find, afford, and secure the time to read them. This objection betrays a kind of ignorance about the way that education forms our ability to be educated.
The even more pernicious upshot of replacing “causality” with a “correlationist” view is that the latter is based not upon labor, merit, or cooperation but upon sheer privilege. Keane argues, for example, “College doesn’t make you smart. Rather, smart people go to college.” I’m not sure what to make of the fluffy word “smart” in this sentence, but I take the point to be that college doesn’t give students the ability to succeed at tasks requiring intelligence; whatever happens during those four years merely confirms the intellectual ability gained in the eighteen years prior to higher education. The obvious implication (and Keane is well aware of it, acknowledging that K-12 education, parenting, and socioeconomic background shape this early period of development) is that “smart” therefore becomes a function of privilege. And if that’s the case, the argument should shift its focus to “access,” not “employment.” Not only should higher education be more accessible (limited access is one of its principal crises), but so should a quality K-12 education. The “correlationist” critics, in other words, perform a sleight of hand: they avoid the political work of bettering public schools and increasing access to higher education by dismissing the degree as “just a piece of paper” that represents a previously attained capacity for success. They criticize the conclusion of their analysis, not its governing conditions.
Lastly, appraising the “worth” of an education according to its employment outcomes implicitly evaluates college in terms of exchange value. Whether or not college is “worth it,” according to many of its critics, is a question of what four years of a student’s time and the cost of tuition will exchange for on the job market. What advantages does a degree provide on the market, whether vocationally (the “skills” argument) or socially (the “networking” and “cultural capital” arguments)? Insofar as that question drives the discourse criticizing higher education, it will never contribute to the system’s reform but will only reinforce the problems that currently plague our post-secondary institutions. That is to say, when we define “worth” exclusively in terms of exchange value, we establish a cramped and narrow-minded view of the world, one that in turn places suffocating constraints on higher education itself. There are, after all, more kinds of value than exchange value. Indeed, there’s something not just narrow-minded but also anti-democratic about defining worth solely in terms of an exchange relation, for that definition incrementally marginalizes activities, and even portions of the public, as they yield lesser market returns. Expanding the discussion of the “worth” of higher education to include more than employment outcomes protects against this kind of tyranny by financial accounting.
Ben Mangrum recently defended his dissertation, “Land of Tomorrow: The Postwar Novel and the Rise of the New Conservative Movement,” in the Department of English at UNC-Chapel Hill. In addition to editing Ethos, Ben also writes for the project on politics, the business of sports, contemporary novels, and the academy. His academic work has appeared in such journals as Genre, Philosophy and Literature, and Literature and Theology.