Decades of cultural pressure, soaring debt, and misaligned priorities have flooded the market with degrees while starving the trades — and left millions underemployed, overleveraged, and unprepared for the realities of work.
By Willie Costa & Vinh Vuong
For millions of Americans, decades of single-minded focus have created a massive disconnect between labor supply and demand, between the promises of a better future and the harsh realities of an uncaring market, and between dreams of future upward mobility and the debilitating effects of an unfeeling present. While a college degree was once a golden ticket to upward mobility, higher wages, and long-term career stability — in other words, to status, a concept to which we will return later — the constraints of mounting student debt, a saturated labor force, and rising rates of unemployment and underemployment are indicators that the labor market has fundamentally changed. The data over the past few years are disheartening: in the United States there are currently approximately 70,000 chiropractors, roughly one million physicians, and 42,000 optometrists. There are approximately 5.38 million people employed in engineering fields, and another 8.4 million or so employed in finance and insurance fields. Yet there are over 19 million college students, many of whom will never even consider entering these fields or others like them. With such a discrepancy between those employed in traditionally “prestigious” fields versus the total number of students — and some schools growing so large that commencement takes multiple days — one can be forgiven for asking whether college is still worth the expense.
Supply and demand would argue “maybe,” and it’s completely dependent on the choices a potential borrower makes.
The rise of “college for all”
America’s obsession with college is a fairly recent development. In the 1960s, for instance, about 25% of Congress had no college degree, and the general public was less concerned with sending their children to college and more concerned with the Vietnam War and the ever-looming threat of Soviet attack. Examinations and audits of the nation’s educational system were (and remain) fairly common, beginning with the Truman Report in 1947 and continuing to the present day. The Truman Report (officially titled Higher Education for American Democracy) was notable not only for its size (nearly 500 pages), but because it was the first time in US history that a sitting president established a commission for the express purpose of analyzing the country’s educational system — a task that had previously been left to states under the Tenth Amendment. The Truman Report established what would eventually come to be known as “college for all” — more appropriately thought of as education for all — by calling for community colleges for “all youth who can profit from such education.” Several subsequent reports featured titles as grandiose as Truman’s, such as George W. Bush’s A Test of Leadership, published by the Spellings Commission.
But in 1983 a report published by the fledgling Department of Education (which was only four years old at the time) changed everything. Titled A Nation at Risk and published by the National Commission on Excellence in Education, it fanned the flames of concern that American schools were failing — a sentiment frequently embraced to this day — and touched off a nationwide panic and several decades-long reform efforts. The central thesis of the report was sound: the country’s educational system was failing to meet the need for a competitive workforce. Yet one of the report’s central pillars was fundamentally flawed: Yvonne Larsen, vice-chairman of the commission, and commission member Gerald Holton have both stated that they were trying to confirm pre-existing concerns about the educational system rather than conducting an objective analysis of the state of American schools. But what the National Commission on Excellence in Education failed to embrace in terms of rigorous research methodology it more than made up for in marketing, couching its arguments in the military language of the time and combining them with the bootstrap mentality of mid-century Americana to produce such memorable quips as:
If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war. As it stands, we have allowed this to happen to ourselves.
And with such laconicism, a star was born.
The need to increase human capital in order to defeat the Soviets became paramount, and education became a matter of national security: bipartisan, sacred, beyond reproach, and deserving of nearly any budgetary request. In many ways A Nation at Risk should be praised for its efficacy: by enshrining the divinity of education within the temple of national security, America set out to become the most highly-educated nation on earth, reminiscent of the scientific and engineering push that characterized the Space Race (and which is still paralleled today). Over six million copies of A Nation at Risk were printed during the first year alone, and the reforms it inspired are still ongoing.
Presidents from both sides of the aisle have rushed to embrace the universality of a college education, but it must be noted that this embrace is a recent development. During the dawn of the American republic, the president’s role in education policy was virtually nonexistent. Some presidents, such as Thomas Jefferson, expressly argued that a strong educational system was fundamental for a strong nation, but constitutional concerns (specifically the Tenth Amendment: The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people) often served to limit presidential involvement. Presidential leadership on the matter, when it was provided at all, was often a low priority: education was a strictly state and local matter, and its purpose in the country’s early history was predominantly civic. After all, the founders had declared independence and the newborn nation had fought one of the most powerful empires of the time to secure it. Presidential interest in what was predominantly a state and local problem remained muted until the rapid technological developments of World War II: these advancements not only promised enormous economic benefit but also created, for the first time in human history, the very real possibility of instantaneous and permanent annihilation. And as technology continued to advance, the country was left with a choice: invest massively in maintaining leadership on the technological battlefield, or be relegated to the ash heap of history. This marked the true turning point of educational concerns at the national level.
Over time the definition of human capital narrowed from “education” in general to “college” specifically. Technical skills fell out of favor, to be replaced with general education classes, standardization, testing, and the classroom-to-debt pipeline of student loans. The underlying assumption that college paid for itself — the so-called “college wage gap,” to be discussed later — became the closest thing possible to a free lunch. And if there’s one thing that’s uniquely and undeniably American, it’s the notion that more is better: “college for all” was born in earnest, and college went from being a path to success to being the only path to success. The importance of a college education, in fact, is one of the few things that Americans of all political stripes can seem to agree on: by 2012, 94% of parents expected their kids to go to college. Moreover, support for a college education across the entire political spectrum came as close as humanly possible to near-perfect agreement: 99% of Republicans expected their children to go to college, as did 96% of Democrats and 93% of independents.
It’s not hard to see why this was the case, given the decades of technological advancement, globalization, etc. College became the nation’s guarantee that there would always be a place in the middle class for our children. High school guidance counselors pushed entire generations of students toward college, yet the irony is that guidance counselors themselves often know surprisingly little about it: while counselors are typically required to earn a master’s degree in counseling, graduate programs rarely offer a single class in college planning, leaving counselors severely underprepared on the very subject they are expected to advise. So, through no fault of their own, the very people entrusted to guide children toward the only “correct” postsecondary option are often themselves ignorant of its particulars. This knowledge gap is scandalous, and damaging to children during an especially vulnerable time in their lives.
What of the millions of tradespeople in the US? By way of example, consider the elevator and escalator installer and repairer, a trade with a median salary of $106,000 and projected to grow 6% between 2023–2033 (this is faster than the national average). From the standpoint of median salaries, this career is more lucrative than an architect ($96,000), a mechanical engineer ($99,000), or an accountant ($81,000); yet despite the high salary there remains a shortfall of qualified personnel. The US currently has about one million elevators, and Americans travel over 2.5 billion miles per year on elevators and escalators. Many of these are aging and are time-consuming and costly to fix, and the technicians who know how are aging out of the workforce. Downtime can be costly, effectively rendering the upper levels of tall buildings useless until repairs can be made (and nearly 1.1 million Americans end up in the emergency room every year due to stair-related incidents, per a 2017 study). In some cases, those hardest hit are those who can least bear it, such as when both elevators at an affordable housing complex in Fort Lauderdale were out of service for more than a month. Fewer qualified personnel means longer wait times for repairs, causing construction delays.
Yet tradespeople — even those in high-paying professions such as elevator repair technicians and crane operators — still tell their children to pursue a four-year degree more often than not. The shortages across various industries — especially construction, which recently experienced one of the highest shortages ever recorded — are thus a foregone conclusion. Unfortunately the shortages are likely to get worse: 27% of skilled trades workers are within ten years of retirement, and once those workers retire their expertise, book of business, brand, and productivity will all retire with them. The effects may seem beneath notice until one recalls that many of these trades — including plumbers, electricians, carpenters, welders, and the proverbial elevator repair technician — are required for modern society to even exist.
This issue of shortages must also be examined in light of inherited jobs, viz. children who pursue the same (or similar) career path as their parents. While some professions have particularly strong dynastic tendencies (actors and race car drivers come to mind, exemplified by the Barrymore and Andretti clans, respectively), children in general are more likely to pursue careers similar to those of their parents. Variations exist depending on the gender of the parent(s) and the child, of course, but in some instances the likelihood of inheriting a career path is truly staggering. For instance, daughters are 362 times more likely to become a commercial fisher if their father was in the industry, but “only” 281 times more likely to follow their mother along the path of becoming a military officer; sons are more than 400 times more likely to become a textile machine operator like their fathers, but only 191 times as likely to become paralegals like their mothers. For some professions, the self-selection career bias is truly noteworthy; for instance, while doctors make up approximately 0.3% of the US population, one study found that nearly 30% of medical school students had at least one parent who was a physician (though it should be noted that these students were not more likely to succeed than their peers, highlighting the importance of hard work and grit regardless of career path). These are not simply hollow statistics; they highlight a strong information asymmetry: if there are so many specific careers (especially trades) that are in high demand, why aren’t more people going into them?
The answer is simple: they don’t know. Inherited careers constrain the free flow of information to the outside world regarding the benefits and opportunities of those jobs, and a general social disdain for the trades chokes that flow of information to a trickle. For trades, this lack of information is doubly tragic given the high rates of job satisfaction: 87% of tradespeople say they would choose the same career over again, 95% feel secure in their job, and 93% feel positively about career opportunities. By comparison, national-level statistics among all workers show that only about 50% are “extremely satisfied” with their job, and only 3 in 10 who are not self-employed say they’re satisfied with their pay and opportunities for advancement.
“College for all” was effective not only because it was a catchy slogan, but because it imbued the issue of postsecondary education with an aura of class, status, and morality. When college became an end in itself, it became a luxury good, and many of the rituals inherent therein reached heightened levels of desirability among the masses. After all, college was one of the most important formative phases of middle class life: it was where one would make lifelong friends or meet their future spouse, or it was where one “discovered themselves” or attained some other such vaguely-defined concept of personhood. Academia became not only a means to an educational end but a culturally desirable bildungsroman of transitioning to adulthood. Even though the increasing number of graduates reduced the financial reward of degrees, this association stuck. The irony is that many formerly “high-status” jobs became so poorly paid that only the wealthy could afford to take them. Even physicians — arguably one of the highest-status of all high-status jobs — have not escaped the realities of economics: while the average salary of a doctor is about $350,000 (and varies wildly by specialty and locale), when compared to inflation the pay of Medicare physicians — arguably the doctors who most frequently serve those Americans most in need of medical care — has spent the past twenty-plus years declining.
The desirability implicit in “college for all” had the additional effect of marking those who did not attend as a social Other, an outcast who did not conform to societal norms and was thus unworthy of acceptance. This stigma would often serve to brand — in the equine sense — those who did not attend college as lower-class failures doomed to a life of want. And the trades? Those were for the troubled kids who had few other options. College might no longer be the golden road to riches it once was, but parents still pushed their children along the conventional educational path — and the children were, by and large, themselves propelled along it by the fear of a lifetime of poverty and an overwhelming motivation to avoid that fate. If nothing else, the children would be able to escape the Otherness of not attending college… all the while being reminded that the decisions they made would create consequences they would spend the rest of their lives living with.
That’s an enormous amount of pressure to put on a teenager whose brain has not yet even fully developed.
With so much emphasis placed on problem-solving and critical thinking skills, it’s nearly comical to see graduates hit a proverbial wall once they have spent years attaining the imposed cultural norm of a four-year degree, are unable to find meaningful work, and find themselves frozen — professionally, psychologically, financially, and emotionally — at the very moment they desperately need to make use of those very same problem-solving skills they spent their undergraduate career developing. At least, it would be comical, except that this very real phenomenon often causes the graduate to spiral down a path of deteriorating mental health and poor economic decision-making. Most glaring (and disheartening) is the number of college graduates who, being unable to find meaningful employment, fail to recognize that nothing prevents them from pursuing a trade after graduation. After all, minimum training requirements are just that — minimums — and having a four-year degree is hardly a disqualifying factor for any job. In fact, pursuing a trade after college could actually let one earn more than colleagues in the trades who do not possess a degree, through raises and promotions that may come sooner or more frequently.
This is the ultimate irony for some: attending college in order to enter a trade could actually leave the graduate better off than if they had pursued either of those paths independently of the other. And the rewards can be immense: a signal and track repairer, for instance — one of the hundreds of trades hidden from the layperson due to the aforementioned lack of information — makes a median annual wage of $82,710; this is approximately 50% higher than the median annual wage of the entire liberal arts field.
This contradiction in capability versus opportunity suggests there is more at work than purely financial matters: blue-collar work — even highly-paid, union-protected jobs — can be seen as a tacit admission that one’s college years were “wasted” after untold amounts of effort were spent working to get through college in order to “make it,” to say nothing of the effort exerted and sacrifices made to get into college in the first place. Instead of making the rational decision to pivot and seek high-paying work where it may be found, many graduates avoid unemployment by settling for underemployment, accepting low-paid, low-skill work instead of seizing opportunities in the trades. While this is illogical — not to mention detrimental to one’s wallet — it at least has the appearance of being temporary. Pivoting to a trade, by contrast, revives the blue-collar stigma since it involves a longer-term effort in terms of certifications, apprenticeships, and the like. There’s also the question of student loan repayment, which oftentimes means the graduate needs money today, rather than after another round of certification or training. The attendant feelings of disengagement or disenfranchisement, while unproductive, are at least understandable. In short, this hesitation to pivot toward available opportunities is more of a psychological problem than an economic one: service and gig work have been normalized as a stopgap to make ends meet, and in a very twisted way are seen as somehow more desirable than changing career paths and admitting that college was not the guaranteed path to success that so many profess it to be.
Of course none of the fears, qualms, or misgivings regarding career-switching to a trade after college are supported by any shred of verifiable data, but fears are by their very nature irrational.
Predictably, this mindset (combined with other economic factors) leads to several important milestones being delayed, including saving for retirement, buying a home, marriage, and having children. In what is perhaps a case of sour grapes, 61% of Americans say the traditional milestones are no longer important. Even the most basic of milestones — moving out — seems like a farce in the present day. More than 50% of young adults in the US live with their parents, a figure that has doubled over the past two decades and is at the highest level since the Great Depression. About one-third of Americans live with at least one parent. And while some are quick to blame covid, it must be noted that these trends existed prior to the Great Global Pause of 2020 and persist today. More than half of Gen Z adults have expressed concerns over cost of living, according to a 2024 survey from Bank of America; worse, a significant number of millennials and Gen Z adults lack emergency savings, putting them one unforeseen expense away from severe economic damage.
When compared to bankruptcy or the psychological toll of having to continually hover over a bottomless chasm of financial ruin, punching a time card and carrying a toolbox for work doesn’t seem too bad…
Academic decisions and their market consequences
Colleges have swelled with students, and over the past couple of decades enrollment figures have increased at rates rarely seen outside of Silicon Valley valuations. The most immediate effect of increased enrollment is simple and easy to understand: larger class sizes. In the early 2000s, some institutions — such as the University of Colorado — had class sizes that were truly monstrous: thirty-three courses with at least 400 students each, and three with more than 1,200. While these are sometimes broken into sections, one chemistry course was so large that the only place where all students could take the final exam at the same time was the Coors Event Center — the basketball arena.
Do these enormous class sizes have an impact on information retention? After all, the entire point of a college education is to provide students with retainable knowledge from a variety of fields (including the oft-maligned general education classes, unpopular at best and considered a deadweight loss at worst), in the hope that the end product — the proverbial “complete man” — will be better equipped to contribute to labor productivity. That is, after all, the promise of college: the opportunity for advancement and financial success, which is why ranking schools by ROI has become quite popular. But research into the efficacy of large class sizes has been ongoing for over 100 years, and most of those results associate large classes with a lack of interpersonal interaction, weak student engagement, decreased learning, and ineffective interactions. Some research has found that when it comes to not simply teaching facts (which can be regurgitated on exams via rote memorization, and are likely forgotten soon afterward) but conveying problem-solving approaches to students (i.e. teaching those skills that will be of utmost importance in most careers), large classes can result in students thinking less like professionals after completing the course than before they began it.
In other words, there’s a potential for large classes to do more harm than good, especially when it comes to learning outcomes.
These drawbacks can be mitigated via techniques that help such classes “feel” small, but those results are instructor-dependent and difficult to scale. This is exacerbated by the fact that education is a people-intensive business: some 60% of all university budgets go toward paying the salaries and benefits of those who teach, conduct research, support students and professors, and make the institutions operate like the well-oiled machines they theoretically should be. Personnel shortages will vary by discipline, of course — some, like humanities, are more impacted due to declining enrollment and an oversupply of Ph.D.s — but the labor shortages of the broader economy are still present in higher education, where three of every ten workers are aged 55 or older. By comparison, the nationwide average in 2023 was closer to one in five.
These effects are still felt by students after they graduate. For instance, about 37% of workers have at least a bachelor’s degree, but only 178 of the 830 occupations tracked by the Bureau of Labor Statistics require one. In other words, 37% of workers carry a degree that only about 21% of tracked occupations require. Predictably, this leads to an oversupply of labor and an increased difficulty in finding a job: a full 52% of graduates with a bachelor’s degree end up underemployed within one year after graduation, meaning they are unwillingly working in low-skill and low-paying jobs (or only working part-time) because they cannot get full-time jobs that use their skills. Ten years after graduation, that figure is still 45%.
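The arithmetic behind these figures is straightforward; here is a quick sketch using only the numbers quoted above (shares are rounded, and this is an illustration rather than a labor-market model):

```python
# Rough arithmetic on the degree-oversupply figures quoted above.
# All inputs come from the text; shares are rounded for illustration.

workers_with_degree = 0.37       # share of workers holding at least a bachelor's
occupations_requiring = 178      # BLS-tracked occupations requiring one
occupations_total = 830          # total BLS-tracked occupations

share_requiring = occupations_requiring / occupations_total
print(f"Occupations requiring a degree: {share_requiring:.1%}")   # about 21%

# Underemployment persistence: 52% one year out, 45% ten years out
persistence = 0.45 / 0.52
print(f"Ten-year rate as a share of the one-year rate: {persistence:.0%}")
```

The gap between the 37% of workers holding degrees and the roughly 21% of occupations that require them is the oversupply the paragraph describes.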
While headlines focus on the unemployment rate (almost exclusively on U3, ignoring the much more insightful and complete U6), a key measure of labor market momentum is the hires rate, which tracks a month’s hires as a share of overall employment. This metric has fallen to 2014 levels, while in 2024 the overall pace of hiring in professional and business services (a go-to for a large proportion of young graduates) was down to levels not seen since the Great Financial Crisis. This slowdown in hiring is likely only going to get worse in the age of AI, which threatens to automate 60% of administrative tasks and poses a significant threat to bookkeeping, financial modeling, basic data analysis, paralegal work, graphic design, and basic journalism. Even software development jobs — the longstanding gold standard of high-earning knowledge work — could see 40% of programming tasks automated by 2040. Yes, this widespread automation will mean increased productivity. It will also mean fewer jobs, since productivity by definition is calculated as output divided by input: automating tasks with technology increases productivity by shrinking the denominator, much as it has for the past several thousand years.
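That denominator effect can be made concrete. The sketch below uses hypothetical output and labor figures (only the 60% automation share comes from the text) to show how automating tasks raises measured productivity without producing any additional output:

```python
# Productivity = output / input. Automation shrinks the input (labor hours)
# while output stays constant, so measured productivity rises even though
# total output has not changed. All figures below are hypothetical.

output_units = 1_000          # e.g., reports produced per month
hours_before = 500            # labor hours before automation
automated_share = 0.60        # the 60% of administrative tasks cited above

hours_after = hours_before * (1 - automated_share)   # 200 hours remain

productivity_before = output_units / hours_before    # 2.0 units per hour
productivity_after = output_units / hours_after      # 5.0 units per hour

print(productivity_before, productivity_after)
```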
This mismatch between the number of jobs requiring degrees and the number of students pursuing them leads to some amusing (if unfortunate) results: for instance, in 2012 there were more recent college graduates working as cashiers than as mechanical engineers. Unsurprisingly, the rates of unemployment vary by degree obtained, but when combined with broader market data some disconnects become obvious. The most apparent of these disconnects is the prevalence of “entry-level” jobs that require years of experience (and traditionally did not require a degree), which seems illogical until one considers that when supply and demand are out of balance, some very unwelcome shifts in pricing power — and ultimately, all hiring decisions are pricing decisions — will inevitably result.
This is economics in its most raw form: college degrees were historically associated with specialized knowledge and (virtually) guaranteed access to better opportunities because they were rare. A decades-long surge of college graduates has done little but oversaturate the market, diluting the value of the degrees and turning what was once an advantage into table stakes. But while the market value of a degree (i.e. median earnings) may have decreased over the years, the cost of obtaining that degree certainly hasn’t: one year of tuition in 1963 cost about $4,949 in inflation-adjusted dollars; by 2022, that cost was roughly $14,960 — an increase of over 200%. It must be noted that this change has not been linear: in the past twenty years, college tuition has doubled, yet real median household income has barely increased 20%.
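The tuition figures above can be checked directly:

```python
# Checking the inflation-adjusted tuition growth quoted above.
tuition_1963 = 4_949
tuition_2022 = 14_960

increase = (tuition_2022 - tuition_1963) / tuition_1963
print(f"Real tuition increase, 1963-2022: {increase:.0%}")    # just over 200%

# Over the past two decades: tuition roughly doubled while real median
# household income grew only about 20%.
print(f"Tuition outpaced income growth by {2.0 / 1.2:.2f}x")
```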
This is a hallmark of diminishing returns per dollar invested… or, in the case of college tuition, diminishing returns per dollar borrowed, which creates an altogether different set of risks and constraints.
The obvious observation is that this disconnect between degrees and skill demand need not remain the de facto standard. For instance, approximately 52% of all available jobs are considered “middle-skilled,” meaning some training is required beyond high school, but a bachelor’s degree (or higher) is not. Most of these jobs, unsurprisingly, are trades, and in nearly every industry there is a shortage of skilled workers able to meet demand: this includes crane operators, court reporters, radiologic technologists, and UPS drivers — the sex symbol du jour of the working world, with pay and benefits in some cases reaching $170,000. Many of the most in-demand jobs are also unionized, which provides better wage protection than might otherwise be had.
In a rational market, these unmet demands would not exist: demand increases, word gets out, student enrollment at universities drops in favor of trade schools, and the number of workers in these high-demand fields increases. This is to say that the labor market would ideally be self-correcting. Unfortunately, the labor market is currently far from rational, and has been for several decades. College enrollment did actually decrease during the pandemic, but this was not indicative of a broader trend. According to a Harvard study, the market for middle-skills jobs “is consistently failing to clear.” That’s economics-speak for a demand that remains unmet in spite of seemingly endless efforts to address it. This issue is caused by an anomaly that’s often discussed but remains unresolved: millions of aspiring workers have remained unemployed or underemployed, yet employers across several industries find it hard to fill open positions.
The obvious beneficiaries of this supply-demand mismatch are those who work in the very trades that are most in demand, leading to what the Wall Street Journal calls “America’s new millionaire class” — the skilled trades that millions of would-be-gainfully-employed-workers overlook. Not only is the training for these jobs obtained rapidly and at a lower cost than traditional degrees, the time horizon between day-one training and day-one earning is often significantly shorter. That means that many in the trades graduate with less education debt, are able to begin paying it back sooner, and oftentimes are in a much better financial position ten years into their careers.
Some of the issues within the broader labor market are directly attributable to intentional hiring decisions: nearly 40% of employers avoid hiring recent college graduates in favor of older workers, regardless of qualifications. Is this “fair,” in the strictest sense of the word? Perhaps not, but given some of the issues with professionalism and preparedness (yet another skill set that is seemingly left untaught on college campuses or anywhere else), some of these managers may rightly be afforded a bit of slack. But the time-worn arguments regarding the youngest generation’s laziness and unwillingness to work are nothing but intellectual shorthand for explaining away a stubborn problem. Such arguments are usually recognized by their choral prelude of “the kids these days” (a woeful sentiment that dates back to the ancient Greeks) and a complete lack of corroborating evidence. Not only are these half-witted attempts at career neonaticide false on their face, they are also inconsistent with the reality that the labor market is fundamentally broken, and a misallocation of educational priorities is what broke it.
There are innumerable benefits of attending college, regardless of the cost, that have absolutely nothing to do with finding a job after graduation: developing friendships, building or enhancing a sense of community, participating in sports, etc. If nothing else, college exposes students to a range of ideologies and forces them to learn how to interact civilly with those whose views they may find egregious or repulsive, and this in and of itself has value in that it teaches the fundamentals of what’s required to live in a pluralistic society. Some would argue that college has value because it’s explicitly divorced from the labor market: for students, it’s the last sanctuary before being subjected to market forces, but it’s also where research and learning are encouraged for their own sake.
This message has clearly taken hold. The percentage of Americans with college degrees has steadily increased over the past several decades, and close to 48% of all people aged 25 and older now possess some type of college degree; Census Bureau data provide some noteworthy breakdowns. While schools in the Ivy League are glorified and routinely dominate discussions regarding college enrollments and rankings, in reality they enroll less than 1% of all students nationwide; 70% of students go to public or less-selective private schools. Some schools — including some very prestigious ones — are making a concerted effort to expand opportunity without unduly burdening students. The University of Central Florida, for instance, guarantees admission for students who possess an associate’s or articulated degree from a partner college through its DirectConnect program. Stanford has decided to go one step further and ensure that students from families earning less than $100,000 do not pay for tuition, housing, or food; students from families earning less than $150,000 can still benefit from tuition being covered. Given that the expected cost of attending Stanford for the 2024–2025 school year is nearly $100,000, these efforts are certainly commendable. While these programs are obviously not universal, they demonstrate that at least some schools are aware of the affordability and cost-benefit concerns of a college education and are taking meaningful steps toward a solution.
The reality of the college wage premium
Invariably, whenever the question of whether college is “worth it” is broached, the multitude are quick to point out the earnings differential between those who graduated from college and those who did not. Various statistics exist on the matter — often called the “college wage premium” — and are quoted in lockstep like a mechanical symphony. Data from College Transitions, linked above, show that college graduates are 3.5 times less likely to fall below the poverty level and enjoy a median income that is $36,000 higher than peers who forego degrees. And data from the Federal Reserve certainly support this argument.
Over the span of a career, this difference in earning power can amount to a sizeable difference: a 2018 analysis by economist Douglas Webber suggests that the college wage premium can be on the order of $900,000 over a person’s career, whereas data from the Georgetown University Center on Education and the Workforce place that figure closer to $2.8 million. This seems like a foolproof investment even if one’s actual wage premium amounts to only half of these figures: the more you learn, the more you earn. And this wage gap has only increased over time.
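To see how the studies' figures bracket a simple back-of-the-envelope estimate, here is a minimal sketch assuming a constant $36,000 annual premium (the median difference cited above), a 40-year career, and a discount rate left as a free parameter; these are illustrative assumptions, not the methodology of either study:

```python
# Back-of-the-envelope lifetime wage premium, discounted to present value.
# Illustrative assumptions only: a constant $36,000 annual premium,
# a 40-year career, and a range of real discount rates.

def lifetime_premium(annual_premium: float, years: int, discount_rate: float) -> float:
    """Present value of a constant annual wage premium over a career."""
    return sum(annual_premium / (1 + discount_rate) ** t for t in range(1, years + 1))

for rate in (0.00, 0.03, 0.05):
    pv = lifetime_premium(36_000, 40, rate)
    print(f"discount rate {rate:.0%}: ~${pv:,.0f}")
```

Undiscounted, the stream totals $1.44 million, squarely between the Webber and Georgetown figures; any realistic discounting or spell of unemployment pulls the number down quickly, which is why published estimates vary so widely.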
There are, of course, several assumptions baked into the wage premium that are often ignored in favor of the brighter parts of the “college for all” narrative. The first and most glaringly obvious of these is that the student graduates from college to begin with. According to a twelve-year longitudinal study by the National Center for Education Statistics (which tracked the postsecondary outcomes of 23,000 representatively sampled students), 40% of college students never earn a degree or credential within eight years of beginning to take classes. By the time the study concluded, some 74% of the students tracked had enrolled in college, which was 10% lower than a previous iteration of the study. Outcomes were strongly correlated with family income, and 52% of students from the poorest families (those with household incomes below $35,000 annually) had not enrolled in college at all by the time the study concluded. Whether certain students will ever attend college is an open question, as both dropout rates and graduation rates are strongly correlated with race, gender, and income: the odds of betterment via traditional education paths drop for poor, non-white, and part-time students. Overall, nearly a quarter of students take longer than four years to earn a bachelor’s degree.
Graduation rates from first institution attended for first-time, full-time bachelor’s degree-seeking students, by race/ethnicity and time to completion: cohort entry year 2010. Source: National Center for Education Statistics.
It should be noted that “some college” is barely worth anything — even earning 99.999[…]% of the credits required for a bachelor’s degree is not enough to get one, and therefore not enough to reap the benefits of the wage premium — and it’s safe to assume that the students who graduate high school but never attend college (or who drop out) have heard the exact same “college for all” proselytizing as the rest of the nation. Therefore, this is not an issue of messaging. The students who attend college but do not graduate (or do not graduate within a “reasonable” amount of time) are the real hidden costs of the “college for all” mentality: they become burdened with student loan debt — about $38,000 on average — without the benefit of an increase in earning potential. Their ability to absorb that financial hit depends largely on household income, but those most likely to drop out are also most likely to have low household income to begin with. When one considers the numerous sources that show the odds of graduating from college are little better than a coin toss, higher education becomes the most expensive detour of many students’ lives.
The earnings of graduates must also be examined closely. As with all distributions, outliers can skew the data: in this case, high earners skew the wage premium. A study from The HEA Group of nearly 3,900 college graduates found that for 40% of college students, the wage benefits were either a wash or actually negative once the cost of attending college is taken into consideration. And while high earners are very well off — and some even fulfill their wildest dreams of success, turning their degrees into generational wealth in tech hubs and the crucible of entrepreneurship — for most students, “college for all” only serves to make poor kids poorer.
This isn’t to say that there’s anything “wrong” with college, only that the system of rewards and punishments meant to encourage students toward betterment is fundamentally broken. In the words of Harvard professor Michael J. Sandel:
In practice, most colleges and universities do less to expand opportunity than to consolidate privilege. For those who look to higher education as the primary vehicle of opportunity, this is sobering news.
That’s a scathing condemnation of the higher education system if there ever was one, especially coming from a highly respected professor at the nation’s most esteemed university. But note that Sandel does not criticize college as an institution or an idea, because there’s nothing fundamentally wrong with the concept of college itself: Sandel is instead criticizing the ineptitude and complacency that have lured millions of young Americans into a debt spiral from which they might never escape. As with most things in life, it’s in the execution of “good ideas” that the fatal flaws and assumptions are made apparent, and this misguided execution ensures that the best intentions only serve to pave a road into Hell.
In reality, the value of college depends on the person. The wage premium is an average, and because no one person is exactly average, the wage premium does not tell the entire story. Amid the disheartening realities of degree inflation and the pervasiveness of low-skill and gig work exacerbating the shortage of middle-skill workers (ironically, making the pay for those trades continue to rise far above what most college graduates earn), one thing that “college for all” conveniently overlooks is that not all majors are created equal. Much like managerial hiring decisions, this is another topic that does not seem “fair” but is still the reality of the market at large. It also highlights the inherent weakness of the argument that college is required for success: certain jobs will be rewarded more richly (literally) than others. This complicates the decision of how, when, and if one should go to college by introducing the question of what to study for those who do go.
Once more, data come to the rescue. The choice of college major is an economic one as much as it is one of likes and preferences. Most data will show that certain fields of study — engineering, business, finance, computer science, and the like — have consistently higher median salaries. Yes, the adage of “you can be anything you want to be” holds true: several successful entrepreneurs have backgrounds that are far removed from the industries that made them successful — Brian Chesky, co-founder of Airbnb, was an industrial designer; Evan Sharp, co-founder of Pinterest, has a master’s degree in architecture — but their degrees didn’t make them successful, their adaptability and work ethic did. But most college students will not become entrepreneurs — only 19% of American adults are actively engaged in starting or running a new business — and therefore the economic realities of earning potential must be considered when making decisions about postsecondary goals. And that means college must be evaluated in terms of return on investment, just as with any other financial decision.
In fairness to the successors of the civil servants who created the “college for all” issue in the first place, it would appear that officials are at least dimly aware of the problem: parts of the Inflation Reduction Act, the Bipartisan Infrastructure Law, and the CHIPS and Science Act contain provisions aimed specifically at creating middle-skill jobs. A report from the University of Massachusetts Amherst Political Economy Research Institute (PERI), commissioned by the National Skills Coalition and BlueGreen Alliance, found that these laws would create and support nearly three million jobs per year over their lifetime. Many such jobs would be considered middle-skill positions. Roughly two-thirds of that job creation was expected to take place in the construction and manufacturing sectors, and 69% of jobs created by these three legislative efforts would be available to workers without a bachelor’s degree, compared with 59% of the jobs in the US workforce as a whole.
Granted, legislative action is infamously impermanent: laws can sunset, or their intended effects can be undone by subsequent legislation once an opposing party holds the congressional majority. But doing nothing is not an option. The labor force participation rate (the number of people who are either working or actively seeking work, as a percentage of the working-age population; it does not count students, retirees, or discouraged workers) shows cause for concern among various groups, if not outright alarm. Put simply, the participation rate has been steadily decreasing for decades, with one exception: workers aged 55+, whose participation has actually increased since 2000 and has held relatively constant at nearly 40%.
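Since the definition of the participation rate matters for the argument that follows, here is a minimal sketch of the calculation; all figures are made up purely for illustration:

```python
# Labor force participation rate: (employed + actively job-seeking) as a
# share of the working-age population. Students, retirees, and discouraged
# workers sit outside the labor force. All numbers below are invented.

employed = 160
unemployed_seeking = 7        # jobless but actively looking: still in the labor force
discouraged = 5               # jobless and not looking: excluded from the labor force
students_and_retirees = 78    # also excluded from the labor force

working_age_population = employed + unemployed_seeking + discouraged + students_and_retirees
labor_force = employed + unemployed_seeking

lfpr = labor_force / working_age_population
print(f"participation rate: {lfpr:.1%}")
```

Note the subtlety the article relies on: a worker who gives up looking for a job leaves the labor force entirely, so rising discouragement lowers the participation rate without ever showing up in the unemployment rate.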
This decline in participation rate cannot simply be attributed to a lack of available jobs: while it’s true that job openings took a tumble after the 2001 and 2008 recessions, since 2009 nonfarm job openings steadily increased until covid. And even though job openings have declined steadily since their 2022 high, the behaviors of job openings and participation rates do not seem to correlate meaningfully.
Part of this discrepancy may be explained by the aforementioned definition of the labor force participation rate: it does not count students, retirees, or discouraged workers. While it’s true that people tend to go back to school during recessions, the number of discouraged workers is the inverse of this behavior, and increases during recessions. The problem is that, since covid, the number of discouraged workers has not leveled off or resumed a downward trend: it decreased briefly after the official end of the pandemic, but has subsequently hovered near the same level. Does this mean that an increasing number of workers are becoming irreparably dissatisfied with their career potential and are choosing to simply detach from the workforce?
Given the aforementioned data on young adults returning to live with one or both of their parents, and that the new “dream job” for young men is apparently being a stay-at-home son, some data support this disconnection hypothesis.
The student loan time bomb
One of our more popular pieces examined “The Everything Bubble,” or how decades of irresponsible risk management and an addiction to easy money have fueled a speculative bubble across multiple asset classes. The growing level of student loan debt is absolutely a part of that problem. While the average college student leaves higher learning with $38,000 in debt (and obviously with a much higher burden if they attend grad school), at the macro level the numbers are truly staggering: not only are Americans carrying more than $1.61 trillion in student loan debt (about 32% of all non-mortgage/non-HELOC debt), more than five million student loans are in default, and another four million could be in the same position very soon. All told, fewer than 40% of borrowers are current, and nearly one-fourth of the federal student loan portfolio is in default.
This is very concerning for a debt category that’s grown over 570% from 2003 through the first half of 2025.
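To put that growth in context, the implied compound annual growth rate can be backed out with two lines of arithmetic; this sketch assumes “grown over 570%” means the balance ended at 6.7 times its 2003 level, over the roughly 22.5 years through the first half of 2025:

```python
# Implied compound annual growth rate (CAGR) of aggregate student loan debt.
# Assumption (illustrative): "+570% growth" is read as the balance ending at
# 6.7x its starting level over ~22.5 years (2003 through mid-2025).

growth_multiple = 1 + 5.70     # a 570% increase means 6.7x the original balance
years = 2025.5 - 2003          # ~22.5 years

cagr = growth_multiple ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")
```

That works out to high single digits per year, every year, for over two decades, which is faster than wages, inflation, or most households' ability to repay have grown over the same period.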
And things have already begun to get worse: not only are student loans effectively non-dischargeable in bankruptcy (discharge requires filing a separate action known as an adversary proceeding, which creditors can of course challenge), but as of May 2025 garnishments have begun in earnest for borrowers in default. Student loans also have no statute of limitations, meaning a default today can haunt a borrower’s credit for years, sometimes decades, derailing a future car loan or mortgage application, and collections can begin at any time until the loans are brought current.
A chart from the New York Fed illustrates that the expected jump in delinquencies became a reality after the loan moratorium ended. Because no federal student loan was referred to collections for more than four years beginning in March 2020, the post-moratorium spike in defaults is likely to have downstream effects on borrowers. After all, student loan delinquencies are reported to credit bureaus, and the resulting downgrade in credit score (a default can drop a credit score by up to 200 points) carries the aforementioned consequences for a borrower’s creditworthiness at subsequent life milestones. That three-digit number is not something to be taken lightly.
In some ways this time bomb has already started to explode, dropping more than two million borrowers into subprime territory, with over one million losing at least 150 points on their credit scores. This means that more than two million borrowers have suddenly found themselves unable to qualify for conventional car loans, mortgages, or credit cards; these will likely be the newest additions to the debt spiral that typifies subprime borrowers. All told, more than $63 billion in consumer spending could vanish from the economy.
By the end of May 2025, the Department of Education announced it had collected more than $100 million in student loan debt. More collections will likely follow. This news, of course, must be considered in light of rising delinquencies across multiple debt classes. When combined with the abysmal personal saving rate of Americans, it’s not difficult to see how an unforeseen expense can lead to a cascade of failures. And with millions of “fresh” credit hits thanks to student loan collections, the probability of those failures is becoming ever more relevant.
Prior to resuming collections, the Department of Education had explicitly stated, “There will not be any mass loan forgiveness.” While the subject of student loan forgiveness is certainly a hot-button issue — and, unsurprisingly, one mired in politics — this is one aspect of the problem that is either overlooked or marginalized. The reality is that “mass loan forgiveness” programs already exist. In certain situations — such as being a teacher, a government employee, or a medical professional — borrowers can have their federal student loans forgiven, canceled, or discharged (functionally, these three terms mean the same thing). There’s also the Public Service Loan Forgiveness program for those employed by a government or nonprofit organization. The tragedy is that the existence and benefits of these programs remain downplayed during conversations about the student loan crisis, illustrating that once again efforts notionally intended to “solve” the student loan crisis focus on the wrong elements of it.
Are these existing loan forgiveness programs ideal? Many would argue no: for starters, PSLF requires 120 qualifying monthly payments under an accepted repayment program, meaning the borrower has already carried the loan burden for at least ten years before they qualify for relief. There’s also the matter of loan forgiveness programs applying only to specific professions, rather than being a generalized path out of student loan debt. But ignoring, for just a moment, the serious moral and ethical implications of students taking on loans they know they will likely be unable (or unwilling) to repay, there’s simple economics at play: debt only has value to the borrower when it can be leveraged for returns greater than the cost of capital. If this is not possible, debt only benefits the lender. For many majors, and therefore many students, this analysis is likely going to result in a negative NPV.
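The economics described above can be made concrete with a toy net-present-value calculation; every input here (cost of attendance, wage premium by major, discount rate, career length) is a hypothetical placeholder, not a figure from any study cited in this piece:

```python
# Toy NPV of a degree: an upfront cost (tuition plus foregone wages) versus
# a discounted stream of annual wage premiums. All numbers are hypothetical.

def degree_npv(total_cost: float, annual_premium: float,
               years: int = 40, discount_rate: float = 0.05) -> float:
    """NPV = -cost today + discounted sum of future wage premiums."""
    benefits = sum(annual_premium / (1 + discount_rate) ** t
                   for t in range(1, years + 1))
    return benefits - total_cost

# A high-premium major can carry a strongly positive NPV...
print(degree_npv(total_cost=150_000, annual_premium=30_000))
# ...while a low-premium major at the same school may never break even.
print(degree_npv(total_cost=150_000, annual_premium=5_000))
```

The point of the exercise is not the specific numbers but the structure: the same school at the same price can be an excellent investment or a guaranteed loss depending entirely on the premium the chosen major commands.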
Putting aside the “college for all” narrative and the narrow-minded focus on “degrees at all costs” that it has engendered, the student loan burden has shown no signs of subsiding. In fact, quite the opposite has occurred: college has gotten more expensive, many degrees have become less valuable, and for many borrowers the burden has extended into middle age. Some borrowers fixated so completely on escaping their debt that they now lament the investment returns they missed while repaying it (a textbook opportunity cost). Others have simply given up.
So is college worth it?
This is a heavily loaded question, and the only realistic answer is “it depends.” Popular podcaster Joe Rogan called student loans a “scam” and “the dirtiest thing ever.” While this outlook might be a bit reactionary, it certainly encapsulates the mindset of millions of Americans, and maybe even those who didn’t fall into a debt spiral because of student loans.
College can absolutely be “worth it” for those students who pick a degree with a solid ROI at a reasonably affordable school (this will likely mean an in-state public university). Of course, students are free — and should be free — to pursue whatever education they’d like at whatever school accepts them, but pursuing one’s dreams while ignoring reality does no one any favors. For low- or negative-ROI degrees at expensive schools, the outcomes are virtually guaranteed to be bleak. For starters, two-year liberal arts degrees are literally worthless from an economic perspective; 23% of bachelor’s programs have a negative ROI, as do roughly half of master’s programs. More than 20% of doctoral and professional degrees don’t pay off, either. Some college choices are so financially terrible that the school itself has a negative ROI.
To be fair, there has been some positive legislative traction regarding this problem: Senators Katie Britt and Bill Cassidy are among several sponsors of the recently reintroduced College Transparency Act, which aims to ensure that students and families have better and more transparent information as they consider taking on a literally life-changing level of debt in the pursuit of higher education. This would include modernizing the college reporting system for post-secondary data, including enrollment, completion, and post-college earnings across multiple schools and fields of study. To sum it up:
The College Transparency Act would help bridge that gap so potential students and their families are better able to navigate these significant decisions and are well-positioned to achieve their American Dream.
Seems fairly reasonable, since it’s physically impossible to make an informed decision without information. And since every student is unique, efforts like this should promote better outcomes not only for students, but also for the employers seeking hires best suited to a particular field.
The easy “answer” to this problem, and the one most often quoted — “stop going to college and go to trade school” — is deluded and overly reductive. Yes, for some, trade school is absolutely the answer: Gen-Z men especially are skipping college and going straight into the trades amid a gender-based divergence in unemployment rates (7% of college-educated American men are unemployed, compared to 4% of women, partly due to the growth in fields like healthcare). Between 2011 and 2022 the overall number of young college students declined by about 1.2 million, but this figure is sharply divided along gender lines: about one million fewer men enrolled in college, versus 200,000 fewer women. Swapping a college degree for a trade certificate is a trend that even some billionaires have suggested will be a growing part of the future. This is especially true for students who are “good with their hands” or are unlikely to thrive in a traditional college environment. But this simplistic thinking misunderstands the problem and ignores that Americans have spent decades being spoon-fed one very narrow definition of success, as an overcorrection from the days when only the very wealthy attended college. It also ignores the reality that education can (and does) occur outside of the classroom. And decrying higher education in toto is willful ignorance at its worst, the sort of head-in-the-sand mentality whose adherents, in a just world, would find themselves receiving emergency medical care from self-taught doctors and living in homes or high-rises designed by self-educated engineers. Solving the problem requires fixing the current issues without overcorrecting again in the opposite direction: college is not the problem, “college for all” is.
This means that trade schools are a solution, not the solution, because one-size-fits-all is how the country ended up in this mess to begin with. “College for all” has gone unchallenged because it’s very profitable (especially for the schools, which have virtually unlimited pricing power) and seemingly well-intentioned… even if an honest accounting of the economic value of college would discourage attendance by the very students least likely to benefit from it. Blue-collar work has become so stigmatized that it’s avoided even by those who might otherwise find it fulfilling, and many parents would consider their children lazy and irrational if they were to look into trades instead of universities. In reality, society has different roles that must be filled, and there must be a shift in how work is valued. All jobs are valuable and have meaning — people are free to decry plumbers as lazy and uneducated until the toilet clogs at midnight — and the end result of a proper educational policy should be to maximize freedom of career choice based on personal tastes and goals, creating a system where all educational options (especially traditional colleges and universities) are forced to face real economic competition for enrollment.
One need look no further than China to see the logical end of the “college for all” mentality: the unemployment rate for people aged 16–24 reached a record high of 21.3% in June 2023, as scores of students work extremely hard to get a good education only to end up unemployed. In China, trade schools are even more under-funded than in the US and the stigma against blue-collar work is even stronger, but the bill eventually comes due in terms of unemployment and underemployment. Things will likely get worse for China as the effects of the country’s notorious one-child policy are felt in the coming decades. Perhaps the same militaristic language that made A Nation at Risk so successful can be resurrected to analyze this problem methodically in the face of what could potentially be our next national enemy.
The goal is not to remove the opportunity to attend college, but the social obligation to attend and the stigma of refusing to do so. The market has already spoken, and eagerly rewards the middle-skill positions that many Americans overlook. Higher education enjoys unlimited pricing power because it must only compete with itself, thanks to decades of social pressures deciding that college was the only “smart” choice. Ironically, it’s education itself that can solve this problem, albeit not at the college level: half of US adults are financially illiterate (in addition to the ~20% of US adults — roughly 43 million — who have low language literacy to begin with), and this ignorance has cost Americans more than $243 billion in 2024 alone. Even Gen Z, who have risen through the education ranks during a time when the true scope of the financial literacy problem was well-known, are not immune: fewer than 50% of them take finance courses in high school. The extent of the problem is truly awful:
- two states have zero financial literacy requirements at all for public school students;
- six lack consistent financial literacy standards;
- only eleven offer financial literacy in each K-12 grade, including a standalone personal finance course for high school graduation and financial literacy standards for K-8 grade levels.
Given how inextricably connected finance and education happen to be, refusing to give our children a basic understanding of how finance works — before shipping them off to gorge themselves on student loans — seems disingenuous at best. In essence, we have spent decades condemning our children to a lifetime of overwhelming student loan debt for want of a simple analysis of functional requirements: if college is not the “right” decision for a particular student, it’s a moral and social (and yes, economic) imperative that they have properly funded educational options available to them. This is as much of a personal decision as it is a policy one, and the government has a history of being deaf, at least occasionally, to the cries of the populace. Much of the solution requires a fundamental shift in social consciousness that is unfortunately not a numerical affair: yes, a liberal arts degree from Harvard is extremely impressive, and the holder of such a degree is deserving of praise for its attainment, but that does not and should not guarantee a high-earning career unless the graduate has other marketable skills. It’s in the development of those “other marketable skills” that many attempts to fix the situation fall by the wayside: college is a piece of the puzzle, not a skeleton key designed to unlock all barriers to success.
In an attempt to force-feed the belief that college is the golden path toward making “the complete man,” we have lost sight of what makes that proverbial “complete man” complete to begin with.
Student loan debt burdens Americans more than nearly any other type of debt, trailing only mortgages, HELOCs, and auto loans. Approximately 3.6 million Americans have student loan debts in excess of $100,000, yet more than half of graduates don’t end up working in their field of study. If that seems illogical, that’s because it is: we’ve told millions of young Americans that going into debt for an education is a “good thing,” yet the numbers prove that by and large this simply isn’t the case. And while certain legislative efforts seek to close the door on forbearance and tighten the reins of repayment, one sentiment expressed by Joe Rogan (and undoubtedly felt by many others) becomes undeniably clear:
You’re too young to be connected to a $50,000 debt when you’re 18. You don’t know what it means […] The fact that it’s going to follow you around forever and haunt you… I think it’s evil.
The time for polite conversation is over. We are watching a generation drown in debt while the labor market gasps for the skills our schools no longer teach. This is not an accident — it is the predictable outcome of a system designed to funnel public funding, private loans, and personal ambition into an overpriced, underperforming higher-education monopoly. Lawmakers, school boards, and higher-ed leaders have allowed this imbalance to metastasize for decades, and every year they fail to act is another year we rob our children of economic mobility.
We need a legislative mandate that rewrites the playbook. First, require full ROI transparency for every degree program in the country, so students know exactly what their debt will buy them before they sign. Second, make financial literacy a K-12 graduation requirement in all fifty states, equipping every student to weigh the cost of college against alternatives. Third, restore parity in funding between four-year universities and vocational programs, ensuring trade schools and apprenticeships have the resources to compete for talent. Fourth, tie federal education funding to labor-market alignment: if a program consistently produces graduates who cannot find work in their field, it should not receive taxpayer subsidies.
We must also take a page from the world’s most competitive economies — nations like South Korea, Singapore, and Japan — where skill development and career exploration are embedded in education from an early age. These countries integrate aptitude assessments, practical skill-building, and personalized career guidance throughout primary and secondary school, giving students a clear view of their strengths and potential pathways long before graduation. This early, deliberate investment not only boosts job placement rates, it ensures that by the time students finish school, they are prepared to thrive — whether they choose college, trade school, or a direct path into the workforce.
We cannot keep selling the next generation a false promise. The American Dream is not a diploma — it is the freedom to build a life without being shackled to debt for the privilege of working a job that never required the degree in the first place. If Congress, states, and school districts fail to make this pivot now, the result will not just be wasted potential — it will be the permanent erosion of our skilled workforce, our economic resilience, and our claim to being a land of opportunity.