Reversing the Decline of Enrollment in U.S. Higher Education


As futurist and educator Bryan Alexander has reported, enrollment in U.S. institutions of higher education has been declining for a while now.

People responding to this trend tend to fall into a few predictable categories, in my experience:

  • the yawners who couldn’t care less
  • the indignant who reflexively respond with anecdotes such as “my buddy so-and-so has no degree, started a business and is doing splendidly”
  • the anti-elitists who, for some reason, use the conversation to start advocating for more vocational education and trade jobs (even though enrollment has also declined at community colleges, which provide a lot of the vocational training in the U.S.)
  • the graduates who claim they learned little of practical use in college
  • the people who bemoan the rising costs of higher education in the U.S., indicating the costs aren’t worth the returns
  • and, finally, the people who see the declines as worrisome, even ominous

Although I tend to fall into this last category, I understand the various criticisms of higher education. I graduated with a liberal arts degree that, my father said after graduation, would get me a cup of coffee if I also happened to be carrying around a nickel (these days, of course, I’d need about three bucks).

He had a point. I wasn’t exactly in high demand in the job market upon graduation. My professors told me I should go to graduate school and become, of course, a professor. Maybe I should have, but at the time I had ideas about not wanting to be a prisoner of the Ivory Tower. Also, the idea that I’d become a liberal arts prof convincing kids to get degrees in liberal arts had a bit of a Ponzi scheme feel to it.

So, I’ve carved out my own path in the world. It hasn’t always been easy, but I’ve done okay. In this, of course, I’m far from alone.

Higher Education Still Pays

There’s no doubt that a post-secondary education can be expensive these days. In fact, the average total cost of attending a public university as an in-state student is $27,330 per year, and attending a private university can sock you for $55,800 per year.

Ouch! That’s a lot of debt to take on if you’re going it alone.

Still, college continues to pay dividends for most people. The median salary for workers with high school diplomas is $38,792 compared to $64,896 for those with a bachelor’s degree. And college graduates are less likely to be unemployed.
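
To put that earnings gap in rough perspective, here’s a deliberately crude back-of-envelope sketch. It assumes flat median salaries over a 40-year career and ignores raises, taxes, tuition and earnings forgone while in school, so treat it as an illustration rather than an estimate:

    # Back-of-envelope only: flat salaries, 40-year career, no raises, taxes or tuition.
    median_hs_salary = 38_792   # median salary, high school diploma (figure cited above)
    median_ba_salary = 64_896   # median salary, bachelor's degree (figure cited above)
    career_years = 40           # assumed working life

    annual_premium = median_ba_salary - median_hs_salary
    lifetime_premium = annual_premium * career_years

    print(f"Annual premium:   ${annual_premium:,}")    # $26,104
    print(f"Lifetime premium: ${lifetime_premium:,}")  # $1,044,160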

Then there are the many intangible benefits that are seldom discussed, things such as an improved understanding of the world, a greater appreciation for well-reasoned arguments, a stronger disposition to read, less gullibility in the area of conspiracy theories, and an enriching love of the arts, literature and science.

I’m not saying this is true for all college graduates, and I know non-graduates who are wiser and more erudite than I’ll ever be. Still, on average, there are a lot of unquantifiable benefits associated with a post-secondary degree.

College Grads Also Improve the Larger Society

But college pays at more than just the individual level. It offers high rates of return for society in general.

Higher earnings mean a richer society overall, as long as wealth isn’t too highly concentrated in the hands of a few. (I won’t delve into the arcana of Gini coefficients here.) And better education tends to result in higher productivity, lower crime rates, more volunteer work and better health.

More college graduates also mean that any given state is more likely to enjoy economic success. And, of course, more college graduates ultimately mean a higher GDP per capita in the whole of the U.S., which a lot of people associate with higher overall living standards in global comparisons.

So What Do We Do?

Although I’m sure there are various reasons for the decline in enrollment in post-secondary degree programs in the U.S., my guess is that the primary one is rising costs. People do not wish to take on crippling amounts of student debt just as they’re starting out in life, or don’t feel they can afford to.

Indeed, a study by the National Center for Education Statistics found that high schoolers are much more likely to go to college if they believe their families can afford it.

How to respond to this problem has become a hot political topic. In fact, the Biden administration originally had a plan to make community college tuition-free for two years, although the proposal was ultimately stripped from the federal Build Back Better bill.

There are, of course, various states that provide free college tuition to some students based on income and merit, and there are some with very few eligibility requirements. In addition, there are 17 tuition-free colleges in the U.S.

But there’s nothing at the kind of scale we need. So, here is the beginning of an idea. Let’s provide a government-financed free online university program that is fully accredited and has no eligibility requirements. This can almost certainly be done in a way that is far less expensive than other initiatives aimed at making higher education affordable.

The program doesn’t need to touch the rest of the messy education system in the United States. Public and private universities can go on being their current dysfunctional selves while we create this one great national project that vastly opens up the educational space without driving anyone into debt.

There are probably lots of ways this could be managed. The program could, for example, contract with many of the best professors in the world to create its online courses. Or, someone could curate the best existing online courses from other universities and incorporate them into a national curriculum. Or, there could be some mix of both systems with some others added in. For example, there could be vetted, open-source elements, as there are with open-source software.

The ultimate goal, however, would be the same: at relatively little taxpayer expense, use massive economies of scale to provide a very good free higher education to all interested U.S. students.

Eventually, it might even incorporate stipends to give students at least some of the financial support they need to spend time getting their degrees.

Aren’t There Already Online Degrees?

Yes, there are already accredited degrees that can be earned online. However, these are currently a hard-to-navigate hodgepodge in which costs are high for specific university programs, ranging anywhere from $300 to $1,000+ per credit. At the roughly 120 credits needed for a typical bachelor’s degree, that works out to somewhere between $36,000 and well over $120,000 in tuition alone.

In addition, there is, to my knowledge, already one free, accredited online university: the University of the People. I don’t know much about that institution, but there may be lessons, or even courses, that a national program could leverage.

Ultimately, though, this national university would need to be prominently supported and well branded by the government so that it is not viewed as some sort of inferior offering.

(And, yes, there will be those who swear by traditional, residential programs as far superior educational experiences. Maybe they are. But they are also far more expensive and so less realistic to make free to the public. We need to be both practical AND ambitious here.)

The Problems

Getting It to Happen

There are two primary problems I see, and probably many more I don’t. The first is at the creation and implementation phases. As happens in our healthcare system, private and even public entities would cry foul, saying the government is competing with the marketplace. This criticism from moneyed interests might be joined by many professors and university staff members, worried that their livelihoods are being threatened.

In America, at least, these dynamics tend to be the death knell for bold, innovative and potentially impactful initiatives. For the most part, the U.S. doesn’t know how to do such things anymore.

But something like this should happen for the sake of our country and its citizens, providing good educations at low cost to people who can’t afford to go into hock for the rest of their lives. Such a program might completely turn around the trend of declining university enrollment, and it would bring a massive productivity and financial boost to the nation.

Ensuring It’s Not Used as a Political Tool

Even if we could implement it, however, it could be crippled by politicians and bureaucracies. This is the danger of any national program. Those who constantly warn about the evils of “socialism” (look, we already have a mixed economy, and that’s not going to change) have a point in this case.

Some politicians will no doubt bloviate against any perceived “liberal” ideas or “right-wing” viewpoints and so want to micromanage professors, courseware and curricula. The last thing anyone needs is education micromanaged by amoral or misguided politicians, as we already see happening at the state level.

So, this program won’t be worth implementing unless there can be a firewall protecting it against political fools and demagogues. In this, I think we might learn from entities such as the Federal Reserve. Although an instrument of the U.S. government, the Fed is independent because its policy decisions do not have to be approved by the President or by anyone else in the executive or legislative branches. Moreover, it does not receive funding appropriated by Congress, and the terms of the members of its board of governors span multiple presidential and congressional terms.

A national online university initiative could not directly imitate the Fed system, of course. These are two very different entities. But we might borrow some ideas in order to protect the system against unwarranted political interference.

In the End

Of course, this may not happen no matter how much it should. Today, America seems chronically handicapped by special interests, demagoguery, fundamentalism and myopic, fact-impervious voting blocs.

But perhaps the pendulum is still capable of swinging back someday, spurring a renaissance of integrity, visionary thinking and can-do-ism. Maybe someday we’ll again see growth in the proportion of Americans getting good educations. I may be jaded in these excruciating political times, but even in me hope springs eternal.

Featured image: The University of Bologna in Italy, founded in 1088, is often regarded as the world's oldest university in continuous operation. Photo: Wiki Loves Monuments Italia 2020, via Wikimedia Commons.

Do You Treat Employees Like Fixed-Program Computers?

Computers didn’t always work the way they do today. The first ones were what we now call “fixed-program computers,” meaning that, without some serious and complex adjustments, they could do only one type of computation.

Sometimes that type of computer was superbly useful, such as when breaking Nazi codes during World War II (see the bombe below). Still, they weren’t much more programmable than a calculator, which is a kind of modern-day fixed-program computer.

The brilliant mathematician John von Neumann and his colleagues had a different vision of what a computer should be. To be specific, they had Alan Turing’s vision of a “universal computing machine,” a theoretical machine that the genius Turing dreamt up in 1936. Without going into specifics, let’s just say that the von Neumann model used an architecture that has been very influential up to the present day.

One of the biggest advantages of Turing/von Neumann computers is that multiple programs can be stored in them, allowing them to do many different things depending on which programs are running.
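
To make the distinction concrete, here’s a toy sketch in Python. It isn’t a faithful model of the von Neumann architecture, just an illustration of the difference between a device that can do only the one thing it was built for and one whose behavior depends on which program happens to be loaded into its memory:

    # A fixed-program "machine": changing what it does would mean rebuilding it.
    def fixed_program_adder(a, b):
        return a + b

    # A stored-program "machine": its behavior depends on what is loaded into memory.
    class StoredProgramMachine:
        def __init__(self):
            self.memory = {}  # programs live alongside data

        def load(self, name, program):
            self.memory[name] = program

        def run(self, name, *args):
            return self.memory[name](*args)

    machine = StoredProgramMachine()
    machine.load("add", lambda a, b: a + b)
    machine.load("multiply", lambda a, b: a * b)

    print(fixed_program_adder(2, 3))      # 5, and that's all it will ever do
    print(machine.run("add", 2, 3))       # 5
    print(machine.run("multiply", 2, 3))  # 6, same machine, different stored program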

Von Neumann architecture: Wikimedia

Today’s employers clearly see the advantage of stored-program computers. Yet I’d argue that many treat their employees and applicants more like the fixed-program computers of yesteryear. That is, firms make a lot of hiring decisions based more on what people know when they walk in the door than on their ability to acquire new learning. These days, experts are well paid largely because of the “fixed” knowledge and capabilities they have. Most bright people just out of college, however, don’t have that fixed knowledge and so are viewed as less valuable assets.

Employers aren’t entirely in the wrong here. It’s a lot easier to load a new software package onto a modern computer than it is to train an employee who lacks the proper skill sets. It takes money and time for workers to develop expertise, resources that employers don’t want to “waste” on training.

But there’s also an irony here: human beings are the fastest-learning animals (or machines, for that matter) in the history of, well, the universe, as far as we know. People are born to learn (we aren’t designated Homo sapiens sapiens for nothing), and we tend to pick things up quickly.

What’s more, there’s a half-life to existing knowledge and techniques in most professions. An experienced doctor may misdiagnose a patient simply because his or her knowledge about certain symptoms or treatments is out of date. The same concept applies to all kinds of employees, but especially to professionals such as engineers, scientists, lawyers and doctors. In other words, it applies to a lot of the people who earn the largest salaries in the corporate world.

Samuel Arbesman, author of The Half-Life of Facts: Why Everything We Know Has an Expiration Date, stated in a TEDx video, “Overall, we know how knowledge grows, and just as we know how knowledge grows, so too do we know how knowledge becomes overturned.” Yet in their recruitment and training policies, firms often act as if we don’t know this.
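
The “half-life” framing can be made concrete with a simple exponential-decay sketch. The 45-year half-life below is a hypothetical number chosen purely for illustration (it is not a figure from Arbesman’s book):

    # Fraction of what someone learned that has not yet been revised or overturned,
    # assuming knowledge "decays" exponentially with a hypothetical 45-year half-life.
    def knowledge_still_current(years_since_training, half_life_years=45.0):
        return 0.5 ** (years_since_training / half_life_years)

    for years in (5, 15, 30):
        print(f"{years} years out: ~{knowledge_still_current(years):.0%} still current")
    # 5 years out: ~93%; 15 years out: ~79%; 30 years out: ~63%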

The only antidote to the shortening half-life of skills is more learning, whether it’s formal, informal or (preferably) both. And the only antidote to a lack of experience is giving people experience, or at least a good facsimile of experience, as in simulation-based learning.

The problem of treating people like fixed-program computers is part of a larger skills-shortage mythology. In his book Why Good People Can’t Get Jobs, Prof. Peter Cappelli pointed to three driving factors behind the skills myth. A Washington Post article sums up:

Cappelli points to many employers’ unwillingness to pay market wages, their dependence on tightly calibrated software programs that screen out qualified candidates, and their ignorance about the lost opportunities when jobs remain unfilled…”Organizations typically have very good data on the costs of their operations—they can tell you to the penny how much each employee costs them,” Cappelli writes, “but most have little if any idea of the [economic or financial] value each employee contributes to the organization.” If more employers could see the opportunity cost of not having, say, a qualified engineer in place on an oil rig, or a mobile-device programmer ready to implement a new business idea, they’d be more likely to fill that open job with a less-than-perfect candidate and offer them on-the-job training.

The fixed-program mentality should increasingly become a relic of the past. Today, we know more than ever about how to provide good training to people, and we have a growing range of new technologies and paradigms, such as game-based learning, extended enterprise elearning systems, mobile learning and massive open online courses (aka MOOCs).

A squad of soldiers learns communication and decision-making skills during virtual missions: Wikimedia

With such technologies, it’s become possible for employers to train future applicants even before they apply for a position. For example, a company that needs more employees trained in a specific set of programming languages could work with a provider to build online courses that teach those languages. Or they could potentially provide such training themselves via extended enterprise learning management systems.

The point is that there are more learning options today than ever before. We live in a new age in which smart corporations will be able to adopt a learning paradigm closer to that of stored-program computers, the same paradigm they’ve trusted their technologies to for over half a century.

Featured image: A rebuild of a British Bombe located at the Bletchley Park museum. Wikimedia Commons.