Who Cashed Our Productivity Paychecks?

Does higher labor productivity raise people’s living standards? The conventional wisdom is still “What, are you kidding? Of course it does!” But the evidence for that claim is pretty sketchy and has been for a while now. So, let’s do a little myth-busting as we explore the so-called productivity-pay gap.

Investopedia nicely sums up the standard line on productivity: “The level of productivity is the most fundamental and important factor determining the standard of living. Raising it allows people to get what they want faster or get more in the same amount of time. Supply rises with productivity, which decreases real prices and increases real wages.”

You can find the same basic claim all over the place, from The Library of Economics and Liberty to McKinsey to Forbes.

There’s just one little problem, of course: the data indicates it’s not true, at least not in the way it’s usually explained.

We’re A Lot More Productive, But Not Much Richer

In the U.S., productivity has been going up for many years. In fact, it rose a little faster between 2019 and 2022 than it did the previous 12 years. Have a look at this data from the Bureau of Labor Statistics (BLS):

Productivity Change in the Nonfarm Business Sector, 1947-2022

Productivity hasn’t grown as quickly over the last 15 years as it did over the previous 17. But from 2019 to 2022, it was still growing at a rate similar to the one it managed from 1973 to 1990. Overall, despite an occasional dip here and there, there’s been steady growth.

Sure, there’s plenty of room for economists to complain, but consider the fact that labor productivity more than doubled between 1979 and 2022!
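
As a quick back-of-the-envelope check (a sketch, not BLS methodology), a doubling over those 43 years implies a surprisingly modest compound annual growth rate:

```python
# Rough compound annual growth rate implied by a doubling of
# labor productivity over the 43 years from 1979 to 2022.
years = 2022 - 1979            # 43 years
growth_factor = 2.0            # "more than doubled"
annual_rate = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {annual_rate:.2%}")  # about 1.6% per year
```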

So, if it’s true that “productivity is the most fundamental and important factor determining the standard of living,” then surely our standard of living also doubled in that same time period, right?

The Productivity-Pay Gap

Well, no, not by a long shot. But the answer requires some nuance. After all, there’s no single clear definition of “standard of living,” and productivity itself comes in various flavors. Let’s stick with labor productivity, which compares growth in output to growth in hours worked, and let’s use inflation-adjusted compensation as a more measurable stand-in for standard of living.
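
To make that definition concrete, here’s a minimal sketch with invented index numbers (not actual BLS figures) showing how labor productivity growth compares output growth to growth in hours worked:

```python
# Labor productivity is output per hour worked. These index numbers
# are invented for illustration; real BLS series are more involved.
output_index_start, output_index_end = 100.0, 160.0  # real output
hours_index_start, hours_index_end = 100.0, 115.0    # hours worked

productivity_start = output_index_start / hours_index_start
productivity_end = output_index_end / hours_index_end
growth = productivity_end / productivity_start - 1
print(f"Labor productivity growth: {growth:.1%}")  # about 39.1%
```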

Here’s what we get, according to the Economic Policy Institute:

The takeaway, of course, is that productivity and compensation rose pretty much in parallel until the early 1980s and then split off from one another. Since then, productivity has risen 3.7 times faster than compensation!

Which suggests that something’s wrong with the whole conceit, and with the claims of so many trusted sources that the two rise in virtual tandem despite solid evidence to the contrary.

How Do We Explain What Happened?

So, how can we explain the productivity-pay gap? There are various theories, but here are three that, while not necessarily contradictory, stress different facets of the gap.

Theory 1: Policymakers Tore Out the Coupling

The EPI itself, which has a somewhat left-leaning orientation, explains it like this: “Starting in the late 1970s policymakers began dismantling all the policy bulwarks helping to ensure that typical workers’ wages grew with productivity. Excess unemployment was tolerated to keep any chance of inflation in check. Raises in the federal minimum wage became smaller and rarer. Labor law failed to keep pace with growing employer hostility toward unions. Tax rates on top incomes were lowered. And anti-worker deregulatory pushes—from the deregulation of the trucking and airline industries to the retreat of anti-trust policy to the dismantling of financial regulations and more—succeeded again and again.”

In other words, the government allowed the system to get misaligned. Let’s use the metaphor of a coupling. In machinery, a coupling is a device for joining two rotating shafts at their ends so as to transmit torque from one to the other. The goal, of course, is to transmit power fairly evenly. In the coupling of productivity and compensation, however, things fell badly out of whack. One shaft kept spinning like a champ while the other started moving in slow-mo. If the economy were a machine, we’d send it to the shop.

Theory 2: We’re Not Measuring It Right

Another theory is that the productivity-pay gap is real but maybe not quite as large as the consumer-price-indexed compensation rates suggest. The BLS provides the following chart.

In this graph, the bottom dotted line is compensation adjusted using the consumer price index, but the light blue line above that is compensation that’s adjusted using something called the output price index, which is arguably more accurate. The authors of the article “Understanding the labor productivity and compensation gap” explain:

Workers are compensated based on the value of goods and services produced, not on what they consume. Using an output price deflator, a measure of changes in prices for producers, instead of the CPI is an alternative that better aligns what is produced to the compensation that workers receive. Each industry has its own unique output deflator that matches the goods and services that are produced in that industry.

By using these “deflators” for a variety of industries, they find that the size of the productivity-compensation gap “decreased in 87% of industries that previously showed productivity rising faster than compensation.”

To be clear, the gap doesn’t go away if you use this technique, but it does shrink in most industries.
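
Here’s a rough sketch of the mechanics. The growth factors below are hypothetical, but the technique is the one the BLS authors describe: deflate the same nominal compensation series once with the CPI and once with an output price index, then compare:

```python
# Deflate the same nominal compensation series two ways and compare.
# All growth factors below are hypothetical, for illustration only.
nominal_comp_growth = 2.10       # nominal compensation rose 110%
cpi_growth = 1.80                # consumer prices rose 80%
output_deflator_growth = 1.55    # producer/output prices rose 55%

real_comp_cpi = nominal_comp_growth / cpi_growth - 1
real_comp_output = nominal_comp_growth / output_deflator_growth - 1

print(f"Real compensation growth, CPI-deflated:    {real_comp_cpi:.1%}")     # ~16.7%
print(f"Real compensation growth, output-deflated: {real_comp_output:.1%}")  # ~35.5%
# Output prices typically rise more slowly than consumer prices, so the
# measured gap shrinks under the output deflator -- but it doesn't vanish.
```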

Theory 3: The Rich Got Most of the Pay Raise

The third and, to me, most convincing theory is that average folks had their productivity lunch eaten by their better-off brethren.

This is clear when you look at the work by economists such as Erik Brynjolfsson and Andrew McAfee of MIT. In their book Race Against the Machine, they comment on a graph that shows the amazing and growing disparity between real median household income and real GDP per capita (which is one measure of productivity). Below is a more up-to-date version of the one they point to in their book:

They call it “striking” and then make this observation:

There have been trillions of dollars of wealth created in recent decades, but most of it went to a relatively small share of the population. In fact, economist Ed Wolff found that over 100% of all the wealth increase in America between 1983 and 2009 accrued to the top 20% of households.  The other four-fifths of the population saw a net decrease in wealth over nearly 30 years.

Ouch. So, yes, the productivity paychecks are real. And they do raise the standard of living — but not for everybody. Or even most people.

Were Gains by the Rich Earned or Stolen?

Of course, this raises another question: “Did those folks at the top earn that paycheck, or steal it?”

If that’s incendiary phrasing, don’t blame me. Blame the purveyors of conventional wisdom mentioned above. The implication has always been that we all benefit from productivity increases, but, in practice, as Brynjolfsson and McAfee say, “There is no economic law that says that everyone, or even most people, automatically benefit from technological progress.”

Maybe that makes sense? Let’s say a bunch of tycoon types invest in robotics to boost the productivity of the average worker on the line of some manufacturing plant. After the inevitable layoffs, do the surviving employees divvy up the compensation of the people who were laid off, minus the cost of the machines?

Probably not. Instead, the benefits accrue to the investors and the senior managers (especially CEOs) who made the decision to invest in the robots. That is, the rich get considerably richer while the surviving workers only get a modest increase. And the folks who were laid off? How much of a cut do you think they’re getting?

Yeah, bupkis. Or, in many cases, they actually lose economic ground.

Multiply this dynamic many times over the course of decades, and median incomes stay flat while GDP per person (which is an average rather than a median) goes up.
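
A toy example makes the mean-versus-median point concrete. In this sketch (invented figures), every dollar of income gain goes to the top household, so the average rises while the median doesn’t budge:

```python
import statistics

# A five-household toy economy in which all income gains go to the
# top household. The figures are invented for illustration.
incomes_before = [30_000, 45_000, 55_000, 70_000, 200_000]
incomes_after = [30_000, 45_000, 55_000, 70_000, 900_000]

for label, incomes in (("before", incomes_before), ("after", incomes_after)):
    print(f"{label}: mean = ${statistics.mean(incomes):,.0f}, "
          f"median = ${statistics.median(incomes):,.0f}")
# The mean (a GDP-per-capita-style average) jumps from $80,000 to
# $220,000, while the median sits stubbornly at $55,000.
```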

So, to answer our question, “They earned it, kind of, sort of, in a way, if you squint hard enough and quash any human instinct for justice and fairness.”

But at least we now have a clue about where benefits of the productivity increases go. That is the beginning of wisdom — and a fine antidote to fiscal fairy tales.

Productivity Chickens Coming Home to Roost

Recently, there has been a decline in U.S. productivity. In fact, some analysts claim that the U.S. has now seen five consecutive quarters of year-over-year declines.

The big question is why. There’s lots of finger-pointing. Some high-profile CEOs blame lazy work-at-home employees for the decline. Others argue, to the contrary, that it’s return-to-office policies that are most strongly linked to the productivity declines.

There are plenty of other suspects as well. For example, many people switched jobs during the “great resignation” and stepped into roles where they had to learn the ropes before becoming fully productive again. Or there’s the rapid return of many employees to the workforce, a dynamic often associated with temporary dips in productivity.

There’s also the possibility that higher inflation — combined with pay increases that are insufficient to keep up with it — is simply demoralizing workers. Why should they work harder for smaller paychecks?

And, of course, there’s the idea that younger generations just aren’t as eager as their older baby boomer counterparts to keep their proverbial noses to the grindstone. It’s less that they’re “lazy” and more that they just aren’t as willing to put up with bossism and toxic workplaces.

CEOs Venting Their Spleen

Meanwhile, CEOs have been venting their spleen about declining productivity, so much so that it feels as if there’s a new “leaked video of a CEO having a meltdown each week,” writes AJ Hess in Fast Company.

On one hand, I get their frustration. Their job is, of course, to boost the performance of their organizations.

On the other hand, what makes these meltdowns both funny and sad is the extraordinary pay gap between typical employees and their bosses. For example, recent figures indicate that S&P 500 CEOs averaged $18.3 million in compensation in 2021. That’s a whopping 324 times the median worker’s pay!
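
For perspective, you can back out the implied median worker pay from those two figures (a rough check that ignores differences in how such ratios are calculated):

```python
# Back out the implied median worker pay from the figures above.
ceo_average_pay = 18_300_000
ceo_to_worker_ratio = 324
print(f"Implied median worker pay: ${ceo_average_pay / ceo_to_worker_ratio:,.0f}")
# ~ $56,481
```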

How did their pay get so exorbitant? Well, one answer is, of course, productivity. That is, they (and other upper-class Americans) have enjoyed the fruits of the productivity bumps of workers whose wages have largely stagnated over the last 40 years.

Which makes you wonder: If the typical worker had been receiving their full share of the benefits of productivity increases since the early 1980s, would we be in a position where “quiet quitting” was even a thing?

Maybe not. What we could be seeing is the productivity chickens come home to roost. If the rich get most of the monetary benefits of productivity increases, then let them do most of the work.

Or, at the very least, they — in partnership with the government — should stop whining and figure out a way to make productivity increases benefit everyone in their organizations, not just the investors and executives at the top.

Is Going Back to the Office the True Cause of the Decline in Worker Productivity?

It runs against the conventional wisdom, but Bureau of Labor Statistics data suggests that going back to the office is the true cause of the decline in worker productivity we’ve seen recently.

I was writing an article on long-term trends in U.S. productivity when I noticed that if you look at quarterly labor productivity data from the last few years, you see pretty solid productivity growth from 2020 through 2021 but then a hard dip in 2022.

I figured I couldn’t be the first person to draw the obvious conclusion that the return to office is pretty well correlated with a decline in worker productivity, and I was right.

Correlation Isn’t Causation, But…

It turns out Gleb Tsipursky, Ph.D., wrote an article about this for Fortune magazine back in February. He even put together a handy-dandy graph based on Bureau of Labor Statistics data.

As Tsipursky neatly sums it up: “U.S. productivity jumped in the second quarter of 2020 as offices closed, and stayed at a heightened level through 2021. Then, when companies started mandating a return to the office in early 2022, productivity dropped sharply in Q1 and Q2 of that year. Productivity recovered slightly in Q3 and Q4 as the productivity loss associated with the return to office mandate was absorbed by companies–but it never got back to the period when remote-capable employees worked from home.”
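
If you wanted to check that relationship yourself rather than eyeball a chart, a minimal sketch might look like the following. The quarterly numbers are placeholders, not the actual series Tsipursky used; the point is the technique:

```python
import statistics

# Hypothetical quarterly figures, 2020Q2 through 2022Q4 -- placeholders,
# NOT the actual BLS or survey series Tsipursky charted.
remote_share = [0.60, 0.55, 0.50, 0.48, 0.45, 0.42, 0.40, 0.33, 0.30, 0.29, 0.28]
productivity_growth = [4.0, 3.5, 3.0, 2.8, 2.5, 2.2, 2.0, -0.5, -1.0, -0.6, -0.4]

# Pearson correlation (requires Python 3.10+).
r = statistics.correlation(remote_share, productivity_growth)
print(f"Correlation: {r:.2f}")
# A strongly positive r is consistent with Tsipursky's reading, but it
# can't rule out confounders like inflation, rehiring, or job switching.
```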

Maybe There Are Other Reasons

Of course, correlation isn’t causation, and there may be other factors involved. For one thing, the pandemic meant that there were suddenly more people dropping (or being dropped) out of the workforce. In fact, the mini-recession we saw at the start of 2020 could help explain higher productivity numbers.

That’s because as employers shed more jobs, existing employees are forced to take on more work from their former colleagues. Also, new processes may be put in place to keep production relatively stable. Some have called this “cleansing out unproductive inputs,” which certainly sounds harsh but may have some element of truth, at least in the short run.

As the economy recovered, more people were hired back, which might help explain the decline in productivity figures.

Then There’s the Inflation Angle

Inflation demoralizes employees unless employers are matching inflation with increases in compensation. A survey from the HR Research Institute recently asked HR professionals, “What do you believe are your employees’ top five sources of financial stress?” The number one answer was “inflation issues,” cited by 62% of participants.

It makes sense that if employees feel they are doing the same work for less and less money each month, then they grow less satisfied, engaged and productive. And, although this problem is cumulative over time as employees lose their purchasing power, the highest spikes in inflation occurred in early 2022.

At the same time, employees were spending more money on gas as they started commuting back to their workplaces.
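
The arithmetic of that erosion is simple but brutal. Here’s a small sketch with illustrative numbers showing how a flat paycheck loses purchasing power over a year of sustained inflation:

```python
# Purchasing power of a flat $5,000 monthly paycheck under 8% annual
# inflation -- an illustrative rate, roughly in line with early 2022.
monthly_pay = 5_000.0
annual_inflation = 0.08
monthly_inflation = (1 + annual_inflation) ** (1 / 12) - 1

purchasing_power = monthly_pay
for _ in range(12):
    purchasing_power /= 1 + monthly_inflation
print(f"Real value after one year: ${purchasing_power:,.2f}")  # ~$4,629.63
```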

Still, Somebody Needs to Tell the CEOs

Whether the earlier increase in labor productivity was caused by more remote work, a sudden spate of downsizings, and/or other factors, the bottom line is that business leaders should be careful not to assume that bringing people back into offices will automatically make them more productive. In fact, if the loss of productivity is being caused or aggravated by higher inflation unmatched by higher compensation, then return-to-office mandates may make things worse rather than better.

Nonetheless, a lot of CEOs seem to think that a return-to-office program is the way to go. Make It reports, “While half of employers say flexible work arrangements have worked well for their companies, 33% who planned to adopt a permanent virtual or hybrid model have changed their minds from a year ago, according to a January 2023 report from Monster.”

Best Not to Mention the Dog

How CEOs communicate about their desire to get more employees back in the workplace can be a tricky proposition and can result in public relations nightmares if not done well. A case in point is James Clarke, CEO of Clearlink, who was reportedly “slammed on social media after he praised an employee for selling his family’s dog to be able to return to the company’s office.”

I know nothing about Clarke. Maybe he’s otherwise a terrific business leader. But bosses may want to rethink any allusions to dogs when it comes to return-to-office policies. Most Americans probably like their dogs way more than they like their fellow human beings, especially if those human beings are well-off CEOs forcing people back into the office on the perhaps faulty premise that it’ll boost productivity.

But Maybe It’s Not Even About Productivity

Of course, it could be that a lot of CEOs don’t really think it’s about productivity. Maybe it’s more about their own values and attitudes toward work and workers. Insider magazine quotes Joan Williams, the director of the Center for WorkLife Law at the University of California College of the Law: “These are men with very traditional views, who see the home as their wife’s domain and work as men’s domain. These are people like Elon Musk, for whom everything is a masculinity contest, and the workplace is the key arena. They have no desire to continue to work from home. This is not about workplace productivity. It’s about masculinity.”

So, some leaders prefer masculinity over employee performance? Is that the true cause of the decline in worker productivity?

Maybe.

The truth is, it’s complicated. I’m sure there are some female bosses who’d also like to see employees back in the workplace. And research from the HR Research Institute, where I work, shows that even a lot of HR professionals believe that their corporate cultures have suffered due to a massive move to remote work.

I imagine that a lot of this comes down to the specifics at any organization. Every company of significant size has its own complex ecosystem of culture, policies, work processes and management quality. Business leaders need to make the best decisions they can given all these variables.

But they should keep in mind that a lot of employees might actually be more rather than less productive at home. If that’s true, the guy bosses should put their masculinity aside for the good of the organization. Save it for the handball court, the golf course, the corporate suite at the stadium, or wherever they can let their testosterone (and views on dog ownership) flow unimpeded.

Featured image is from Awiseman, posted on Wikipedia at https://commons.wikimedia.org/wiki/File:Goofing_off_in_the_office.jpg

Do You Treat Employees Like Fixed-Program Computers?

When All Programs Were Fixed

Computers didn’t always work the way they do today. The first ones were what we now call “fixed-program computers,” which means that, without some serious and complex adjustments, they could do only one type of computation.

Sometimes that type of computer was superbly useful, such as when breaking Nazi codes during World War II (see the bombe below). Still, they weren’t much more programmable than a calculator, which is a kind of modern-day fixed-program computer.

Along Came John and Alan

The brilliant mathematician John von Neumann and his colleagues had a different vision of what a computer should be. To be specific, they had Alan Turing’s vision of a “universal computing machine,” a theoretical machine that the genius Turing dreamt up in 1936. Without going into specifics, let’s just say that the von Neumann model used an architecture that has been very influential up to the present day.

One of the biggest advantages associated with Turing/von Neumann computers is that multiple programs can be stored in them, allowing them to do many different things depending on which programs are running.

Von Neumann architecture: Wikimedia
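
To make the contrast concrete, here’s a minimal sketch in Python (my own toy illustration, not a model of any real machine). The “fixed-program” function can only ever do one computation; the “stored-program” machine treats programs as data it can load and run:

```python
# A "fixed-program" machine: the computation is baked in.
def fixed_adder(a, b):
    return a + b  # all this machine can ever do is add

# A toy "stored-program" machine: programs are data it loads and runs.
def stored_program_machine(program, a, b):
    ops = {
        "add": lambda x, y: x + y,
        "mul": lambda x, y: x * y,
        "max": max,
    }
    return ops[program](a, b)

print(fixed_adder(3, 4))                    # 7, and that's the whole repertoire
print(stored_program_machine("add", 3, 4))  # 7
print(stored_program_machine("mul", 3, 4))  # 12 -- new behavior, no rewiring
```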

Today’s employers clearly see the advantage of stored-program computers. Yet I’d argue that many treat their employees and applicants more like the fixed-program computers of yesteryear. That is, firms make a lot of hiring decisions based more on what people know when they walk in the door than on their ability to acquire new knowledge. These days, experts are well paid largely because of the “fixed” knowledge and capabilities they have. Most bright people just out of college, however, don’t have the same fixed knowledge and so are viewed as less valuable assets.

The Programmable Person

Employers aren’t entirely in the wrong here. It’s a lot easier to load a new software package into a modern computer than it is to train an employee who lacks the proper skill sets. It takes money and time for workers to develop expertise, resources that employers don’t want to “waste” on training.

But there’s also an irony here: human beings are the fastest-learning animals (or machines, for that matter) in the history of, well, the universe, as far as we know. People are born to learn (we aren’t designated Homo sapiens sapiens for nothing), and we tend to pick things up quickly.

The Half-Life of Knowledge

What’s more, there’s a half-life to existing knowledge and techniques in most professions. An experienced doctor may misdiagnose a patient simply because his or her knowledge about certain symptoms or treatments is out of date. The same concept applies to all kinds of employees but especially to professionals such as engineers, scientists, lawyers, and doctors. In other words, it applies to a lot of the people who earn the largest salaries in the corporate world.
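
The “half-life” metaphor is literal exponential decay. As a sketch (the 10-year half-life is an illustrative assumption, not an estimate for any particular field), the fraction of a profession’s knowledge still valid after t years looks like this:

```python
# Fraction of a body of knowledge still valid after t years, assuming
# exponential decay with a 10-year half-life (an illustrative figure,
# not Arbesman's estimate for any particular field).
HALF_LIFE_YEARS = 10.0

def fraction_still_valid(t_years, half_life=HALF_LIFE_YEARS):
    return 0.5 ** (t_years / half_life)

for t in (5, 10, 20, 30):
    print(f"After {t:>2} years: {fraction_still_valid(t):.0%} still valid")
# 71%, 50%, 25%, 12% -- hence the case for continuous learning.
```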

Samuel Arbesman, author of The Half-Life of Facts: Why Everything We Know Has an Expiration Date, stated in a TEDx video, “Overall, we know how knowledge grows, and just as we know how knowledge grows, so too do we know how knowledge becomes overturned.” Yet, in their recruitment and training policies, firms often act as if we don’t know this.

The only antidote to the shortening half-life of skills is more learning, whether it’s formal, informal or (preferably) both. And the only antidote to a lack of experience is giving people experience, or at least a good facsimile of experience, as in simulation-based learning.

The problem of treating people like fixed-program computers is part of a larger skills-shortage mythology. In his book Why Good People Can’t Get Jobs, Prof. Peter Cappelli pointed to three driving factors behind the skills myth. A Washington Post article sums up:

Cappelli points to many [employers’] unwillingness to pay market wages, their dependence on tightly calibrated software programs that screen out qualified candidates, and their ignorance about the lost opportunities when jobs remain unfilled…“Organizations typically have very good data on the costs of their operations—they can tell you to the penny how much each employee costs them,” Cappelli writes, “but most have little if any idea of the [economic or financial] value each employee contributes to the organization.” If more employers could see the opportunity cost of not having, say, a qualified engineer in place on an oil rig, or a mobile-device programmer ready to implement a new business idea, they’d be more likely to fill that open job with a less-than-perfect candidate and offer them on-the-job training.

Losing the Fixed-Program Mindset

The fixed-program mentality should increasingly become a relic of the past. Today, we know more than ever about how to provide good training to people, and we have a growing range of new technologies and paradigms, such as game-based learning, extended enterprise elearning systems, mobile learning and “massive open online courses” (aka MOOCs).

A squad of soldiers learn communication and decision-making skills during virtual missions: Wikimedia

With such technologies, it’s become possible for employers to train future applicants even before they apply for a position. For example, a company that needs more employees trained in a specific set of programming languages could work with a provider to build online courses that teach those languages. Or they could potentially provide such training themselves via extended enterprise learning management systems.

The point is that there are more learning options today than ever before. We live in a new age during which smart corporations will be able to adopt a learning paradigm closer to that of stored-program computers, a model they’ve trusted in their technologies for over half a century.

Featured image: A rebuild of a British Bombe located at Bletchley Park museum. Transferred from en.wikipedia to Commons by Maksim. Wikimedia Commons.