I drove through Crotonville yesterday, and it occurred to me that, amid new demands on higher ed to “produce workers,” there must be research on the relationship between the decline of corporate training programs and the increasing pressure on universities to become credential factories.
Sadly I couldn’t find much.1 (Comments are open if you know of any.) But it seems I’m not alone in seeing the decline of investment in training as a big problem today.
For decades, large American corporations such as General Motors, IBM, General Electric, AT&T, and others had comprehensive training and management development programs. A talented, stable, skilled workforce was seen as a competitive advantage. Training was seen as vital to corporate culture. General Electric had its 59-acre campus in Crotonville, NY, offering multi-week courses on management and strategy as late as 2010. New IBM employees entered multi-month residential training programs that instilled the "IBM way" of thinking, dressing, and behaving. (By the 1980s, IBM had a 7,000+ person training staff.) AT&T had the Bell System Center for Technical Education; the instructors were top performers in their fields who would rotate back into management roles with a broader understanding of the business. GM, Ford, and Chrysler had their own engineer training schools.
Starting in the late 1970s, shareholder value thinking caused these programs to disappear. Under pressure to maximize short-term profits, companies came to see long-term investments in human capital as unnecessary costs. Goodbye corporate training programs, cut and dismantled through the 1980s and 1990s alongside downsizing, restructuring, and layoffs. Leanness was the idea. Buying talent was more efficient than making it.
The collective abandonment of training caused a shift in labor market power. When companies simultaneously eliminated development programs, they could leverage their existing market position to externalize training costs entirely. Firms anticipated being able to hire “job-ready” candidates while free-riding on training investments made by others. The coordination was complete: with most major employers adopting similar strategies, workers faced limited alternatives and were compelled to acquire skills at their own expense. Companies could now dictate terms in a market where pre-trained candidates competed for positions, capturing the benefits of skilled workers while avoiding the costs and risks of developing that human capital themselves.
By 2000, people became their own secretaries, typing their own email and doing their own scheduling. Graduates also became responsible for their own training, professional development, and self-marketing. No more relying on your employer to help make you a better employee. Over time, it became the new normal that skill acquisition was the burden of the individual alone.
Universities, seeing an opportunity, said “come to us and pay us for the training you used to get on the job.” Many schools already had continuing education divisions; community colleges were already doing vocational training. Business management programs exploded as corporate programs declined. Collaborative partnerships and shared research funding turned schools into the R&D shops that corporations once maintained internally.
The shift created the information asymmetries between educational institutions and employers that we are seeing today. Universities operate with imperfect information about actual workplace needs, forced to guess at industry demands from job postings, surveys, and anecdotal feedback. Meanwhile, firms have little incentive to communicate their specific requirements, even if they were good at anticipating them (and, as AI has shown, most are not). This opacity serves employer interests, giving them the ability to reject candidates for lacking particular skills (and to complain about universities to their state representative) without having to invest in training themselves.
Today, it is the new normal that public universities should be filling the training role. The higher education sector finds itself responsible for workforce alignment and accountable for employment outcomes, even while operating without clear guidance about what employers need (or will need).2 New federal and state laws are demanding that universities be held accountable for how well they train graduates, though I did not see anything in the BBB about the ways AI is already changing the workforce.
There would be no need for employment accountability measures in the BBB except for the information gap: universities and graduates are chasing moving targets while firms externalize risk, capturing the benefits of an educated workforce without the costs of skill development or the responsibility for clearly defining their needs. Of course, companies find it more cost-effective (and less risky) to hire people who already paid for their own upskilling.
If some fields are in “high demand,” shouldn’t the employers that want the skill be willing to pay for the training?
Even if universities could produce perfectly job-ready graduates, who should be responsible for guiding them along structured paths for advancement, for giving them a sense of progression and purpose? Who will help ensure their skills remain current and relevant? Should they expect a culture of deep, informal learning in which seasoned employees mentor and transfer knowledge as an integral part of daily operations? Is the university responsible for the whole life of the worker?
More immediately worrying is the rise of “credentialing,” or “micro-credentialing.” A “credential” can be anything from a formal degree, to a government license, to an industry certification, to a product-specific badge from a tech company. Long ago, a hiring manager might have known you, your mentor, your university, or your last boss. That system was replaced by a single, blunt credential: the four-year college degree. Now that system has broken down, and today a credential is supposed to be a standardized signal that solves a company’s scaling problem. An employer with 300 applicants needs a fast, cheap way to shrink the pile, and a credential is a shortcut that says “this person isn’t a total risk.”
But there are persistent mismatches, with taxpayers footing the bill for credentials that aren’t “in demand.”
Isn’t the simpler solution having companies train their own employees again? Again, it would be good to see policy research on this.
How great would it be for new graduates to hear the message: “this company will invest in you,” instead of “here is a list of skills you must go acquire on your own time and at your own expense to make yourself a more attractive commodity.”
I’d like evidence that the current system is not what it seems: a decades-long market failure in which the benefits of skilled workers accrue to employers while the costs of developing human capital are borne by individuals and educational institutions. Cost-shifting has created the usual perverse incentives: companies can free-ride on training paid for by others, while workers bear the risk of misaligning their skills with what employers actually need. Our national underinvestment in training undermines long-term growth and competitiveness.
Maybe we could try a different approach? Rather than continuing to push training costs onto public university systems and debt-burdened students, why not realign incentives so that employers train their own employees? Perhaps tax credits for corporate education programs, or portable training benefits that travel with workers?
Not to be nostalgic, but driving past the old Crotonville site does make one wonder whether we really had to choose between developing people and prioritizing quarterly earnings. Competitive advantage once came from cultivating talent, and it is worth remembering that cultivating skills should not fall wholly on the taxpayer.
1. One can find Marxist scholarship on the decline of “managerial hegemony.”
2. If you want to go to college for purposes other than a paying job, it seems wrong to punish your alma mater.
My experience at P&G, where I spent my entire career, was that the company made training investments as a way to build the capabilities needed to excel in the marketplace, so there was no conflict between training investments and delivering business outcomes.
This also meant that most of the training was on-the-job rather than in classrooms on a fancy campus that people had to travel to. That approach applies to front-line technicians as well as management employees, and it is one reason why P&G is able to attract talented employees around the world.
Some colleges in some markets (a very small number, in my experience) make the effort to partner in a way that benefits both parties, but this is the exception rather than the norm.
I’d echo the comment by JM: the top consulting firms continue to make substantial investments in training. The training programs that McKinsey runs are superb. One two-week training I attended was worth more than a year of business school; the sessions were full of experiential learning and role-play exercises that I can still recall 25 years later. When companies run their own training programs, the opportunity cost is high: they are taking those employees away from their day jobs. So there is a big incentive to invest and make the programs truly value-additive, not just a credential or signaling exercise. When an academic institution runs a training program, the incentive is to maximize revenue. Before McKinsey I was a submarine officer, and the six-month Nuclear Power School covered more material than two years of a STEM college degree. The military is an interesting case to compare against corporate and academic training, as the military also has high incentives to deliver effective training.