The Stanford study reveals a 13% relative decline in employment for early-career workers aged 22-25 in AI-exposed occupations since 2022. AI displacement is not the only cause. The real issue is the weakening of higher education's core promise: colleges should produce people who are not just knowledgeable but truly educated in a way that connects aesthetically and socially.
Being educated is, at bottom, perceptual: in professional settings, others render an aesthetic judgment about it. Its markers include intellectual depth, clear reasoning, and sound judgment that signal trust and potential, something like the intangible polish that turns a contributor into a leader. It includes the ability to combine ideas, handle uncertainty, and build confidence through presence and communication. In today's high-stakes settings, this aesthetic is essential for success. While AI makes technical tasks commonplace, the “educated person” aesthetic endures.
This ideal is under attack. Critics call traditional education elitist or outdated and claim that being a technician is enough in an AI world. They highlight colleges' real flaws: high costs and impersonal online courses that favor book learning over what matters most in practice, such as tacit knowledge, social networks, and human endorsements. The shift to remote education during the pandemic made young workers more replaceable by AI: it stripped out key social elements like professor recommendations, face-to-face teamwork, and hands-on challenges that built the educated aesthetic. Colleges deserve criticism for failing to produce such people reliably in this new environment. However, that does not mean we should end the system.
We must push back against the idea that technicians alone matter. Instead, let’s reinvent higher education for the AI era. Colleges must ensure graduates can communicate at the application layer, the active space where human ideas meet AI tools. This goes beyond using an LLM to generate content. It means making AI part of your thinking, much as reading and writing marked educated people in the past.
In earlier times, literacy required mastering the sentence for exactness, the paragraph for flow, and the essay for strong arguments. These tools let people express complex ideas and sway others. While these remain key skills - especially for graduate students - “educated persons” also need practical fluency with tools like GitHub for version control and collaborative review, Cursor or a similar AI IDE for smooth human-AI building, and Vercel or another serverless platform for fast, scalable deployments. These are not extra facts or skills. They form the basics of “being educated” in the new economy.
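To make the GitHub fluency concrete, the collaboration loop amounts to a handful of Git primitives: branch, commit, and merge. The sketch below is a minimal, self-contained illustration; the repository path, file name, and branch name are invented for this example, and a deployment platform like Vercel would sit downstream of the final push to a shared remote.

```shell
# Minimal sketch of the branch-commit-merge loop that underlies
# team-based work on GitHub. All names here are illustrative.
set -e
rm -rf /tmp/edu-demo
mkdir -p /tmp/edu-demo && cd /tmp/edu-demo

git init -q -b main
git config user.email "student@example.com"   # placeholder identity
git config user.name "Student"

echo "draft thesis" > essay.md
git add essay.md
git commit -q -m "Initial draft"

git checkout -q -b revise-intro               # propose a change on a branch
echo "revised thesis" > essay.md
git commit -q -am "Revise introduction"

git checkout -q main
git merge -q revise-intro                     # integrate the change into main
git log --oneline                             # shows both commits
```

On GitHub the `revise-intro` branch would be opened as a pull request and reviewed before merging; the mechanics, however, are exactly these commands.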
Hmm. Your Last Mile issues seem a bit like a God-of-the-Gaps argument to me. You say AI is bad at certain things, or that it's hard to get models to do them well. That's true, but it's also temporary and user-dependent. For a good prompt engineer, a lot of what you call hard is trivial. Which is why y'all hire folks like us. Basically, it's "hard" to do well... given our current understanding of how to use AI and the inherent limitations of the models themselves (which, I assure you, we haven't even scratched).
That said, the broader employment and educational issues are quite pressing. The simple fact is, we've spent 150 years with the Prussian-American educational model building factory cogs and drones. The skills needed to actually succeed from now on have been actively engineered out of the curriculum for generations: critical thinking, cross-disciplinary synthesis, generalism, and effective argument.
They cast the jobs replaced as "entry level" - that is a grossly misleading euphemism meant to cover up some elephants in the room. The jobs being replaced aren't "low experience" so much as "low skill" and "low creativity".
The jobs AI replaces first are those requiring the least real skill. They may have filled genuinely necessary positions, but not necessarily ones that do a whole lot. Frankly, most of the people who should be worried hold degrees in fields that aren't based on facts - social work, clerical, empathy stuff, a lot of criminal justice, virtually all social critique, and other fields where your job is just to argue with other people like you about the unprovable - AI is gonna eat all their lunches in a year.