Last Mile Education
The Syllabus Doesn't Matter
Across the country, states are demanding that professors post their syllabi online so the public can scrutinize what is being taught. And yet, in the age of AI, the syllabus is the most irrelevant document in higher education.
Two years ago, when perhaps 99% of course content came from the instructor, delivered via lectures, readings, and course materials, the syllabus mattered. Today, over 92% of students use generative AI for coursework. What do people think that means? It’s not just writing papers. Students are prompting and reading on their own, including the stuff the politicians are screening for.
Ask around. Students come to class having already asked ChatGPT (or Claude or Gemini) about the course topic. They’ve gotten explanations, examples, counterarguments, and alternative ways to look at things before the professor says a word.
“There is no ‘going to class’ or ‘doing the reading.’ You look things up on ChatGPT, post something on the online discussion board, and have it write a paper for you. Maybe you learn something along the way, but it isn’t from the teacher!”
This is a quote from my hair stylist, a small business owner who, for the past two years, has been using ChatGPT to pass his classes at the community college where he is getting his general education requirements “out of the way” before heading to the University of Utah.
Every month for the past two years I’ve sat in his chair as he tells me how things are going. When ChatGPT can summarize, no need to watch the films for a film criticism class (though he did when they interested him). All the students were posting Chat-generated words on the discussion board. No need to read the textbook for his American History course. For a course on the history of illegal substances, the best data came from another client, a defense attorney. (I learned a lot too!) He tried going to in-person intro statistics review sessions, but when the teacher gave him a handout that was clearly Chat-generated, he went back to Chat. And so on, eight classes over four semesters.
When this young man transfers to the U, neither the syllabi of the courses he took nor his transcript will indicate whether he learned, what he learned, or how he learned.
I support him 100%. Why? Because his instructors have been phoning it in. So why shouldn’t he? The losers here are the taxpayers who are subsidizing faculty who don’t teach, the legislature, and anyone who believes that a transcript or a diploma should signify something.
Luckily in this case, this student is super smart, already successful, and understands the most important part of education: valuing people who know a lot about a subject. Because of our conversations, he knows how to look things up on JSTOR and Google Scholar to augment what he learns on ChatGPT, and he will bring that intelligence to his classes at the U.
The AI era has made it clear that there is a revolution ahead in higher ed. As I wrote a year ago, the “last mile” is the valuable last 5-15% of local, specific, fine-grained knowledge beyond what AI can deliver on any subject. For universities, the last mile is the detailed, curated education delivered to students, and is the only education worth paying for.
What Are Students Getting from AI?
More than 92% of students are using AI, and using it for far more than writing papers. The heaviest users are likely now getting only 50% of their course content from the teacher and the syllabus. For some classes it’s lower (perhaps 20% for a general education course); for others it’s higher (90% for advanced seminars). Looking things up on your own is not cheating. I did it years ago, in the encyclopedia.
Let’s say there’s a course on the French Revolution. Why read books from the syllabus when you can ask AI for a broad overview, get clarification on confusing points, and ask tangential questions that interest you? The AI is patient, available at 2 am, and can adjust explanations to any level.
Students learn from AI what is known widely: the basic sequence of events, the key figures, the major interpretations. AI can compare the French Revolution to other revolutions. It can give links to websites with images and artworks. It can generate essay outlines and practice questions.
AI excels at general knowledge: what is widely known and what exists at the level of abstraction. AI can generate explanations about “social upheaval” and “revolutionary ideology,” though not always with specific examples. But it will give students enough to master the terminology and perform well on tests. As I learned at the hair salon.
The slightly embittered view is that AI offers what most courses are now providing. Faculty stand at the front and deliver the general knowledge listed on the syllabus, expecting students to show up and take notes. It’s as old-fashioned as leafing through the TV guide, your choices limited to what’s being aired that night.
AI offers processed knowledge, what’s already been written, analyzed, and incorporated into its training data. A professor who is a domain expert in the French Revolution can talk to students about sitting in the archives, reading parish records from a village in 1789, seeing what the local grievances were. She can describe what Marat’s bath smelled like. What a guillotine looks like up close. She can describe the only letter known between Robespierre and Danton and what it indicated about their relationship. This is last mile knowledge.
Last mile education is the value proposition of a college education. It’s why the syllabus is becoming irrelevant. If students are using AI, the syllabus should be as general as possible. The important question, which the syllabus cannot answer, is whether the professor possesses the domain expertise to do the work of tailoring scholarship to student interests. Does this professor possess knowledge that exceeds what AI can provide? Does she work at the frontier of her field (especially important in the sciences)? Can he teach the last mile?
When students arrive in class having already consumed AI-generated basics, the last mile is where higher education should begin. College was designed to build upon knowledge delivered in high school, not repeat it. Students and parents should demand that their tuition dollars pay for the meeting of specific faculty expertise and student interest.
Let’s take a currently required course, “The U.S. Civil War,” which meets the American History mandate. One student might come in interested in emancipation, another in the cannon invented by Louis Napoleon, another whose interest began with Civil War video games. Another wants to learn more about women’s legal rights in the South. The professor will not know any of this when creating the syllabus. He will assume that students arrive with AI-delivered general knowledge, ready for a deep dive beyond what AI knows.
As I wrote earlier this year, the only defensible faculty role is working at the edge of knowledge. AI can provide more comprehensive coverage of any topic than any human professor. It can draw from millions of sources, synthesize competing interpretations, and generate practice problems indefinitely. So the value added of faculty expertise is local, specific knowledge of what is revolutionizing the field.
The professor can focus on curating readings tailored to each student’s interest and running discussion seminars with the whole class or groups of students. The professor will know each student’s work over time and can tell whether a final project represents real intellectual growth rather than clever pattern matching. The professor can see when a student is ready for more challenging material. The professor can identify the specific conceptual gap that’s holding this student back. This is the kind of attention my hair stylist never got. It’s what I describe happening at lower grades in places like Alpha School.
Why isn’t this happening now? Well, it is, in small private colleges. But nobody really keeps tabs on student progress if the student is getting good grades and doesn’t complain. Why should anybody check?
Why the Syllabus Doesn’t Matter
If state lawmakers want to improve education, the first thing to recognize is that the revolution is here and that the only relationship that matters is between the faculty member and the student. Focusing on the syllabus is the wrong approach if the concern is what a student is getting from a course.
The right questions are:
Does this professor possess knowledge that exceeds what AI provides? You can’t get this from a syllabus. You can only answer it by looking at the professor’s scholarship. This is the department’s job and a dean’s job.
Does this professor know their students well enough to tell whether they are actually doing the work? You can’t get this from a syllabus or a transcript. You can only answer it by listening to students, looking at course evaluations, and implementing mentoring structures. This is a dean’s job.
Does this course provide anything that students cannot access elsewhere? You can’t get this from a syllabus. Again, only department colleagues and a dean can answer this.
This is the first time I’ve written about my stylist, though I’ve talked about him to colleagues. His frustrations are at the root of much of what I’ve written this past year. He is the one being wronged, not doing the wrong. I do not want to use the word cheating because it doesn’t capture either the paltriness of the education he is (not) getting or the resilience he has shown in using a new technology to actually learn something while checking boxes.
Let me be clear: I never suggested he use ChatGPT; I have neither encouraged nor discouraged him. But when he explains how things are going I say, “No, you should be getting a hell of a lot more feedback than that.”
If anything, he is a success story. When I see what is coming down the pike, given middle- and high-school students’ use of AI, I wonder whether the next generation of students will be inclined to learn from human beings at all. True faculty experts will be the only people keeping their heads.
The Last Mile Test
There are ways to prepare. First, state lawmakers who want a better education system should invest in last mile teaching. Legislators should stop mandating syllabus standardization. If every Intro Psychology course must cover the same material in the same order, you’re not delivering a last mile education.
States should stop requiring universities to measure learning through standardized outcomes. Instead, trust faculty who know individual students to make judgments about readiness, growth, and potential.
Universities should stop organizing around credit hours and seat time. If students must spend 16 weeks in a course regardless of whether they’ve mastered the material, you’re optimizing for time-filling rather than learning. Instead, let students who’ve mastered general knowledge move forward at their own pace.
What is a college degree when AI is the entity taking and passing all the classes and yet all the reported metrics say everything is just fine?
The syllabi faculty are now posting tell us less and less about what is being taught or learned in a class. The same goes for a transcript. Students have come to understand that just as they can listen to any piece of music at any time they want, watch any movie or TV show any time they want, chat in social media rooms across time and space any time they want, they do not want to be time- and place-bound in their education, even while they want personalized attention.
The job of universities is to meet students where they are, to provide an education beyond what AI can offer, and explain to everyone why we matter.



Bracing as usual, but it reminds me of a psychology professor I had as an undergrad. Her lectures just repeated the textbook, but she was a great speaker and never once did I think to myself, I could have just read this myself and taken a test. She was the human expert in front of me, the one I impressed by visiting office hours and talking about relevant insights from sociology, the one who held me accountable on tests and who would have written me a recommendation had I needed one. The general accessibility of general knowledge didn't begin with GPT. We can ask "what do we need professors for?" but to some extent we could have asked that 30 years ago and I'm not sure the old answers are entirely out of date. I don't mean to imply that AI isn't enormously transformative, and Hollis is a bold and lucid thinker about that, but in thinking about its implications we shouldn't talk as if, pre-GPT, all knowledge was a carefully guarded secret whispered to students behind closed doors after they took an oath of secrecy.
I think there is a very serious flaw with your argument, which starts off with a confusion between information and knowledge and extends to the most crucial but absent point: that the goal of education should be to teach students TO THINK.
If you already know how to think, AI can be a great tool for extending your knowledge, if used with care.
But college students in survey courses don't know how to think and analyze information — and they will never learn that from AI (or, for that matter, from bad professors). That is the value of getting a real education.
In my experience, learning how to think happens in several ways. Two of these include analyzing difficult questions in a group context (a classroom, which includes real discussion) and creating thoughtful arguments and presenting them in writing, otherwise known as papers or essays.
Students will never learn to think just by reading things on their own without any outside intellectual friction and dialogue, and they will never learn to present a thoughtful argument if "their" writing is outsourced to AI.
If we follow your suggestions, students will be robbed of the ability to genuinely learn and think. They will also be robbed of the chance of learning how to write.
You seem to be saying:
"Given the chance, many students will avoid the real work needed to learn how to think, engage in critical inquiry, and to write — and so what? In the world of AI, those skills are now passé, so we should just let them slide through the educational system, which is nothing more than a processing exercise anyways."