Allowing Failure
Milton's marketplace > Mill's
The two big higher ed stories in the last two weeks were about grades and what they mean, if anything. First was the story on grade inflation at Harvard. Everyone gets an A. Then came the bombshell report by UC San Diego professors that students with 4.0 transcripts arrive on campus unprepared for college-level work in math and writing.
Intellectual development is hard to measure. Transcripts are supposed to indicate something “true” about student achievement. But vast bureaucratic and social systems have evolved to ensure scores match expectations rather than reflect actual learning. Now in the AI era, grades may no longer predict anything at all.
Those seeking “truth” on college campuses have been so focused on viewpoint diversity that they’ve overlooked the eroding truth of the documents that matter most in education: the transcript and the diploma. Across the country, even at elite private institutions, education is being delivered and measured at scale. At scale, a grade that requires an asterisk and an explanation kludges up systems designed for numbers alone. Everyone understands that education at scale leaves no room for an outcome that requires an explanation. That means nobody can stumble. Nobody can fail. And that means nobody takes risks, at least in the realm of education.
Worse, the system nudges students away from risk.
The other day Benjamin Studebaker, in a brilliant piece entitled “The Real Reason They Tell You What to Think in College,” put some of the blame for higher ed’s woes on the “Sunsteinian nudge,” which I think deserves more attention. Nudging is a key part of why there is grade inflation from high school to Harvard.
Richard Thaler and Cass Sunstein’s Nudge: Improving Decisions about Health, Wealth, and Happiness (2008) hit the discourse just as Barack Obama was elected. Nudge argues that because people make systematic errors in judgment, we should design choice environments that lead them to better outcomes without requiring deliberation. Think about a school cafeteria, they say. Arrange the food so students pick up a healthy apple without thinking about it. Put healthier items at eye level, less healthy options in harder-to-reach places. “Individuals are not actually prevented from eating whatever they want,” they write, “but arranging the food in a certain way will influence what people choose.” The apple gets selected through environmental manipulation rather than through the student developing better judgment about nutrition. Desired behavior happens automatically, triggered by cognitive shortcuts that bypass reflection.
I suspected nudging would be a cancer in education and it has been. Many correctly objected to the whole nudging idea as paternalistic. Jeremy Waldron wrote in 2014 that nudging “doesn’t teach me not to use inappropriate heuristics or to abandon irrational intuitions or outdated rules of thumb. It does not try to educate my choosing, for maybe I am unteachable. Instead it builds on my foibles.” Nudging is infantilizing. Clearly it has eroded young people’s capacity for autonomous decision-making. How can you develop better judgment without failing a few times?
Students will not develop judgment without chances to fail. They need to be able to choose challenging courses where they might struggle, to pursue arguments they’re unsure about, to switch majors or methods. They need the chance to fail in ways that teach rather than punish. Risk-elimination systems make this impossible. A student who has been nudged through four years of optimized decisions has learned only how to be managed.
The UC San Diego report should be the wakeup call that nudges don’t work. The thirtyfold increase in freshmen with math skills below middle school standards is the predictable result of a “nudge” mindset obsessed with engineering “success” while eliminating risk. What else could happen at the end of a K-12 system that refuses to let students fail? The proposed solution of more predictive analytics, more “strategic interventions,” and remedial dashboards doubles down on the “nudge” that produced unprepared students in the first place.
Elite opinion never really objected to the nudge concept because it’s full employment for the managerial classes, especially in higher education. Why are there so many associate deans and assistant vice provosts? Someone needs to do the nudging.
Universities embraced nudge theory and implemented it in every aspect of daily operations. Text messages nudge students about missed classes. Emails nudge them about financial aid deadlines. Learning management systems track which students access which resources and send automated, personalized messages to those who haven’t clicked through. Dashboards give faculty real-time data on student engagement patterns.
The California State University system, where I was a dean, has built an entire nudge infrastructure around “student success dashboards” with predictive analytics to identify disengaged students and intervene before they fail. The research literature celebrates nudges as “low-cost, simple mechanisms” that are “scalable” and create “socioeconomic mobility.” They reduce students’ “mental load” and “simplify processes.”
None of this nudging works in the long run.
The irony is that higher ed’s most prominent critics, focused on “truth” everywhere but the transcript, are also serial nudgers. Heterodox Academy and FIRE want more “viewpoint diversity,” more debate, more exposure to uncomfortable ideas. Their proposed solutions focus on incentivizing (i.e., nudging), basically putting viewpoint diversity next to the apples in the cafeteria and hoping students will put some on their tray without noticing.


Virtue, as the great John Milton wrote in his free speech pamphlet Areopagitica (1644), cannot be achieved through sheltered innocence.
I cannot praise a fugitive and cloistered virtue, unexercised and unbreathed, that never sallies out and sees her adversary, but slinks out of the race where that immortal garland is to be run for, not without dust and heat.
Virtue requires your ideas being accosted by all sides. You have to make choices in an environment where there are some truly bad options to choose from. “That which purifies us is trial, and trial is by what is contrary,” Milton writes. Without going through this, you don’t have a moral capacity, let alone a moral compass. You can’t become virtuous when temptations are removed. You can only become virtuous by facing temptations (a good apple or that famous one) and choosing well.
I’m fascinated that the preferred marketplace metaphor is not Milton’s but John Stuart Mill’s from On Liberty (1859), which Oliver Wendell Holmes famously distilled into the “marketplace of ideas” idea in his 1919 dissent in Abrams v. United States. Mill argues that people need to exchange ideas and if the opinion is wrong, they lose “what is almost as great a benefit, the clearer perception and livelier impression of truth produced by its collision with error.” The point is that people discover what’s true through the free exchange of competing claims.
Milton’s marketplace is dangerous for civic leaders. You can’t clean up the sketchy parts of town. You need young people to go there, or at least read the banned books:
Since therefore the knowledge and survey of vice is in this world so necessary to the constituting of human virtue, and the scanning of error to the confirmation of truth, how can we more safely and with less danger scout into the regions of sin and falsity than by reading all manner of tractates and hearing all manner of reason? And this is the benefit which may be had of books promiscuously read.
You see why university leaders prefer Mill’s marketplace to Milton’s. Getting rid of bad books and temptations appeals to parents and is easier on the legal bills. We will give you a marketplace of wholesome ideas in the safety of the classroom, for an A!
More Milton is needed in higher education. Milton’s marketplace requires capacity to face error and make real choices, including choosing badly. Universities don’t even try to cultivate the facing of error. It’s safer to embrace the “Sunsteinian nudge” idea that students cannot be expected to develop better judgment, so things must be arranged such that good outcomes happen despite bad judgment. The student gets the apple without thinking about nutrition, ends up in a major without understanding what preparation it requires, graduates in four years without friction. Failure is not an option.
The nudged student never develops the capacity Milton described, to “see and know, and yet abstain.”
The student who is not academically prepared for college now receives scaffolding, support, and accommodations, everything except the one thing Milton advocated: the requirement to struggle with difficulty. I applaud the UCSD faculty pushing back against all of this. They are confronting the entire managerial-therapeutic ethos of the modern university. This philosophy of accommodation, built by choice architects who have lost faith in the student as an autonomous moral agent, operates through predictive dashboards and safe, simplified processes. Restoring virtue requires restoring risk. Dismantling this infrastructure of infantilization means a return to a simple, necessary principle: allowing failure.


You imply this but don't hit it directly: incentives align towards cooperation among administrators, professors, and students. But this is not the noble cooperation of mutual affection, rather it's the insidious debasement of tacit collusion.
Admin wants good recruitment and happy customers, professors don't want complaints, students don't want to work too hard. It's easy in this environment for a tacit agreement to crop up that we all satisfy the average of expectations.
This average, however, is a target that moves in one direction, barring heroic efforts by two of the three players. In my model, to improve a cultural dynamic like this, two of the three players must create the new expectation. I think the magic sauce is in knowing how this is done at a more granular level.
I agree that students need to be able to fail and learn from that. I taught philosophy for 25 years, and on the first day of every class, every semester, I would talk with the students about being okay with "the F word." At first they laughed, but when I said "fail" rather than the other f-word, they looked horrified. The thought of getting anything lower than an A or B was beyond their comfort zone, so that always prompted a 10-15 minute follow-up conversation about what it takes to really learn--and how possibly, even likely, failing at something at some point was part of that.