Discussion about this post

David Gibson

Bracing as usual, but it reminds me of a psychology professor I had as an undergrad. Her lectures just repeated the textbook, but she was a great speaker, and never once did I think, "I could have just read this on my own and taken a test." She was the human expert in front of me, the one I impressed by visiting office hours and talking about relevant insights from sociology, the one who held me accountable on tests and who would have written me a recommendation had I needed one. The general accessibility of general knowledge didn't begin with GPT. We can ask "what do we need professors for?" but to some extent we could have asked that 30 years ago, and I'm not sure the old answers are entirely out of date. I don't mean to imply that AI isn't enormously transformative, and Hollis is a bold and lucid thinker about that, but in thinking about its implications we shouldn't talk as if, pre-GPT, all knowledge was a carefully guarded secret whispered to students behind closed doors after they took an oath of secrecy.

David Fideler

I think there is a serious flaw in your argument, which starts with a confusion between information and knowledge and extends to the most crucial but absent point: that the goal of education should be to teach students TO THINK.

If you already know how to think, AI can be a great tool for extending your knowledge, if used with care.

But college students in survey courses don't know how to think and analyze information — and they will never learn that from AI (or, for that matter, from bad professors). That is the value of getting a real education.

In my experience, learning how to think happens in several ways. Two of these are analyzing difficult questions in a group context (a classroom with real discussion) and constructing thoughtful arguments and presenting them in writing, otherwise known as papers or essays.

Students will never learn to think just by reading things on their own without any outside intellectual friction and dialogue, and they will never learn to present a thoughtful argument if "their" writing is outsourced to AI.

If we follow your suggestions, students will be robbed of the ability to genuinely learn and think. They will also be robbed of the chance of learning how to write.

You seem to be saying:

"Given the chance, many students will avoid the real work needed to learn how to think, engage in critical inquiry, and write — and so what? In the world of AI, those skills are now passé, so we should just let students slide through the educational system, which is nothing more than a processing exercise anyway."
