
The Student We Carry in Our Heads


Years ago, I got to work late and had to grab the last parking spot, right in front of the university print shop. Technically, this was legal, but it was frowned upon; the person who ran the shop had a habit of hanging signs with strategically situated caps outside the main entrance: “Do NOT leave engines idling.” “NO smoking.” “Receipts ABSOLUTELY required.”

Sitting in my car that morning—the engine most definitely NOT idling—I found myself staring at all this signage: “Please be sure door is closed when you enter”; “No returns on rush orders”; “No orders received less than FIVE MINUTES prior to closing.”

They reminded me of something. But what?

And then I knew: they were just like my syllabus.

I started teaching as a fairly naïve 23-year-old graduate student. “Naïve” might be too strong a word, but maybe not. I honestly believed, when I first taught a class, that all my students were just like me: They cared about their classes. They cared about education. They liked to learn. They would show up to class (on time) and take notes (diligently, as I had); they would turn their papers in on time.

Then the day arrived when my students were expected to hand in the first drafts of their first papers. Here again, I might be using too strong a word; “drafts” doesn’t quite capture what some of these students turned in: A single page. Half a page. A paragraph. One student’s “draft” was three sentences scribbled in pencil on a sheet of lined paper—and so short that they’d actually ripped the paper in half so they wouldn’t lose the math homework they’d done on the rest of the sheet.

I was stunned. I’d never done anything like that when I was a student. When the teacher told me to turn in a draft of a paper, I turned in a draft. A full draft. Didn’t my students realize that I couldn’t help them write better papers if they didn’t turn in something we could build on?

I was also, I’ll admit, a little upset. Weren’t my students taking the class seriously? Weren’t they serious about learning how to write? Weren’t they taking me seriously?

You may already know the next scene in this cinematic tragedy. Cut to Paul adding language to his syllabus: “To get full credit, all drafts must be complete, covering all the expectations outlined in the assignment prompt.” And you know what? It worked. The next time drafts were required, my students turned in multipage rough essays that covered all the ground the assignment required, setting themselves up nicely for second and third drafts that, with some feedback from their peers and me, presented some sound intellectual work. Learning was happening! I was pleased! I must be a GREAT teacher!

What I didn’t know then was that I was developing a habit that would, to paraphrase Hemingway, take over my pedagogical thinking gradually and then all at once. From then on, every time I encountered a student practice that undermined my goals as an instructor, I added a line to my syllabus. Students didn’t turn in all the assigned work? “A failure to turn in a final draft within seven days of the due date will result in an F in the course.” Students skipped peer-response days? “A failure to attend class on these days without an extremely good excuse or prior notice will be seen as a lack of investment in the work of the course and could lead to dismissal.” Students kept showing up to class without their assigned informal writing? “I will not accept anything that’s not typed, nor will I accept anything turned in after class or via email.”

In light of the flexibility many of us have adopted in the COVID era, reading these statements—all of them punitive, all of them uncompromising—is cringe-inducing. Clearly, the student I was carrying around in my head—the assumptions I held about the complex, dynamic, evolving humans who were stepping into my classroom every day—was a dark configuration, a lowest-common-denominator aggregate of my own anger and fears and assumptions.

And I knew this even before the pandemic. Covering the syllabus on the first day of class, I used to joke nervously as we moved into the dos and don’ts of the course: “I like to call this the page of death,” I would say, turning to the lengthy list of classroom practices and course logistics. As much as I felt the need to create a learning environment that kept students on track, creating spaces and habits of mind that would allow them to attain their greatest learning, I knew all these rules and regulations made my students feel deeply—well, “uncomfortable” doesn’t quite capture it. Perhaps . . . diminished? All those rules, all those regulations, all those projected failures, all those implied failings on their part—surely these statements must have made my students feel “less than”: less adult, less serious, less rational, less well-intentioned, less trusted. Less good.

Here’s my point: right now all of us working or learning in higher education are sitting in that car outside the print shop, staring at signs that say things like “Teachers are on alert for inevitable cheating after release of ChatGPT” (Meckler and Verma) and “ChatGPT wrote my AP English essay—and I passed” (Stern).

The assumptions here aren’t so much implied as shouted in bold, all-caps, size-16 Comic Sans: students are cheaters. Students are liars. They can’t be trusted. They don’t care about their classes; they don’t care about their futures; they don’t care about us; and they sure don’t care about their learning.

It’s worth noting that most of this hyperventilation about AI isn’t coming from instructors working in the classroom with the aforementioned devious delinquents who’d sell their own mothers for a B minus and a red Solo cup at a kegger. This is something of a relief. Yes, there’s worry among colleagues; on my own campus, fully a quarter of our faculty signed up for our first discussion of ChatGPT, roughly twice as many as had shown up for the next most-popular offering. So, there’s concern. But there are also conversations, brainstorming, thoughtful reflection, even some optimism that AI might actually create opportunities to move education forward in productive ways. Already, on Twitter and in blogs and newsletters and hallways and conference rooms, faculty are sharing ideas about how to work with AI, how to use it to enhance learning, how to develop “AI-proof” assignments that actually deepen learning.

It is impossible to overestimate the necessity of these thoughtful conversations—and the deliberation and deliberative practices they will engender—to the survival of higher education. Artificial intelligence isn’t going away. As Randy Bass and Joseph Aoun have pointed out, AI will change not only higher education but the kinds of work and the ways of working our students will engage in after graduation. If colleges and universities are to achieve their deeply held mission of preparing their graduates for constructive engagement with a world in crisis, we need to shake ourselves free of practices—assignments, curricula, pedagogies—incubated five, 50, or 150 years ago.

More than that, we need to be sure that the students we carry in our heads—the students we project into our syllabi and our lectures and our essay prompts—don’t diminish the actual students in the classroom. In his quietly brilliant In Defense of a Liberal Education, Fareed Zakaria points out that beneath all the blistering rhetoric about millennials—William Deresiewicz calls them “excellent sheep”—there is evidence that “kids these days” actually care more than a little about the world they live in: they volunteer more than previous generations; they’re more likely than their parents to become community leaders; they’re more likely to join the Peace Corps; they’re more likely to work for NGOs (151–59).

“They,” in other words, are not unprincipled hooligans who will, one and all, plug key phrases into ChatGPT and hit “enter” before turning back to their video games and vaping orgies. Yes, “they” are young. Yes, “they” are tech-curious. Yes, certainly, there are moments when “they” will take shortcuts—as will we all.

But they are also wonderfully, amazingly, beautifully, uncompromisingly hopeful and idealistic. They want to grow. They want to search and discover. They want to stumble and try again and succeed. They want to find their best selves, meet their greatest potential—and then rush past it.

And us faculty? It’s our job to create spaces and learning opportunities where students can step into intellectual selves they never knew were possible. To do that, we have to walk into the classroom every morning, every afternoon, every evening, and every weekend, assuming those best selves. We need to trust our students. We need to believe in them. Without that, they are doomed. And so are we.


Aoun, Joseph E. 2018. Robot-Proof: Higher Education in the Age of Artificial Intelligence. Cambridge, MA: MIT Press.

Bass, Randy. 2020. “Can We Liberate Liberal Education?” In Redesigning Liberal Education: Innovative Design for a Twenty-First Century Undergraduate Education, edited by William Moner, Phillip Motley, and Rebecca Pope-Ruark, 221–38. Baltimore: Johns Hopkins University Press.

Meckler, Laura, and Pranshu Verma. 2022. “Teachers Are on Alert for Inevitable Cheating after Release of ChatGPT.” The Washington Post, December 28, 2022.

Stern, Joanna. 2022. “ChatGPT Wrote My AP English Essay—and I Passed.” The Wall Street Journal, December 21, 2022.

Zakaria, Fareed. 2015. In Defense of a Liberal Education. New York: W.W. Norton & Company.

Paul Hanstedt, PhD, is the founding director of the Harte Center for Teaching and Learning at Washington and Lee University and the author of General Education Essentials: A Guide for College Faculty (about to come out in a second edition) and Creating Wicked Students: Designing Courses for a Complex World.