Why Higher Ed's Anxieties Over AI May Be Misplaced
and how C.S. Lewis’s “Men Without Chests” reveals a deeper concern about the AI revolution
Last month, I attended an AI workshop designed for educators. It was a worthwhile session that helped me think about how I could better implement generative AI for managing tasks in both my research and my teaching. When you juggle many roles at a small liberal arts college, you pay attention to anything that might help keep the plates spinning.
However, the workshop was not immune from turning into the often-repeated existential session for teachers concerning AI: "Will AI make what I teach obsolete?" "How do we compete with machines that can produce academic work instantly?" "What's the point of human instruction if students can just ask ChatGPT?" Granted, I believe these questions are valid and worth addressing. But the real question I took away from that workshop wasn't whether AI will outperform us at certain tasks, because the answer is: of course it will. The question is how we identify and teach what is uniquely human, something no algorithm can replicate, something that makes our contribution to the world irreplaceable no matter what new AI model rolls out in the coming months or years.
That said, I believe a good response to this question was provided not in 2025 but in 1943, when C.S. Lewis wrote "Men Without Chests," the opening chapter of The Abolition of Man. Writing in the 1940s, Lewis couldn't have imagined anything like ChatGPT, DALL-E, or Claude, but he perfectly described what I think is both our greatest danger and our greatest assurance in the age of artificial intelligence.
AI Prompts Without Chests
In "The Abolition of Man," Lewis introduces us to what he calls "men without chests": people who possess technical knowledge and can think clearly, but who lack the "chest," his term for well-developed emotion and moral intuition. Lewis describes this condition with striking clarity: "We make men without chests and expect of them virtue and enterprise. We laugh at honour and are shocked to find traitors in our midst."¹
While Lewis was describing the education of his own day, he captures remarkably well what our current AI systems are: sophisticated "men without chests" on a technological scale. AI can process information (the head function) and optimize for goals (the belly function), but it fundamentally lacks the chest: the capacity for moral feeling, genuine empathy, and the kind of love that guides human judgment. Lewis explains the crucial role of this "chest":
The Chest—Magnanimity—Sentiment—these are the indispensable liaison officers between cerebral man and visceral man. It may even be said that it is by this middle element that man is man: for by his intellect he is mere spirit and by his appetite mere animal.²
In other words, without the chest we become either calculating machines or driven beasts, which is precisely what AI represents in its current form. Consider how current AI operates: it can analyze thousands of data points about human suffering and optimize relief efforts with incredible efficiency, but it does not know what it ought to feel about the weight of another person's pain. It can predict the most plausible response for a given situation based on the statistical averages of its training data, but it cannot join reason to well-conditioned emotion to truly empathize with our fellow human beings.
Why We Shouldn't Fear AI (But Should Fear Becoming Like It)
Lewis helps us see that the real danger isn't AI becoming too human; it's that humans are prone to becoming too much like AI. When we route too much of our thinking directly toward appetite-driven goals, we risk atrophying the moral muscles of the chest. His warning is about preserving and cultivating our distinctly human capacities for empathy, moral intuition, and genuine care for our fellow human beings, so that those capacities, and not mere efficiency, guide how we act.
But here's where Lewis actually offers tremendous comfort: he shows us exactly what makes humans irreplaceable. The chest—that seat of proper feeling and moral response—remains uniquely human territory. No algorithm can replicate the moment when a teacher sees potential in a struggling student and refuses to give up on them. No AI can substitute for the friend who sits with you in grief, not to fix anything, but simply because your pain matters to them. This is why I'm not particularly worried about AI replacing human connection, mentorship, or care. I'm more concerned about us accidentally replacing ourselves—becoming so enamored with efficiency and optimization that we forget to cultivate the very qualities that make us human.
Research & Education Should Do What AI Cannot
This is precisely why liberal education matters more than ever in the age of AI. Lewis understood that education should not merely transfer information or develop technical skills—it should cultivate the whole person, especially that crucial "chest" that connects head and heart. A liberal education does what AI cannot: it shapes our emotions to align with truth, beauty, and goodness. When we study literature, we don't just learn about narrative structure—we develop empathy by entering into the experiences of others. When we engage with history, we don't just memorize dates—we cultivate wisdom about human nature and the consequences of our choices. When we explore philosophy, we don't just learn logical arguments—we develop the moral imagination that helps us discern right from wrong in complex situations.

AI can help us process information faster and more efficiently, but it cannot develop our capacity for wonder, moral outrage at injustice, or the deep satisfaction that comes from understanding truth. These are distinctly human experiences that emerge from a well-educated chest.
What makes this even more compelling is recognizing the different motivations that drive people in various academic disciplines and research endeavors—motivations that no algorithm can replicate or replace.
The cell biologist who spends years studying cell division isn't just collecting data; they're driven by a profound love for understanding how life sustains itself and a deep compassion for those whose lives are upended when unregulated cell growth becomes cancer. Their research questions emerge not from algorithmic optimization but from a combination of scientific wonder and human concern: "What goes wrong in the cellular machinery that leads to cancer, and how can we intervene?"
The historian who pores over dusty archives isn't just accumulating facts; they're motivated by a deep care for understanding how we got here and what we can learn from those who came before us. Their research agenda is shaped by a conviction that the past has wisdom to offer the present.
The literature professor who reads the same text for the hundredth time isn't just analyzing syntax; they're compelled by love for the way language can capture the deepest truths about human experience. Their scholarly inquiry is driven by the belief that these texts can still speak to contemporary struggles and joys.
Consider the psychology professor who dedicates their career to understanding trauma recovery. AI can help process vast datasets about treatment outcomes, but it cannot replicate the researcher's fundamental motivation: a deep compassion for human suffering and an unwavering belief that healing is possible. Their research questions flow from the chest, from genuine care for human flourishing.
These loves—for truth, beauty, justice, human dignity, wonder, and wisdom—cannot be programmed. They emerge from that "chest" that Lewis identified as essentially human. AI can assist these research endeavors by processing data and identifying patterns, but it cannot replace the fundamental motivation that drives the questions worth asking, the persistence needed when experiments fail, or the wisdom to know which pursuits serve human flourishing.
Teaching the Head, the Heart, and the Chest
Instead of fearing AI's ability to perform certain tasks more efficiently than we can, we might ask better questions:
How do we leverage AI to handle routine tasks so we can invest more time in the uniquely human aspects of teaching—mentoring, encouraging, and developing wisdom alongside knowledge?
How can we design courses that cultivate students' "chests" while using AI to enhance their analytical capabilities?
What does it mean to teach critical thinking in an age when machines can process information faster than we can, but cannot discern what information matters most for human flourishing?
How do we help students develop the moral imagination needed to ask the right questions, not just find efficient answers?
Can we use AI as a tool to free up more time for the Socratic dialogue, the office hours conversation, and the research mentorship that shapes character alongside intellect?
How do we ensure that our students graduate not just with knowledge and skills, but with the wisdom to use both in service of others?
Lewis's insight suggests that AI becomes most dangerous not when it's too powerful, but when we've become too much like it: all head and belly, no chest. The antidote isn't avoiding AI technology, but cultivating deeper humanity. As we navigate this AI-enabled world, thoughtful engagement means neither panicked avoidance nor uncritical embrace. To quote the first of Melvin Kranzberg's six laws of technology:
Technology is neither good nor bad; nor is it neutral.³
That said, we do need to take seriously what AI can and can never replace. One thing I can safely say AI will never replace is a well-trained heart for our fellow human beings. But that's only true if we choose to keep cultivating those hearts. Lewis saw the danger of creating people who could think without feeling, analyze without caring, and optimize without loving.
In summary, our task isn't to compete with AI in what it does well—processing information and optimizing outcomes. Our task is to become more fully human in the ways that matter most: loving well, caring deeply, and using both our heads and our hearts to serve the flourishing of others. That's something worth cultivating, whether AI exists or not. But in a world where artificial intelligence can handle more and more of our processing, it becomes absolutely essential. The conversations that matter most aren't about what AI can and cannot do, but about what AI will never be able to do—and how we can become more fully human in response. Lewis would remind us that the chest is what makes us human. In the age of AI, that's not a limitation—it's our greatest strength.
Citations
¹ C.S. Lewis, The Abolition of Man (New York: HarperOne, 2015).
² Lewis, The Abolition of Man.
³ Melvin Kranzberg, "Technology and History: 'Kranzberg's Laws,'" Technology and Culture 27, no. 3 (July 1986): 544.