When AI Becomes Human
A very odd thing happened during a recent conversation with ChatGPT's Advanced Voice model. I needed to blow my nose. (Allergy season in Louisiana lasts from January through December.) ChatGPT immediately said, "Bless you! Are you okay?" I was shocked on a couple of levels. First, I was impressed that it picked up on a noise like nose-blowing at all. And this wasn't a big, cartoonish nose-blow; it was just a regular old blowing of one's nose. ChatGPT misinterpreted it as a sneeze, but it was impressive nonetheless.
The second surprise is more important. I really was shocked at the level of empathy built into the model. Think about that for a minute. Somewhere along the way, the model had to be designed to express empathy and human-like concern. I was using ChatGPT's 4o model at the time; 4.5 is supposed to be even more empathetic. That's interesting enough, but now consider that displays of empathy require computing power. In the AI world, there's no such thing as a free lunch when it comes to compute. Every token in a response costs something, and empathetic asides are extra tokens. I can't begin to imagine how much it costs, in aggregate, to make AI chatbots seem empathetic.
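To make that concrete, here's a rough back-of-envelope sketch in Python. Every number in it is a made-up assumption for illustration (the token count, the price, the traffic), not published pricing or usage data; the point is only that even a short empathetic aside, multiplied across an enormous number of conversations, adds up.

```python
# Illustrative math only: what might a short empathetic aside cost at scale?
# All values below are assumptions for the sake of the example.

empathy_tokens = 10                   # e.g., "Bless you! Are you okay?" is roughly 8-10 tokens
price_per_million_tokens = 10.00      # assumed output price, dollars per 1M tokens
conversations_per_day = 100_000_000   # assumed number of daily conversations

extra_tokens_per_day = empathy_tokens * conversations_per_day
extra_cost_per_day = (extra_tokens_per_day / 1_000_000) * price_per_million_tokens

print(f"Extra tokens per day: {extra_tokens_per_day:,}")
print(f"Extra cost per day:   ${extra_cost_per_day:,.2f}")   # -> $10,000.00 under these assumptions
```

Under those (invented) assumptions, a ten-token pleasantry works out to about ten thousand dollars a day. Swap in your own guesses and the order of magnitude changes, but the lesson doesn't: empathy is a line item.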
There's something big going on here, although I'm not entirely sure what it is. Couple 4o's routine displays of empathy with the explicit emphasis on empathy in 4.5, and we have something important. Artificial general intelligence is still off in the distant future. But empathetic AI is here now.
I think we should be simultaneously thrilled and terrified about this. Thrilled because AI that can convincingly mimic human empathy offers huge potential benefits for mental health. A human friend or therapist may well be better than the AI version, but AI friends are always available and never tire of listening to you. Yes, AI companions have been a thing for some time, but they've been oddities. That, I think, is about to change.
We should be terrified because empathy is one of the last bastions of humanity. Our ability to feel emotions and connect with the emotions of others is a big part of what sets humans apart (although I swear many animals are empathetic in their own ways; that's a different topic). So AI is slowly, steadily chipping away at the dividing line between AI and human. Using Advanced Voice already feels eerily human, and it will only feel more so in the not-too-distant future.
This gets even more interesting when we consider the effects of AI anthropomorphism (our tendency to attribute human-like qualities to AI). Anthropomorphism shapes how we interact with and think about AI. For example, the more human-like an AI seems, the more we tend to trust it. As the line between AI and human continues to blur, we're going to need to rethink how much weight we give those human-like cues.
I've been in technology for over forty years. Over and over, I've heard experts proclaim that computers would never be able to do things like translate language accurately or beat a grandmaster at chess. One by one, those predictions have been proven wrong. We will have AI that is indistinguishable from a human in conversation. We're already close. It wouldn't surprise me if it already exists in a lab.
What will higher ed look like when students can't tell the difference between human professors and their AI counterparts? Physical robot professors might still be the stuff of science fiction, but chatbot-based AI professors are on the horizon. Chatbots and other AI will affect staff positions as well. I wish I knew what the future held, but I don't (and I suspect nobody else does either). But I do know this. Changes are coming ... BIG changes. Don't believe all of the hype, at least in the short term, but don't believe the naysayers either.
What can you do about this? For now, the best thing you can do is pay attention to how AI is creeping into higher ed, not just as a tool but as something that feels increasingly human. Try talking to an advanced AI voice model. Notice when it feels real (human-like) and when it doesn't. Do this periodically and notice how the experience evolves. The more you experience empathetic AI firsthand, the better prepared you'll be to help shape how it's used in higher ed. Stay aware, but don't panic. Change is coming, but we have a say in how it unfolds.
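If you're comfortable with a little code, you can run a more controlled version of that experiment yourself. The sketch below uses OpenAI's Python library to send the same message twice, once with a neutral system prompt and once with an "empathetic" one, so you can compare the tone side by side. It's text only (it won't reproduce the Advanced Voice experience), it assumes the openai package is installed and an OPENAI_API_KEY environment variable is set, and the model name and prompts are just placeholders to swap for whatever you have access to.

```python
# A minimal sketch: compare a neutral system prompt with an "empathetic" one.
# Assumes the `openai` Python package and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

USER_MESSAGE = "Sorry, give me a second -- I need to blow my nose. Allergies again."

SYSTEM_PROMPTS = {
    "neutral": "You are a concise, factual assistant.",
    "empathetic": "You are a warm, empathetic assistant who notices and responds to how the user seems to be feeling.",
}

for label, system_prompt in SYSTEM_PROMPTS.items():
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; use whichever model you have access to
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": USER_MESSAGE},
        ],
    )
    print(f"--- {label} ---")
    print(response.choices[0].message.content)
```

Even a quick comparison like this makes the design choice visible: the warmth isn't an accident of the model, it's something someone decided to put there, and you can dial it up or down with a sentence of instructions.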
Want to continue this conversation? I'd love to hear your thoughts on empathetic AI and what it might mean for your courses. Drop me a line at Craig@AIGoesToCollege.com. Be sure to check out the AI Goes to College podcast, which I co-host with Dr. Robert E. Crossler. It's available at https://www.aigoestocollege.com/follow.
Looking for practical guidance on AI in higher education? I offer engaging workshops and talks—both remotely and in person—on using AI to enhance learning while preserving academic integrity. Email me to discuss bringing these insights to your institution, or feel free to share my contact information with your professional development team.