AI adoption in college classrooms sparks debate over learning & future of education
- Hannah Traczynski

Artificial intelligence continues to reshape college classrooms, leaving faculty and students grappling with how far the technology should reach into academic work.
As universities explore the benefits of AI, some professors see it as a necessary tool for future careers, while others worry it undermines the very purpose of higher education.
According to a January 2025 APA Monitor report, AI is now used in both instruction and support services, with institutions increasingly turning to chatbots, automated tutoring and tools that assist with writing and accessibility.
Education leaders argue that students need familiarity with the technology to thrive in a digital workforce. At the same time, organizations like Schools That Lead caution that relying too heavily on AI can eclipse essential human skills such as critical thinking and collaboration.
At Rochester Christian University, the ongoing conversation reflects those national concerns.

Kendra Corman, chair of the Department of Management and Marketing, believes AI has become “an essential tool” in business and should be treated as such in college. “We have to make sure that our students graduate with the skills on how to use it and how to leverage it the right way,” she said. “Or we’re doing you a disservice because it is in the business world.”
Corman said she encourages responsible use by requiring transparency in AI-generated work. She makes her students submit prompts, results and reflections when assignments involve AI. “AI is a tool,” she said. “It should take your ideas and make them better.”
She also sees equity benefits. International students whose first language is not English have seen improvement when AI helps them refine grammar and clarity. “Now they're actually getting graded on their ideas rather than losing points because English is their second language,” she said.
Despite these positives, Corman acknowledges the challenges. Students can take shortcuts more easily, and issues of academic honesty remain. But she maintains that students who want to avoid learning found ways around coursework long before ChatGPT.
She structures her courses around personal interaction videos, in-person presentations and group discussions in ways AI can’t easily replace. “Those soft skills are so important,” she said. “I challenge my students to think differently.”
Looking ahead five years, Corman expects AI to be embedded in every classroom but believes the most human elements of education will become even more valuable. “The more personal and personalized, the less AI-created that’s going to be the bigger differentiator,” she said.
Not all professors share her optimism.

Dr. James Walters, dean of the School of Humanities, takes an almost opposite stance. He does not use AI for teaching or his own work and has concerns about the technology’s broader consequences.
“For me and my teaching, there’s no place for it,” Walters said, explaining that writing and research are essential cognitive processes that cannot be outsourced. “You think by writing. Going to an AI program short-circuits your own thinking.”
He has noticed an increase in AI-assisted assignments and enforces a strict policy: suspected AI-generated work receives a zero, though students can rewrite for credit. For him, that correction process is a key opportunity to help students learn why the choice matters.
Walters sees academic work as personal development, not just output. “If all you do is type a prompt into something and get an answer, then there’s no point in pursuing education,” he said.
While he agrees policies can vary from professor to professor, Walters believes universities must carefully consider the implications before adopting AI widely. “We shouldn’t just be expecting people to use these tools because they exist,” he said. “That’s a choice.”
His concerns also extend far outside the classroom. Walters fears AI development is driven by powerful companies seeking to eliminate labor. If workplace roles diminish, he worries the value of a college degree and human creativity could be threatened. Educators, he said, may one day see their own jobs on the line.
Even for those open to using AI, ethical debates continue. The APA Monitor report notes concerns such as data privacy, algorithmic bias, and misinformation in automated responses. Schools That Lead emphasizes that educators should balance AI adoption with opportunities that strengthen interpersonal learning and community connection.
Students themselves are caught in the middle of the change. Corman said some feel discouraged when restricted from using tools they know exist. Walters, meanwhile, has noticed a decline in student-instructor interaction as young people rely more on digital answers than on asking questions.
There is no consensus yet on where AI belongs in higher education. What is clear is that universities must balance innovation with intention.
AI may assist students in writing better sentences, translating communication in real time, or generating ideas more efficiently. But both professors agree that humanity, whether through creativity, problem-solving, or personal connection, must remain at the center of education.
“I don’t think it’s going away,” Corman said. “That’s why I’m passionate about using it in the classroom.”
Walters agrees the future is uncertain, but argues that ambiguity is exactly why decisions must be made thoughtfully. “Nothing is inevitable,” he said. “We can think through these things. We still get a say in what education looks like.”
As colleges navigate the rapidly evolving technology, one question continues to surface: What should learning look like in an age when answers are only a prompt away?
