As generative AI tools like ChatGPT become increasingly embedded in academic life, higher education students are not just adapting—they’re innovating. In a recent webinar hosted by D2L and the Online Learning Consortium (OLC), three education leaders—Dr. Emma Zone, Dr. Colette Chelf, and Dr. Dylan Barth—shared findings from a collaborative study that centers the student voice in the AI conversation. 

What emerged was a nuanced, often surprising portrait of how students are using AI in higher education—not to cheat, but to learn, explore, and engage. This blog post captures the key takeaways from the webinar, spotlighting the voices of the speakers and the students they interviewed. Their message is clear: students are using AI in thoughtful, creative ways—and they need our support, not suspicion.

Students See AI as a “Good Spark,” Not a Shortcut 

One of the most memorable quotes from the study came from a student who described ChatGPT as: 

“A really good spark… it gives me new ideas that I haven’t seen. It helps me think more about what I’m writing or the assignment I’m doing or the quiz I’m taking.” 

Dr. Chelf explained that this quote became a central theme of the research. “They’re not using it to replace thinking,” she said. “They’re using it to get unstuck, to get started. It’s a thought partner.” 

Dr. Zone emphasized that this challenges the dominant narrative. “The prevailing narrative around, ‘Oh, they’re just using it to cheat,’ may not really be the truth,” she said. “We need to unpack that.” 

AI as a Tutor, Not a Teacher 

Students consistently described AI as a tutor or assistant—never as a replacement for their instructors. “They used the word ‘tutor,’” Dr. Chelf noted, “but not ‘instructor.’ That distinction is important.” 

One student shared how they used ChatGPT to generate practice quizzes: 

“I’ll copy the information and I’ll put it into ChatGPT and ask it to give me a multiple choice 50-question test.” 

This kind of use, Dr. Barth explained, builds confidence and reduces anxiety. “Think about how much confidence a student going into a test has after taking three, four, five different practice tests,” he said. “They feel like, ‘Okay, I’ve got this.’” 
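For instructors curious how lightweight this practice-test workflow really is, here is a minimal sketch of the same idea done programmatically rather than in the ChatGPT interface. It assumes the openai Python package and an OPENAI_API_KEY environment variable; the model name, file name, and prompt wording are illustrative, not drawn from the study.

```python
# Minimal sketch of the practice-quiz workflow a student described:
# paste in course material, ask for a multiple-choice practice test.
# Assumes the `openai` package (v1+) and OPENAI_API_KEY in the environment.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative file name: the course notes the student would copy in.
notes = Path("lecture_notes.txt").read_text(encoding="utf-8")

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {
            "role": "system",
            "content": "You are a study assistant who writes practice tests.",
        },
        {
            "role": "user",
            "content": (
                "Create a 50-question multiple-choice practice test from the "
                "notes below. Give four options per question and put the "
                f"answer key at the end.\n\nNOTES:\n{notes}"
            ),
        },
    ],
)

print(response.choices[0].message.content)
```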

Dr. Zone added that this kind of self-directed learning is a powerful form of engagement. “It’s not just about teaching and learning,” she said. “It’s about the student’s entire persona as a learner.” 

Students Are Tinkering—and Learning 

Another key theme was experimentation. Students are still “tinkering” with AI, testing prompts, exploring ideas and learning how to use the tools more effectively. 

“There’s a sense of play when we’re tinkering,” Dr. Chelf said. “And that’s obviously, we all know, one way that we learn.” 

She shared an example of a STEM student who used AI to explore hypothetical research scenarios: 

“They were engaging in AI to see, ‘What would it say if I substituted this for that?’ It was about curiosity, not cutting corners.” 

Dr. Barth added that this kind of tinkering fosters digital literacy. “They’re learning how to prompt, how to scaffold tasks, how to chunk information,” he said. “It’s a hands-on learning process.” 

Ethical Concerns Are Real—and Students Want Guidance 

One of the most powerful insights from the webinar was that students are not trying to cheat—they’re trying to do the right thing, but they’re often unsure what that is. 

“They want to follow,” Dr. Chelf said. “We heard that over and over and over. ‘I want to do what I should do.’” But many students end up setting the boundaries for themselves because they’re not being given that guidance.

Dr. Zone echoed this, noting that students are navigating a confusing landscape of mixed messages.  

“They’re floundering on what is acceptable and how to use AI,” she said. “They do not want to get in trouble.” 

One student put it this way: 

“I’m not sure where that boundary lies between what are your ideas and what aren’t your ideas.” 

Dr. Barth emphasized that students’ ethical concerns were less about abstract principles and more about practical consequences. “They’re worried about getting flagged for cheating—even when they didn’t use AI,” he said. “That fear is real.” 

Faculty and Institutions Must Create Space for Dialogue 

The speakers emphasized that institutions need to move beyond blanket bans and instead create space for nuanced, transparent conversations. 

“Blanket bans are not the answer,” Dr. Chelf said. “We need to do the hard work of determining what those policies are going to be, where our boundaries are going to be, and communicate those to campuses and our students.” 

Dr. Zone added that these conversations won’t always be easy—but they’re essential. “We have to be really transparent in that those types of conversations can be difficult,” she said. “And providing space for those conversations can result in difficult discussions, which is part of the point. It’s hard for a reason.” 

Dr. Barth pointed out that faculty experimentation is key. “When students see that faculty are using AI and trying to understand it, that helps them feel more comfortable,” he said. “It also allows faculty to support students more effectively.” 

Practical Steps for Educators and Institutions 

The webinar concluded with a set of practical recommendations for supporting students in their use of AI: 

  • Support AI Literacy: “Students need that guidance,” Dr. Barth said. “They’re providing guidance to each other, which can be troubling. So really helping students develop AI literacy is key.” 
  • Update Academic Integrity Policies: Institutions should move toward “acceptable use” policies rather than outright bans and revisit them regularly. 
  • Invest in Faculty Development: “Having faculty champions to do this is a big part of it,” Dr. Barth noted. “Instructors who have worked with this can help others see the possibilities.” 
  • Incorporate AI into Curriculum: Dr. Zone shared a powerful quote from a K–12 educator: “We have a moral obligation to be teaching AI skills.” She added, “We also need to consider that within higher education.” 

Dr. Chelf also addressed the controversial topic of “busy work,” noting that students have always exercised agency in deciding what’s worth their time. “What’s busy work for one student might be brand new for another,” she said. “AI hasn’t changed that—it’s just made it more visible.” 

“Just like engagement is multifaceted, so is the impact, the use of, and literacy around these tools,” Dr. Zone shared. 

“We have an obligation to help students understand how to use these tools right now—and how they’ll affect them in the future,” Dr. Barth said. 

The message from this webinar is clear: students are already using AI in thoughtful, creative, and ethical ways. They’re not looking for shortcuts—they’re looking for support. And it’s up to educators and institutions to meet them where they are, guide them forward, and help them thrive in an AI-enhanced academic world.

Want more? Check out the full webinar, available on demand now:

What Students Are Saying About AI: New Insights on Engagement in Higher Ed 

