2. Students Want to Master—Not Game—the System

Our Research Shows

The dominant narrative around cheating obscures the wider reality: Most students are using AI to engage more deeply with content. Without guidance, however, they risk misuse or shallow learning. Employees are also using AI to grow their skills, but they need more structured, career-linked pathways. Learning, whether in higher education or corporate spheres, must connect AI use to tangible outcomes.

3X

Educators are 3X more likely to say AI has enhanced instruction and engagement than to say it has worsened them

“When I get stuck in procrastination…you know, what ideas to write about… AI kind of helps me. It helps me get a framework before I even do it and avoids the procrastination for me.”

  • Students use AI for brainstorming, tutoring, and overcoming learning barriers

  • Most students say AI helps them understand material—not just complete it

  • Students describe AI as a “study buddy” that helps them get started and stay on task

  • Students with ADHD and anxiety report that AI helps them “consolidate” ideas and focus on coursework

Actions to Consider

  1. Integrate AI into instructional design and pedagogy
  2. Train faculty to model ethical and effective AI use
  3. Create student-facing resources on responsible AI use in learning, including a clear, reasonable definition of what constitutes AI cheating
  4. Require all students to take a one-credit, pass/fail course during their first year on the ethics and use of AI in the classroom and in their studies
Key Takeaway

The “cheating” narrative overshadows AI’s role as a learning amplifier. When institutions and educators reframe AI as a support for deeper engagement and pair access with modeling and ethical guidance, students become more confident, independent, and equipped for lifelong learning in an AI-rich world.