
Taking the Temperature: How Higher Ed Feels About AI 

At the start of the webinar, participants were polled on their personal and institutional attitudes toward AI. The results revealed a mood of cautious optimism:

  • Personal viewpoint: 33% optimistic, 30% open-minded, 20% cautious, 6% concerned and 4% skeptical
  • Institutional stance: 35% exploratory, 28% measured, 18% enthusiastic, 15% apprehensive, 4% skeptical

The poll results offered a clearer view of how AI is being received in higher education: open to its presence, but cautious about how to use it most effectively.

“To me, GenAI is like a power tool. In the hands of a person who has supervision and guidance, follows safety protocols, it can be a fantastic thing,” explained Hobson regarding his current stance on AI use in education. “In the hands of someone who has none of those things, you’re going to get some not-so-great results.” 

The poll results also set the tone for a session that balanced ambition with realism—and reminded everyone that learning innovation is as much about listening as leading. 

AI as a Mirror of Humanity 

For Wanca, her curiosity about AI started not from a technological standpoint but from a desire to understand what makes us human, which led her to study AI through a human lens.

“I realized that AI doesn’t only learn from data, but it learns from humans. And every piece of the data that we give an AI to learn from carries our fingerprints, our empathy, our biases, our attention,” she said. “At the end of the day, I’m asking myself not whether we should be using AI, but how are we using AI? How do we design systems that make us more human?”

We can learn from AI, because AI is our mirror. It mirrors the consciousness of its creator.

Ina Wanca, Chief Academic Technology Innovation Officer at University of Hawaii.   

Wanca urged educators to see AI not as a threat but as an amplifier of human purpose—technology that magnifies whatever values we put into it. 

Designing for Learning, Not Automation 

Hobson, for his part, leans on AI to enhance course design, not replace it. During the webinar, he shared how he created a custom GPT called Your UDL Pal to help faculty apply Universal Design for Learning (UDL) principles.

“[Faculty] do not necessarily know about the [UDL] framework or how to be able to apply it, but they’re curious, and they wouldn’t mind trying to be able to have a second set of eyes to be able to help them out,” he explained. This led Hobson to create the custom GPT, which lets users have their content or assignments reviewed through a UDL lens. He found that the GPT was able to offer new ideas and perspectives on how the content could be shared with a wider audience.

“It was trying to be able to provide different forms of flexibility and freedom and autonomy inside of a learning experience that perhaps I wouldn’t have thought about,” said Hobson. 

After Hobson shared the custom GPT online, a senior designer from Harvard’s continuing education division reached out to build a version for training their faculty. Its success has pushed Hobson to consider how AI can help his students, and to keep experimenting with it to see how it can enhance his work and act as a sounding board for fresh ideas.

You’re not starting necessarily with AI. You’re starting with an idea, a thought, a concept, and then you’re asking AI for some feedback.

Robyn Hammontree, Vice President, Academic Partnerships, D2L.

It’s a reminder that true learning design begins with purpose, not prompts. 

Building Human Architecture for AI Governance 

Zooming out from classrooms to campuses, Wanca shared how the University of Hawaii is implementing an AI Operating System—a governance model built to ensure AI advances institutional priorities, not just technical capacity. It provides a clear framework to move AI initiatives in a holistic way. 

The system has three layers:

  • Strategic: Defining vision, values, and priorities
  • Operational: Connecting strategy to action through alignment and policy
  • Tactical: Activating small, cross-campus task forces to experiment and share learnings

The first step is building the strategic layer, said Wanca. This is where academic leaders, such as a chancellor or president, set the institutional vision, values and strategic priorities they want to advance. A planning group is also formed, made up of cross-functional teams that bring different perspectives to the AI transformation journey.

Next is the operational layer, which can include offices dedicated to AI or technology innovation. This department’s expertise is used to align the strategic layer with campus engagement and to execute the initiatives.

Lastly, there is the tactical layer, which includes an impact advisory group or task force. These small, focused teams take on priorities and move the initiatives forward. However, Wanca added, she uncovered an additional layer after moving to Hawaii: human values.

“I realized that there is also one layer that is very important, and that layer is called the human values,” said Wanca. “Because values really guide every decision.” 

No matter how your institution is using AI, it’s important to ask: Does this align with the human values of my institution, such as responsibility, transparency or student success? Wanca encouraged attendees to use these values to build a blueprint ensuring that every priority, whether a use case, tool, vendor or partnership, properly serves your community and institutional goals.

Redefining Success in AI Adoption 

Toward the end of the webinar, the conversation turned practical: How do institutions know when AI is being successfully implemented? 

For Hobson, success means alignment on what the institution is trying to do with the tools, along with an understanding of the pros, cons and risks. Next is securing buy-in from all stakeholders and establishing pathways for feedback to determine what is and isn’t working well. With these protocols in place, students receive consistent information on how AI is used at their institution, with no grey areas.

For Wanca, in addition to Hobson’s assessment, successful implementation of AI means building a learning mindset. 

“For me, and specifically for University of Hawaii, I think successful AI implementation, it’s not about how many tools we launch, or pilots, or how many solutions we’re running. It’s about whether we really build a mindset and a culture that can keep us learning,” she said. “Because in this AI journey, everybody is at a different phase, you know? And I think we need to honor that.” 

Wanca also underscored the importance of values. 

“Really anchoring everything in values, because we have to remember at the end of the day, the humans really are giving the purpose,” she said. “If we don’t carry that human element with us, and we don’t allow ourselves the reflective time and just be comfortable with the unknown, comfortable experimenting, we won’t be able to move forward.” 

Things are changing and building that muscle that allows us to really be comfortable when we don’t see the future is how we can build a successful AI culture, a successful AI transformation.

Ina Wanca, Chief Academic Technology Innovation Officer at University of Hawaii.   

Both speakers agreed: AI literacy, transparency and collaboration are the real measures of success—not the number of tools adopted. 

Shaping the Future of Learning With AI 

The conversation made one thing clear: AI in education is not just about technology—it’s about people, processes, and purpose. Success depends on creating a culture that embraces curiosity while safeguarding critical thinking. Institutions must move beyond hype and fear, building adaptable frameworks that align with values and foster collaboration. 

Rather than using AI as a shortcut, educators should leverage it to deepen learning through reflection, personalization and inclusive design. Governance models should be flexible and transparent, and faculty and students need opportunities to experiment and learn together.

Ultimately, the future of AI in education will be shaped by how well we balance innovation with intentionality. By designing for evolution, we can harness AI to empower learners, strengthen communities and advance education in meaningful ways.

