Artificial Intelligence and Inclusion: Striking the Right Balance 

John Baker

At D2L, we embrace the responsible use of Artificial Intelligence (AI). For over a decade, our company has built AI tools within our products to help instructors save time so that they can focus on what really matters: improving outcomes for their learners. 

For instance, last year we announced that, thanks to the hard work of D2Lers and supported by AI, all video uploaded into Brightspace is supported with automatically generated closed captions in multiple languages – for no extra charge. This was a significant step in removing the top remaining accessibility barrier for many learners, and it helps D2L deliver on our longstanding mission to enable equitable, accessible learning for all learners.

For us, accessibility is not an afterthought; it’s top of mind. At D2L, we follow a purposeful inclusive design practice and are the only learning platform recognized by the National Federation of the Blind through our inclusion in its SNAP program.

I recently had an opportunity to share some of my thinking on the use of generative AI in education with a wide range of leaders in online teaching, learning, and administration at the 2023 DT&L and SOLA+R Conference organized by the University Professional and Continuing Education Association (UPCEA) and the University of Wisconsin-Madison (UW-Madison). I had the honour of joining Kelly Herman, VP of Accessibility, Equity & Inclusion at the University of Phoenix, in a panel discussion facilitated by D2L’s very own Global Accessibility Lead, Dr. Sambhavi Chandrashekar.  

Our conversation, which I’m proud to say received a 5-star rating from attendees, centered on AI and its implications for inclusive pedagogy. Here are some of my main takeaways from this important discussion: 

A framework for AI-infused inclusive education 

As we increasingly produce and use AI tools, we must ensure that every learner and instructor can use them. In our panel, Dr. Chandrashekar offered a three-pillar framework that we can use to anchor our thinking when considering accessibility and ethics in the context of emerging AI tools:  

  • Commitment to sustaining technology accessibility  
  • Creativity in course design and delivery   
  • Community to collectively work for the common good 

It’s important to consider both the benefits and potential barriers of these tools when implementing them in course design. Just as important is the ability to be creative and innovative in adapting the tools if necessary once they’re in place. If some students experience constraints as they use AI, we must be ready to change things. After all, change is a big part of any commitment to sustainability. 

It’s also why communities are so important. We need collective support to successfully negotiate large-scale transformations. The UPCEA panel was exactly that: an opportunity to engage with a community of experts to help figure out the practice, to build evaluative judgment, and to leverage these technologies in new ways together. 

The importance of keeping a human in the loop 

Sometimes we forget that technology is a creation of humans. We are creators, molding technology. When it comes to accessibility, it’s very clear a human touch is needed. It’s not enough to just launch a new technology; we must constantly shape it to better meet the needs of all who use it, so it helps us knock down barriers as we build a more equitable world.  

That’s why earlier this summer, we posted D2L’s AI Guidelines. We strive to design, develop, and use AI systems that are unbiased and fair so we can continue advancing our accessibility-by-design mission and continue to help our customers reach their accessibility goals.   

These guidelines are already informing our product development. At our annual Fusion conference in July this year, we introduced a handful of D2L’s latest generative AI-powered technologies: 

  • A generative AI feature that can assist educators with authoring and generating suggested practice questions for formative assessment. This will help students self-evaluate and improve before taking a final exam.  
  • A new Brightspace Virtual Assistant, which uses AI to bring users contextualized help and documentation from within Brightspace.  
  • And a new partnership with Copyleaks, a leading AI-based text analysis, plagiarism identification, and AI-content detection platform.  

These are just a few examples of how these technologies can provide better feedback and better coaching, and ultimately build better, more accessible human connections in education. 

Skiing through the AI glades 

Whenever I go glade skiing, one of my favourite sports, I don’t focus on the trees; I focus on the spaces between the trees. Otherwise, I’m likely to wind up in a tree. Likewise, while managing the risks of AI, I believe we must focus on the opportunity it presents, while still being mindful of its challenges, because, without a doubt, this technology is a game changer. It’s not here to replace us. It is here to elevate and scale the meaningful work we do.  

To learn more about what D2L is doing to enhance accessibility through technology, see our dedicated accessibility page.
