From the day D2L opened for business, I’ve seen us as a learning company first and a technology company second. Over the last twenty years, while the technological tools we use have evolved constantly, the values that guide how we apply them to learning have not changed a bit. But staying true to those values is an ongoing challenge – particularly with the almost daily advances we’re seeing in areas like data analytics and artificial intelligence.
What’s bringing this to mind for me is a recent conversation I had with John O’Brien. I have known John since 2002, and he’s built his career around finding ways for technology to make a positive difference in the lives of university and college students. Since 2015, John has been the president and CEO of EDUCAUSE, a non-profit association of over 2,000 colleges, universities and organizations in 45 countries, all united in the cause of advancing higher education through technological innovation.
John and I began by talking about ethics in digital innovation, and how it’s similar to the “writing across the curriculum” movement, which seeks to build effective writing into every course because writing is essential for everyone. As John sees it, the same is true for ethics: we need an “ethics across the curriculum” movement.
John told me about a recent survey showing that out of 13 leading institutions doing digital projects involving student data, only three had used any formal ethical framework. “Is there someone at your university or college whose full-time job is to worry about these things?” is a question John likes to put to people. He believes the answer needs to be a firm “yes,” ideally in the form of a Chief Privacy Officer. He also believes that universities and colleges should have an inventory of questions they ask vendors before they work with them.
As one of those vendors, I agree.
What John is telling us is that ethical considerations need to be front of mind at all times. He likes to cite a quote from data scientist Cathy O’Neil: “Algorithms are opinions embedded in code.” For example, if car insurance software is written to use both your driving record and your credit score as criteria to decide whether or not you can get insurance, is that a baked-in form of bias?
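To make O’Neil’s point concrete, here is a minimal, entirely hypothetical sketch of that car insurance example. The function name and the 600-point cutoff are my own inventions for illustration, not any real insurer’s logic; the point is simply where the opinion hides.

```python
# Hypothetical illustration of an "opinion embedded in code".
# The 600 credit-score cutoff is a policy choice, not a fact: whoever
# picks that number is deciding that creditworthiness is a valid proxy
# for driving risk. That decision is the baked-in opinion.

def eligible_for_insurance(at_fault_accidents: int, credit_score: int) -> bool:
    """Decide eligibility from a driving record and a credit score."""
    clean_driver = at_fault_accidents == 0
    good_credit = credit_score >= 600   # <-- the embedded opinion
    return clean_driver and good_credit

# Two applicants with identical (spotless) driving records get
# different answers purely because of their credit history:
print(eligible_for_insurance(at_fault_accidents=0, credit_score=720))  # True
print(eligible_for_insurance(at_fault_accidents=0, credit_score=580))  # False
```

Nothing in that code announces itself as biased; it reads like neutral business logic. That is exactly what makes O’Neil’s warning worth repeating.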
Or what about the issues raised by using AI to conduct job interviews? “If we use technology that gives people higher ranks if they smile a lot or gesture more than others – that’s concerning to me,” John says. It reminded us of the then-cutting-edge aptitude tests we took as high school students. John’s test recommended a career in coal mining. Mine said I was destined to be a mechanic.
So, these issues are hardly new. As John points out, in our analog past “how we graded students, how we determined what constitutes a course or a degree, they were algorithms too, right?” Bias has always been an issue – it’s just more pressing now than ever.
Despite his notes of caution, John says there is plenty of cause for optimism. For instance, MIT is raising $1 billion to create a new college that will fund 50 faculty positions around artificial intelligence – including interdisciplinary consideration of ethical implications. His dream is that, as we go forward, the curriculum will expand to the point where “everyone has some exposure to ethical thinking.”
Through his ongoing work as a thinker, speaker and leader at EDUCAUSE, John is helping make that future a reality. And he’s doing it with the same enthusiasm for technology’s role in learning that he has had since starting out with his first computer in 1984. “I’m just so excited about these new emerging technologies,” he says.
The trick, I think, is being every bit as ethical in your approach today as you will need to be tomorrow, so that you can keep every bit of technology ethical, too.