
Safeguarding an ethical future for AI

15 March 2024

Digital literacy experts are advocating for a change in the way we engage with and teach Artificial Intelligence (AI), to set students up for success in the future.


Te Whare Wānanga o Waitaha | University of Canterbury (UC) Associate Professor of Digital Education Futures Kathryn MacCallum and colleagues have identified the critical components of an AI literacy curriculum that can be adopted by teachers of students at any level. These components are outlined in the initial findings of a three-phase Delphi study.

Their initial findings suggest that an AI literacy curriculum must include an understanding of AI concepts, learning about the applications of AI, and developing the necessary technical skills, along with an understanding of the issues, challenges and opportunities that AI brings, including ethical considerations.

“The concept of AI literacy has become increasingly prominent in recent years, and due to the increasing pervasiveness of AI technologies in every part of society, everyone must become AI literate,” says Associate Professor MacCallum.

She says fostering an understanding of AI at a young age is becoming more critical, but it is also something that should be infused across all tertiary programmes.

“We need a future-focused curriculum to support our students to live in a digital society, and for that they need to understand how AI is developed, its diversity and its influences on us. AI literacy needs to sit alongside other digital literacies, where we support students to be more than just technology users; we want them to be the creators of these future technologies.”

She says her framework differs from others in that it explores the different levels of AI literacy, moving from informed users to developers and designers of AI systems. The intention of the framework is not to tie this literacy to a specific age, but to start from the framing that all students need a basic knowledge of AI to be aware users of it. So, even at a basic level, technical understanding is critical.

“AI is embedded into most systems we engage with today,” Associate Professor MacCallum says. “For example, AI technologies are embedded in social media and search engines and therefore influence what we see, engage with, and even what we listen to.

“We often saw ethics and social issues disconnected from the teaching of how AI works. In this framework, ethics is a critical part, but it also comes from understanding how AI systems are developed, so we can see the implications this has on us,” she says.

Associate Professor MacCallum says one outcome of the study is its framing around Aotearoa New Zealand’s unique bicultural focus, which is at the core of the research and an important lens often missing from other frameworks.

“Having a bicultural and ethical lens infused into the framework will hopefully support students to be more aware of the influence they have as future creators.”

Associate Professor MacCallum’s work in influencing the way AI literacy is taught goes beyond informing frameworks. She is also involved in a pan-university project exploring policies and approaches to GenAI in teaching and learning, and how these can be applied at UC, including creating new courses and informing the academic integrity module. This work also links to her broader research on embedding digital literacies and computational thinking across the schooling curriculum.

 


Media contact
 
  • Email
  • Phone: (03) 369 3631 or 027 503 0168