Research Project Explores Artificial Intelligence and Disinformation in Art + Design Higher Education  

(Clockwise from top L): Members of the AI + Disinformation research team Joshita Nagaraj, Asad Aftab, Ki Wight, Mia Portelance, Abhishek Singh Bais and Leah Burns.

A nine-month project and ongoing “learning community” group form part of a broader SSHRC-funded research initiative led by Dr. Ki Wight to explore the impact of AI on creative higher education. 

A new project led by researcher and Emily Carr University of Art + Design (ECU) faculty member Dr. Ki Wight engages ECU students and faculty to explore ethical frameworks for understanding disinformation and generative artificial intelligence (GenAI) in the context of art and design education. 

The nine-month AI & Disinformation in Art & Design Higher Education initiative forms part of a broader, two-and-a-half-year project, funded by a Social Sciences and Humanities Research Council (SSHRC) Insight Development Grant. 

“Misinformation and disinformation are not often connected to art and design education in existing literature. Yet these topics are obviously linked as our students are being educated to participate in contemporary creative industries impacted by AI and participating in AI technologies,” Ki says. “Our idea was to fill that void and establish a baseline understanding of practices, opinions, research and potential gaps to help inform things like policy development, teaching practice innovations and curriculum development, and find ways to address both AI and disinformation in a proactive way.” 

The AI & Disinformation project research team included ECU faculty members Dr. Leah Burns, Michelle Ng and Dr. Sara Osenton as well as Master of Design students Asad Aftab (MDes 2025), Abhishek Singh Bais (MDes 2026), Joshita Nagaraj (MDes 2025) and Mia Portelance (MDes 2026).   

Team members presented findings from the AI & Disinformation project and the broader SSHRC-funded research initiative at the Teaching + Learning Centre’s (TLC) Practice & Pedagogy Symposium at ECU in May 2025. Ki and Leah also presented the team’s research at the annual academic symposium for the Canadian Society for the Study of Education in Toronto in June 2025. 

 
A FLUID RESPONSE TO EVOLVING TECHNOLOGY 

Through a review of more than 550 sources, the team identified broad concerns that GenAI is unavoidable in education; that its increasing ubiquity could lead to a deskilling of creative practitioners while also threatening their intellectual property through commodification and appropriation; that entrenched biases and inaccuracies are propagated by AI tools, thus reinforcing misinformation and distorting public understanding of creative work; and that institutional policy often moves too slowly to adequately respond to the tools’ lightning-fast evolution. 

To address these findings in a meaningful way at ECU, a forthcoming “GenAI Learning Community” will bring community members together to deepen insight into the technology’s impacts within ECU’s classrooms.  

Comprising ECU educators from across disciplines and specialties and operating through the TLC, the GenAI Learning Community will provide a nimble framework for working in collaboration on critical, ethical responses to AI’s rapidly evolving impacts on creative education and beyond. 

“There are so many brilliant people doing such incredible things on this topic within ECU’s classrooms and beyond,” Ki says. “Our hope is to bring together all these incredible faculty leaders to keep growing this knowledge base and support fluid, on-the-ground responses to the evolution of the technology.”  

Ki Wight. (Photo courtesy Ki Wight)

 
A LITERACY BUILT ON ETHICS 

Ki says institutions and educators must be crystal-clear about their values to understand how broader guidance around ethical use of AI might apply within their specific context. 

For instance, studies and experts often note that greater literacy is needed with regard to both disinformation and AI. But what exactly this means isn’t often articulated. 

“The things that underlie literacy are agency, equity, community and care — human care, which might include care for the environment or social justice or many other things,” Ki says.  

“When we use phrases like ‘human-centred’ or ‘centring agency’ or ‘centring collaboration and community,’ that means centring our ethics. We build literacy on that foundation by investigating the tool through the lens of our ethics. We don’t just play with it because it’s new and it’s there. This isn’t the place for carefree play.” 

This approach is essential as the team’s research also shows how AI can mean very different things to different disciplines. For example, many illustrators view AI as an existential threat due to its potential to replace paid, human work. Meanwhile, in animation and filmmaking, AI-driven tools have been in use for a decade or more. 

By gathering context-specific knowledge in real time, the GenAI Learning Community will aim to create nuanced pedagogical frameworks that remain sensitive to students’ diverse needs and concerns and can be rapidly mobilized within classrooms. Ki says the learning community aims to demystify AI and develop hands-on learning opportunities to critically examine its inner workings, limitations, risks and potential for ethical co-creation. 

 
LEADING THE CONVERSATION 

Ki hopes to eventually invite members of industry groups such as DigiBC as guest speakers and resources for the learning community, work that has already begun with ECU’s industry liaison, Alan Goldman. She and her research colleagues will also be holding a public event with media educators in the new year and are working on a publication to share findings and advance the conversation. 

“AI is impacting our world so quickly and profoundly, but as creative practitioners we have a chance to really lead some of these conversations, even at the level of arts councils, governance bodies and federal agencies — those places in Canada where policies are created which can either help or hinder our field,” Ki says.  

“My hope is that by doing this type of small-scale testing and work with faculty, we can find a way to share and grow the insights we develop and the solutions we create.” 


By: Perrin Grauer