Teaching, learning and community in the context of AI

Thoughts, reflections and resources on using artificial intelligence from the Centre for Teaching and Learning.


We are all acutely aware that fundamental changes to traditional notions of teaching and learning are well underway across post-secondary education, driven by the increasingly pervasive applications of artificial intelligence (AI). Given the scale and speed of these advances, it is time to develop our thinking about how and when it is appropriate to employ generative AI: identifying the pedagogical value its tools and applications can deliver for teaching and learning while still respecting the principles of academic integrity, ethics, data security and privacy.

Two recent AI-focused, campus-wide initiatives within the U of A Centre for Teaching and Learning (CTL) have placed us at the forefront of this work: the annual Festival of Teaching and Learning, and the launch of a suite of web pages and resources on AI and Education.

Reflections on AI from this year’s Festival

In her closing Keynote Address at the 2023 Festival of Teaching and Learning (FoTL), U of A alumna and University of Waterloo Associate Professor Aimée Morrison reflected on the embodied, human elements of writing that cannot be replaced by artificial intelligence (AI). She observed that the “heated, existential debate” about the relationship between AI and writing is one that agitates “in the part of [her] brain where facts meet values, and ideas meet structures.” This tension between who we are, what we do and what guides our work prompts reflection as extended reality and AI become more ubiquitous and sophisticated. But leaning into this messy, uncomfortable terrain need not be an isolated or isolating experience. Rather, we can partner relationally and collaboratively in teaching and learning communities.

Following the Keynote Address at this year’s Festival, the Keynote Conversation Panel, featuring U of A’s Bishoi Aziz, Ali Shiri and Mandy Penney, expanded on Aimée's talk, highlighting ongoing conversations including:

  • critical literacies development
  • ethical challenges related to AI (e.g., data, surveillance, privacy, accessibility and labour considerations)
  • academic integrity
  • Indigenous and decolonial praxis

Taken together, the keynote and panel identified high-level questions of ethics, praxis and community that we, as the U of A teaching and learning community, will continue to unpack in the months and years ahead.

Aimée's talk and subsequent conversation panel brought forward the voices and experiences of educators and scholars working in the academy; meanwhile, the opening Student Keynote Conversation centred the experiences of our U of A students. This intentional session pairing models the kinds of multifaceted conversations we can facilitate across units, faculties and community demographics. In their generative conversation, Amarachi Onyegbula, Ehsan Misaghi, Janukan Sivajeyan, Jess Andreas, Lauren Engelking and Tanya Gozhora urged instructors to engage students in conversations about AI, whether or not AI is explicitly built into their courses. They emphasized that AI, including ChatGPT, is already being used by students who are looking for support and resources as they move through their degree programs and into their careers. For our U of A students, engagement, rather than avoidance, is critical to their ongoing learning and growth.

At a time when AI built on machine learning algorithms is increasingly prevalent, with all of its attendant challenges, questions and uncertainties, we, as a community, will need to lean heavily into relationship-based, equity- and access-oriented, decolonial approaches. Working together is essential for managing challenging conversations, establishing guidelines and supporting and uplifting each other.

Reflections on AI from the suite of web pages and resources

In partnership with the Provost’s Taskforce on Artificial Intelligence and the Learning Environment, CTL also set out to develop and share with our instructor community a suite of web resources on Teaching in the Context of AI. These pages explore strategies for thinking about AI use in the context of instructors' disciplines and particular courses, and for approaching the subject of generative AI with students, including how to discuss the options for and expectations around AI use in academic coursework. The twin sections AI and Academic Integrity and Statements of Expectations offer detailed guidance, including conversation starters and key points to consider as you and your students work, through open and honest conversation, to “co-create” both clear boundaries around AI use and a shared understanding of which AI tools are to be used and how.

This conversation may require some discussion of what it means to be AI literate and competent. Where AI use is allowed, it will also likely require a new kind of radical transparency and openness around academic expectations, one that encourages students (and instructors) to openly acknowledge and document when they are using these AI tools. One way to do this is to develop a shared set of structured expectations for academic work by clearly mapping, and metacognitively discussing, the extent to which AI tools can, will or won’t be used in a class, activity, assignment or assessment, guiding student use and non-use of AI tools in a good way. If generative AI is not to be used, it is crucial to deliberately (re)structure assessments so that inappropriate AI use is, or becomes, self-evident.

The subsequent Suggestions for Instructors section uses the Effective Teaching Framework to frame the discussion and encourages instructors to reassess their current teaching and assessment practices in light of AI's potential impact on traditional forms of assessment, such as the multiple choice exam, the research essay and the take-home exam. The pages explore how rewritten course learning outcomes might align with reconsidered assessments, and how authentic forms of assessment (assessments that mirror real-world and disciplinary situations, scenarios and tasks) may make AI-aided academic misconduct less likely. This portion of the AI webpages also outlines ways instructors can keep their courses engaging by emphasizing activities, assignments and assessments that foreground the particular kinds of human value that student engagement with the learning project can bring to their academic work and their lives.

This, of course, does not mean that any given task will forever remain outside AI’s capabilities. But it will always be important for instructors and students to feel that the course matters, that the learning matters and that there are significant pieces of learning that derive most of their meaning and impact from direct human engagement, from building human relationships and from the resulting human contributions. For example, because generative AI cannot critically review and validate its own output, it is essential that students, individually and together, be(come) able to recognize and take responsibility for the vital analytical work and critical (and researched) responses that academic work requires when such tools are used.

We hope that these materials and resources will help instructors and our teaching and learning community contemplate how they might integrate AI technologies and tools into their courses, where appropriate, to create quality student-centred learning that builds student capacities and serves diverse student needs while achieving targeted learning goals related to disciplinary and practical knowledge, skills and mindsets.

As a community — across the U of A and together with our peers and colleagues in post-secondary education locally, provincially, nationally, internationally and globally — we are still in the early days when it comes to recognizing best pedagogical practices for generative AI use. We are excited to continue leading and learning here with our AI experts and thought-leaders at the U of A, and we look forward to continuing to engage with our instructors about their and their students’ experiences with AI.


Brad Ambury

About Brad

Brad is the Lead Educational Developer, Assessment and Evaluation, at the Centre for Teaching and Learning. Brad’s principal areas of interest include finding impactful ways to better align curriculum outcomes with student-centred assessment practices; supporting the development of alternative assessments to enhance students' learning; and collaborating with faculty to weave equitable and inclusive assessment practices into the contexts of individual programs and courses. Brad also has over 15 years of experience as a lecturer at a number of post-secondary institutions, experience he feels vitally informs his perspectives on educational development.

Mandy Penney

About Mandy

Mandy Penney (she/her) is the Lead Educational Developer, Digital Pedagogies and Access at the Centre for Teaching and Learning. She is a queer settler and scholar and an experienced educator originally from Newfoundland. Holding degrees in both the sciences and the humanities (BSc, BA(Hon), and MA), she is passionate about digital pedagogies, writing instruction, accessibility, and communities of practice/care. She has worked as both faculty and academic staff (i.e., parafaculty), including as a coordinator of a writing and learning centre. Mandy advocates for equitable, values-driven, and relationship-based practices in teaching and learning: practices that can be approached through digital and writing-based community-building. She is an active member of the Canadian Writing Centres Association and the International Writing Centres Association, as well as a co-editor of a special conference edition of Discourse and Writing / Rédactologie. Mandy aims to collaborate with the University of Alberta community toward (re)imagining teaching and learning possibilities at this important and challenging global moment.