About Generative AI

Generative AI (GenAI) is a type of artificial intelligence that “generates” new content—such as text, images, music, video, or even code—based on patterns it has learned from existing data. Traditional AI primarily analyzes data and makes predictions; GenAI, by contrast, produces novel outputs in response to human prompts. This enables creative collaboration between users and AI, with human direction shaping the final AI-generated content.

Large language models (LLMs) trained on vast datasets covering diverse (multimodal) forms of human communication are at the core of GenAI tools. These models predict sequences of words or generate other content by recognizing patterns in the data. While GenAI can produce compelling results, its outputs may also reflect biases and limitations of the training data, leading to inaccuracies, oversimplifications, or harmful content.
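
To make the idea of “predicting sequences by recognizing patterns” concrete, here is a deliberately tiny, hypothetical sketch in Python. It counts which word follows which in a toy corpus and predicts the most frequent continuation. Real LLMs learn these patterns with neural networks trained on enormous datasets rather than frequency tables, but the underlying principle of predicting the next token from learned patterns is the same.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the vast training data a real LLM would see.
corpus = (
    "students ask questions and instructors answer questions "
    "students write essays and instructors review essays"
).split()

# Count which word follows which (simple bigram statistics).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the continuation seen most often after `word` in the toy corpus."""
    candidates = next_word_counts.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("instructors"))  # prints "answer" for this toy corpus
```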

GenAI excels at reconfiguring existing knowledge, simplifying complex topics, and producing draft content. This makes it useful for tasks like summarizing information, generating computer code, translating languages, and writing in different styles. However, its outputs are based on statistical patterns and do not include genuine understanding, critical awareness, reasoning, or insight.

GenAI presents opportunities for rapid content creation, personalized feedback, and creative assignments, but it requires your careful oversight to ensure accuracy and accountability. As AI becomes increasingly integrated into educational practices, it is important to guide students in critically evaluating AI outputs and using these tools responsibly.

Getting started

The GenAI tool landscape is evolving rapidly. As these technologies continue to develop, their impact on higher education will likely expand and shift in both anticipated and unexpected ways. As you integrate GenAI into your teaching practice:

  • Start small, and experiment with one or two tools that align with specific teaching goals
  • Leverage GenAI to create student learning experiences that align with course outcomes
  • Work to implement GenAI through ethical, transparent, and well-documented practices
  • Share experiences and best practices with colleagues and CTL
  • Stay informed about institutional policies regarding GenAI use
  • Regularly review and update your Statements of Expectations for AI for your courses
  • Focus on developing students' critical thinking, evaluation, and problem-solving skills alongside AI tool use
  • Familiarize yourself with GenAI providers’ documentation
  • Review privacy and data security policies, as well as guides for educators that explain the tools’ capabilities, limitations, and best practices
  • Encourage students to review information that guides them in the appropriate use of GenAI, covering topics such as properly citing AI assistance, understanding potential errors or biases, and ethical use

Teaching With Generative AI

This overview highlights seven key categories of generative AI tools, outlining their capabilities, potential educational applications, and critical considerations for classroom use. Because these technologies evolve rapidly, this summary should be viewed as provisional and will be updated regularly. Each section aims to help you evaluate how these tools can enhance teaching while ensuring responsible use, academic integrity, and high educational standards.

Content Creation Case Study: Listen to a GenAI-created podcast discussing this website

Google’s NotebookLM allows users to upload source documents and engage in focused discussions about their contents, complete with clear citations and references to original materials.

The podcast below is a multimodal use case featuring two virtual podcasters discussing the content of this webpage.

Image: A felt-textured, three-dimensional scene featuring no text.

Addressing the AI Elephant in the Room. Image concept: B. Ambury and ChatGPT (DALL-E 3).

Podcast Summary of About GenAI (Listen to the audio)
The ‘hosts’ discuss the contents of the ‘About GenAI’ section of CTL’s Teaching in the Context of AI web resources. They explain how GenAI is a type of artificial intelligence that can create new content. They explore categories such as text generation, image generation, video creation, audio/speech production, multimodal content, research tools, and coding and programming. The discussion highlights GenAI’s potential benefits and challenges in teaching and learning contexts, emphasizing the importance of guiding students to use these tools responsibly while maintaining academic integrity and promoting critical thinking.

GenAI Tool: Google’s NotebookLM

Text Generation AI

Tools: ChatGPT, Google Gemini, Claude, Microsoft Copilot, Meta Llama

Capabilities: These tools excel at generating and understanding human-like text across languages with strong contextual awareness. They handle diverse tasks, from creative writing and technical documentation to complex reasoning, code generation, mathematical computations, and step-by-step problem solving, all while maintaining context over extended interactions.
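
Most instructors and students will use these tools through their chat interfaces, but the same capability is available programmatically. The sketch below is written against the OpenAI Python SDK under stated assumptions (the `openai` package is installed, an API key is configured, and the model name is a placeholder); it shows how a draft rubric might be requested for later human review.

```python
# Minimal sketch: request a draft rubric from a text-generation model via the OpenAI Python SDK.
# Assumptions: the `openai` package is installed, OPENAI_API_KEY is set, and the model name
# below is a placeholder to substitute with whatever your licence provides.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You help an instructor draft course materials."},
        {"role": "user", "content": "Draft a four-criterion grading rubric for a 500-word reflective essay."},
    ],
)

print(response.choices[0].message.content)  # a draft to review and edit, not a finished rubric
```

Treat the output as a starting point: review, edit, and align it with your course outcomes before use.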

Example use cases

  • Instructors: Lesson planning, assignment creation, rubric development, feedback generation
  • Students: Writing assistance, ideation, concept explanation, study guide creation, paper drafting, co-creation
  • Instructors and students: Language learning, translation, brainstorming, text analysis

Key considerations

  • Provide clear guidelines for students on appropriate GenAI use in assessments and assignments
  • Ensure proper citation and attribution where applicable
  • Establish and communicate expectations for responsible, ethical use to uphold academic integrity

Image Generation AI

Tools: DALL-E 3, Midjourney, Stable Diffusion, Adobe Firefly

Capabilities: These tools transform text descriptions into ‘original’ images with precision. They can generate, edit, and modify images while maintaining consistent styles and incorporating specific artistic elements. Advanced models excel at creating technical illustrations, diagrams, and designs. Recent developments have improved their handling of text within images.
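
As a concrete illustration, the hedged sketch below uses the OpenAI Python SDK’s image endpoint; the model name, prompt, and size are placeholder assumptions, and generated images should always be reviewed for accuracy before classroom use.

```python
# Minimal sketch: generate a course illustration from a text prompt via the OpenAI Python SDK.
# Assumptions: the `openai` package is installed and an API key is configured; the model,
# prompt, and size below are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.images.generate(
    model="dall-e-3",
    prompt="A labelled, textbook-style illustration of the water cycle in a flat graphic style",
    size="1024x1024",
    n=1,
)

print(response.data[0].url)  # temporary URL of the generated image; download and review before use
```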

Example use cases

  • Instructors: Create visualizations for theoretical concepts, develop visual assets for course materials, generate step-by-step visual guides for procedures
  • Students: Visualize project concepts and designs, create presentation graphics, produce visual aids for assignments/presentations
  • Instructors and students: Design infographics, create data visualizations, develop accessible visual explanations, generate concept mockups

Key considerations

  • Verify the quality and accuracy of generated images
  • Design assessments that balance GenAI use with required traditional visual skills
  • Ensure AI tools enhance rather than replace core visual learning outcomes
  • Address potential copyright/IP issues with reference images

Video Generation/Editing AI

Tools: Runway, Synthesia, D-ID, HeyGen

Capabilities: These tools use AI to create, edit, and modify videos, generating talking heads, animated avatars, and realistic speech movements. They can change backgrounds, apply special effects, and produce short videos from text. While still evolving, they simplify video production and editing tasks that once required expert skills.

Example use cases

  • Instructors: Create instructional videos, lecture summaries, and course introductions
  • Students: Project presentations, video assignments, demonstrations
  • Instructors and students: Convert text content to video format, create educational animations

Key considerations

  • Maintain transparency about AI-generated content
  • Ensure accessibility standards are met

Audio/Speech/Music AI

Tools: ElevenLabs, Whisper, Murf.ai, Play.ht, Suno, etc.

Capabilities: These tools handle both speech recognition and speech synthesis, transcribing audio with high accuracy and converting text to natural-sounding speech. Features include voice cloning, emotion control, noise filtering, and multilingual support. Music-generating AI creates original compositions, mimics styles, and produces refined tracks with advanced editing tools.
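
As one concrete example, the open-source Whisper package can transcribe a recorded lecture locally. The sketch below assumes the `openai-whisper` package and ffmpeg are installed, and "lecture.mp3" is a placeholder file name; transcripts should still be reviewed for accuracy before being shared with students.

```python
# Minimal sketch: transcribe a recorded lecture locally with the open-source Whisper package.
# Assumptions: `openai-whisper` and ffmpeg are installed; "lecture.mp3" is a placeholder.
import whisper

model = whisper.load_model("base")        # smaller models are faster, larger models more accurate
result = model.transcribe("lecture.mp3")  # returns the full text plus timestamped segments

print(result["text"][:500])               # preview the transcript before manual review
```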

Example use cases

  • Instructors: Create lecture audio, generate transcripts, voice-over for presentations
  • Students: Text-to-speech for accessibility, record presentations, language practice
  • Instructors and students: Content narration, podcast creation, audio feedback

Key considerations

  • Verify transcription accuracy
  • Address privacy and other ethical concerns with voice technology

Multimodal AI

Tools: GPT-4V, Claude 3, Google Gemini, LLaVA

Capabilities: Multimodal AI systems uniquely combine the ability to process and understand multiple input types, including text, images, and, in some cases, video. These advanced systems can analyze complex visual information while maintaining a dialogue, interpret technical diagrams and charts, and provide detailed explanations of visual content. They excel at tasks requiring the integration of visual and textual understanding, such as analyzing scientific figures or interpreting complex technical documentation.
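
The sketch below illustrates this combined text-and-image analysis using the OpenAI Python SDK’s chat interface; the model name and image URL are placeholder assumptions, and the returned explanation should be checked against the source material.

```python
# Minimal sketch: ask a vision-capable model to explain a diagram via the OpenAI Python SDK.
# Assumptions: the `openai` package is installed, an API key is configured, and the model
# name and image URL below are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder vision-capable model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Explain what this diagram shows, step by step."},
                {"type": "image_url", "image_url": {"url": "https://example.com/diagram.png"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)  # verify the interpretation against the original figure
```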

Example use cases

  • Instructors: Analyze student visual work, create multimedia content
  • Students: Understand complex diagrams, visual problem-solving
  • Instructors and students: Combine text and image analysis, interpret technical diagrams

Key considerations

  • Understand limitations in analyzing technical content
  • Verify the accuracy of interpretation

Research and Academic Tools

Tools: Elicit, Perplexity, Consensus, Semantic Scholar, Research Rabbit

Capabilities: These tools streamline academic research by analyzing and synthesizing scholarly literature. They extract key findings, identify trends, map citations, and highlight research gaps. Advanced features include methodology analysis and summarizing complex papers across disciplines.

Example use cases

  • Instructors: Literature review, curriculum development, research guidance
  • Students: Paper discovery, research planning, literature synthesis
  • Instructors and students: Find relevant sources, summarize research papers

Key considerations

  • Verify (responsibly evaluate and validate) AI-generated summaries and citations
  • Teach proper research methodology alongside AI tool use

Coding and Programming AI

Tools: GitHub Copilot, Amazon CodeWhisperer, Replit Ghostwriter, Anthropic’s Claude, and OpenAI’s Codex (and ChatGPT)

Capabilities: This category of tools acts as intelligent coding assistants, providing code completion, bug detection, and solution suggestions. They generate code from natural language, explain complex segments, and assist with documentation. Advanced features include pattern recognition, automated testing, and maintaining consistent coding styles.
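
The hypothetical Python exchange below illustrates the workflow these tools support: a natural-language request, the kind of implementation an assistant might suggest, and the quick checks users should still write and run themselves, since generated code can look plausible yet be wrong.

```python
# Hypothetical illustration of AI-assisted coding: the comment plays the role of the
# natural-language prompt, and the function is the kind of completion an assistant might suggest.

# Prompt: "Write a function that returns the median of a list of numbers."
def median(values):
    """Return the median of a non-empty list of numbers."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Verify the suggestion rather than trusting it.
assert median([3, 1, 2]) == 2
assert median([4, 1, 2, 3]) == 2.5
print("checks passed")
```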

Example use cases

  • Instructors: Create coding examples, debug student code, generate practice problems
  • Students: Code completion, debugging assistance, learning complex programming concepts
  • Instructors and students: Code documentation, refactoring, problem-solving

Key considerations

  • Ensure students understand underlying programming concepts
  • Develop guidelines for appropriate use in programming assignments

Cautions and Ethical Considerations

GenAI technologies offer powerful opportunities to enhance teaching and learning, but effective integration requires balancing these potential benefits against ethical, social, and pedagogical implications. By carefully considering these implications when designing courses and assessments, you can create quality student learning experiences that are both innovative and responsible.

Fabrication (accuracy and reliability)

GenAI models can generate content that appears plausible or factual but may be entirely fabricated or inaccurate. This phenomenon, known as "hallucination," threatens academic integrity. It occurs because these models prioritize pattern recognition and plausibility over factual accuracy.

Teaching implications

When incorporating GenAI in coursework, consider the risk of fabricated content in different types of academic work. AI-generated content might be suitable for preliminary drafts or brainstorming exercises (low-stakes activities), but stricter human oversight is required for high-stakes work such as research papers, lab reports, or other assignments where factual accuracy is crucial. Students may unknowingly include fabrications in their work, especially when using AI to summarize or explain complex topics. These errors are challenging to detect because AI outputs often seem plausible and may slip through traditional fact-checking methods.

Strategies

  • Foster critical thinking and evaluation skills specific to validating the authenticity of AI-generated content
  • Use the GenAI Tool Use: Responsibility Statement sample form to encourage and guide the responsible use of these technologies with your students

Bias and discrimination

The large datasets on which LLMs are trained can contain biases, discrimination, and other harmful content. Biases may also be found in the algorithms that govern an LLM’s ability to learn and drive its output generation. GenAI models can perpetuate and amplify societal biases, potentially reinforcing discriminatory perspectives and stereotypes.

Teaching implications

Biases in AI systems create significant challenges across all academic disciplines. These biases and harmful content become particularly concerning when students treat AI-generated content as authoritative rather than viewing AI output as a starting point for critical analysis. Instructors must help students develop the skills to recognize and challenge these biases while ensuring that course materials and assignments actively counter potentially discriminatory AI-generated perspectives.

Strategies

  • Stress the importance of human judgment and critical evaluation when students work with GenAI
  • Use the GenAI Tool Use: Responsibility Statement template with students to encourage and guide responsible and ethical use

Data privacy and security

Using GenAI tools and applications raises important privacy and data security concerns. These tools process substantial amounts of user data, including data from students and instructors, creating specific concerns about student (and instructor) privacy and institutional data governance. This data can be used to improve AI platforms but may pose risks if exploited or compromised. Protecting student data from breaches and misuse is paramount.

Teaching implications

The data collection practices of GenAI tools create complex considerations for course design and assessment task requirements. Students who submit their work to these tools may unknowingly contribute their intellectual property to future model training. This becomes particularly concerning when assignments involve personal reflections, sensitive information, copyrighted material, or proprietary content. Students may also share personal or confidential information in their prompts without realizing the potential privacy risks.

Institutional policies regarding student data protection may limit or prohibit using certain AI tools, requiring instructors to carefully evaluate which platforms align with their institution's privacy requirements. These considerations become especially important in courses dealing with sensitive topics or working with vulnerable student populations.

Strategies

  • When assessments involve third-party GenAI tools and applications that are not institutionally supported, prepare alternative assignments and assessments for students who decline to use those tools because of reasonable concerns about data privacy and security

Equity and access

While many GenAI tools are currently free, subscription-based models and technical requirements create barriers for some students. Access issues might prevent some students from using certain tools, creating an unequal learning environment that worsens existing inequalities. This disparity may necessitate targeted interventions to ensure fair and equitable access to these potentially important tools. For instance, GenAI capabilities may soon be integrated into University-provided learning environments, applications, and other technological supports.

Teaching implications

The shift towards AI-integrated and enhanced education must account for varying student access to these technologies. Premium features in AI tools, which might provide more sophisticated or reliable outputs, may be available only to students who can afford subscription fees. Technical requirements for running advanced AI tools could exclude students using older devices or those with limited internet access.

Language learners or international students might face additional challenges when using AI tools primarily optimized for English speakers. These disparities can significantly impact learning outcomes if course design assumes equal access to AI capabilities.

Strategies

  • Select GenAI tools accessible to all students, avoiding tasks and tools that might favour those with more financial means
  • Remember that not all AI tools are designed with student accessibility in mind
  • Prepare assignment and assessment alternatives for students who cannot use a specific tool because of an access issue or cost concern that puts them at a disadvantage to other students

Authorship, copyright, intellectual property (IP)

AI-generated content challenges traditional concepts of authorship, copyright, and IP. As AI systems become increasingly sophisticated in generating text, images, music, and even code, traditional notions of ownership and originality are challenged. AI-generated content, a hybrid combination of human inputs (prompts) and GenAI outputs, blurs the lines of authorship.

This raises questions about responsibility and accountability for the created material. Some experts believe that traditional definitions of plagiarism may need re-evaluation in light of GenAI's capabilities (Eaton, 2023). Since most GenAI tools learn from vast datasets of existing content scraped from the internet, others argue that using copyrighted content without permission to train AI systems leads to potential copyright infringement.

Teaching implications

In some contexts, these issues will require careful consideration to ensure that intellectual property rights are protected and that the responsible use of AI aligns with ethical, legal, and academic integrity principles.

The collaborative nature of AI interaction – where students might refine and iterate their work through AI feedback – requires new frameworks for understanding original work and proper attribution. This is particularly complex in disciplines where AI can generate substantial portions of content, such as computer programming or creative writing.

Strategies

  • Place emphasis on the importance of responsible GenAI use and transparent attribution of GenAI use
  • Be aware of and help students understand possible copyright and intellectual property implications, especially when using AI-generated content in creative and/or academic work
  • Educate students to be ready to indicate when AI tools have been used and how they contributed to the final product
  • Use the GenAI Use: Acknowledgment and Reflection template to approach and guide this important expectation for academic work involving GenAI tools

Information quality and critical thinking (misinformation/disinformation)

Misinformation and disinformation are major concerns in the context of GenAI and increasingly sophisticated social media because these technologies can be used to create and amplify the spread of realistic but inaccurate information. The distinction between misinformation and disinformation hinges on intent.

While misinformation (in the form of fabricated output) might be spread unintentionally, disinformation is a calculated effort to mislead. Still, it can be difficult to determine whether the content was created with the intent to deceive, making it even more difficult for individuals to separate fact from fiction.

Teaching implications

The ability of GenAI to produce convincing but potentially inaccurate information requires a fundamental shift in how we teach information and digital literacy. Students must learn to navigate an information landscape where traditional markers of credibility may no longer hold.

This challenge extends beyond simple fact-checking to understanding AI-generated content's limitations and potential biases. In research-heavy courses, students need sophisticated strategies for verifying AI-generated information against reliable sources. The rapid evolution of AI capabilities means that information literacy skills must continually evolve to address new challenges in content evaluation and validation.

Strategies

Other ethical considerations

GenAI development raises concerns about labour practices and environmental impact.

Human costs
Concerns have been raised about the working conditions and compensation of individuals involved in data labelling and content moderation, tasks crucial for training GenAI models. These jobs, often subcontracted to firms operating in the Global South, involve repetitive and mentally taxing work, and workers may face low wages and precarious employment. Individuals tasked with moderating content for GenAI models have been exposed to disturbing or harmful material, which can negatively impact their mental health. The emotional toll of reviewing such content requires attention and support for the well-being of these workers.

Environmental impacts
Developing and using GenAI requires significant computational power, leading to high energy consumption and carbon emissions. Training LLMs involves processing vast amounts of data, demanding substantial energy and resources. This energy-intensive process contributes to the growing carbon footprint of AI technologies. The data centres that house the infrastructure for AI development and operations also consume large amounts of water for cooling. As AI usage increases and the technology and its infrastructure expand, the demand for data centres is expected to grow, further intensifying water usage and potentially straining water resources in certain regions worldwide.

Teaching implications

The broader ethical implications of AI development create meaningful teaching opportunities while raising practical challenges for course design. Students (and instructors) may raise valid concerns about the labour practices involved in AI development or the environmental impact of LLMs. These objections require instructors to balance the educational benefits of AI tools against ethical considerations while respecting students' principled positions.

This dynamic creates valuable opportunities for discussing professional ethics within specific disciplines and for helping students develop frameworks for ethical decision-making in their future careers. At the same time, it requires careful consideration of how to fairly accommodate students (or instructors) who choose not to use AI tools while maintaining consistent learning outcomes across the course.

Addressing these challenges in the context of GenAI and teaching and learning will require a multifaceted approach involving dialogue among multiple stakeholder groups.

Strategies to foster dialogue and inclusivity

  • Facilitate ethical discussions about the human and environmental costs of GenAI use during relevant course topics. Promote critical thinking and ethical decision-making within the relevant discipline
  • Offer alternatives: provide non-AI options for coursework and assessment tasks to accommodate students’ ethical concerns while ensuring equitable outcomes
  • Model responsible GenAI use: demonstrate ethical use in teaching, balancing its benefits with its broader impacts. Encourage students to evaluate the tools they use critically

Resources

Suggested supplementary resources for instructors:
Anthropic (Claude) user guides, Prompt design and much more

Educator’s FAQ, OpenAI resource discussing academic integrity and assignment design

Generative AI Product Tracker (Ithaka S+R). Generative AI products marketed towards postsecondary instructors or students

Privacy and Data Considerations GenAI Quickstart: Foundations for Faculty (McGill University) Part nine in a series of online modules exploring GenAI and teaching and learning

Tips for Using GenAI GenAI Quickstart: Foundations for Faculty (McGill University) Part three in a series of online modules exploring GenAI and teaching and learning

Types of GenAI Tools GenAI Quickstart: Foundations for Faculty (McGill University) Part two in a series of online modules exploring GenAI and teaching and learning

What is GenAI? GenAI Quickstart: Foundations for Faculty (McGill University). Part one in a series of online modules exploring GenAI and teaching and learning

Resources for instructors (and students):
GenAI Use: Acknowledgement and Reflection, Student AI use process documentation

GenAI Tool Use: Responsibility Statement, Student AI use declaration

Resources for students:
Evaluating AI Content, University of Alberta Library

Google Learning Center, Using Google Gemini Chat to support learning

Student Guide to Writing with ChatGPT, Academic integrity and responsible, ethical, and transparent student use

Using Generative AI Guide, University of Alberta Library