Challenge
Professor Robert Collier, from the School of Computer Science at Carleton University in Ottawa, Ontario, explains we have entered a new era of accessible artificial intelligence (AI) in education. Educators are both enthused and concerned, recognizing there is no single definitive approach to integrating AI or predicting its long-term impact. His recommendation is to incorporate AI in ways that are ethical, legitimate and effective, while leveraging AI to improve student learning outcomes.
His introductory courses in the Computer Science program highlight rationales, best practices and skills for proficient AI use. This ensures students accumulate the knowledge, experience and confidence to work effectively with AI as it evolves, both during their degree programs and throughout their careers.
Experimentation
Dr. Collier and Connor Hillen, a Computer Science lecturer, were awarded a Future Learning Innovations Fellowship from Carleton University. Together they developed a required first-year course in the School of Computer Science designed to give students a solid foundational understanding of AI. They collaborated with Carleton’s Teaching and Learning Services, which supported the course’s integration into the learning management system (LMS), facilitated ethical approvals and assisted with other pedagogical and administrative aspects.
The course teaches students how to structure their engagement with AI so that they remain an integral and continuous part of the coding process – going beyond merely providing initial direction and accepting first-run results. Professor Collier likens AI to an intern: capable of assisting with tasks but not ready to lead a project. Students must maintain control, understand the process thoroughly, learn to formulate effective AI queries, critically assess AI-generated results and instruct the AI to refine those results appropriately.
To facilitate this learning, the course requires that students understand and master the essential steps in both manual and AI-assisted coding, submitting their development steps as well as the final products for assessment. This restricts their use of AI in the design stage of an assignment, since ChatGPT cannot produce the evidence of this preparatory work.
One major concern among Computer Science students is the fear that AI will replace them, particularly in coding roles. Professor Collier counters this by emphasizing that AI serves as an assistant, not a substitute. The course is structured to promote active engagement with AI, while reinforcing the essential role of human oversight.
Introduction to Computer Science 1 is built around three foundational areas of learning:
- Algorithm design – Analyzing a problem and developing a structured approach to solving it.
- Computational thinking – Understanding how computer code operates.
- Translation – Converting algorithm design into functional code (a small illustration follows this list).
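For instance, the translation step might look like the following minimal sketch (an illustration only, not a course assignment): a plain-language algorithm for finding the largest number in a list is turned into working Python code.

```python
# Algorithm design (plain language):
#   1. Assume the first number is the largest seen so far.
#   2. Look at each remaining number; if it is bigger, remember it instead.
#   3. After checking every number, report the largest one found.

def largest(numbers):
    """Return the largest value in a non-empty list of numbers."""
    biggest = numbers[0]            # step 1
    for value in numbers[1:]:       # step 2
        if value > biggest:
            biggest = value
    return biggest                  # step 3

print(largest([3, 17, 8, 42, 5]))  # prints 42
```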
Rather than beginning with AI instruction and AI-aided assignments, the course first requires students to submit their manually generated preparatory, or scaffolding, work for assessment. These tasks are completed without AI assistance. For example, students fill in tables by hand and prepare test cases describing what their program should accomplish. Students typically perform these tasks as part of the design phase of any assignment, although novice students sometimes believe that skipping them speeds up their work and so neglect them.
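As a hypothetical illustration of such scaffolding (the actual assignments are Professor Collier’s own), a student planning a function that counts the vowels in a word might first record expected inputs and outcomes by hand, before writing any code:

```python
# Hand-prepared test cases for a planned count_vowels function, written
# before any implementation exists. Each entry records an input and the
# result the student expects the finished program to produce.
expected_results = [
    ("hangman", 2),   # a, a
    ("rhythm", 0),    # no vowels
    ("queue", 4),     # u, e, u, e
    ("", 0),          # an empty string should not crash the program
]

# Once the function is eventually written, the same table doubles as a test:
# for word, expected in expected_results:
#     assert count_vowels(word) == expected
```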
Professor Collier assesses both the final product and the scaffolding activities to ensure the students “do the rough work” themselves. Students who simply ask ChatGPT to respond to the problem are not able to demonstrate the necessary “rough work,” making the assignment somewhat resistant to illicit AI use.
For other assignments, Professor Collier provides wrapper libraries: collections of prepared, usable code that programmers can call on to perform certain, typically common, tasks. Drawing libraries, for example, let programmers draw shapes on the screen without having to write that functionality themselves.
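To make the idea concrete, a drawing wrapper library might resemble the following minimal sketch (a hypothetical example, not the course’s actual library), which hides Python’s built-in turtle module behind two ready-to-use functions:

```python
# A hypothetical drawing wrapper library: students call these functions
# without needing to know how the underlying turtle module works.
import turtle

def draw_square(size):
    """Draw a square with sides of the given length."""
    for _ in range(4):
        turtle.forward(size)
        turtle.right(90)

def draw_circle(radius):
    """Draw a circle of the given radius."""
    turtle.circle(radius)

# Example use in a student program:
# draw_square(100)
# draw_circle(50)
```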
Professor Collier wrote a computer program to generate a slightly different wrapper library for each student. ChatGPT is not familiar with the functions in these wrapper libraries, which restricts its usefulness in writing code that uses them. In addition, the individualized wrapper libraries interfere with students who might want to share code illicitly. Consequently, this assignment requires each student to work individually, reading and analyzing code without AI assistance.
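Although the details of that generator are not described here, the approach could resemble the following sketch, in which a student’s ID seeds a random choice of function names so that each student’s library looks slightly different (the names and template below are assumptions for illustration only):

```python
# Hypothetical sketch of generating a per-student wrapper library.
# Seeding the random generator with the student ID makes the output
# reproducible for that student but different from everyone else's.
import random

NAME_POOL = ["render", "paint", "sketch", "plot", "trace", "draw"]

TEMPLATE = '''def {square_name}(size):
    """Draw a square with sides of the given length."""
    ...

def {circle_name}(radius):
    """Draw a circle of the given radius."""
    ...
'''

def generate_library(student_id: str) -> str:
    """Return wrapper-library source code with names unique to one student."""
    rng = random.Random(student_id)           # deterministic per student
    first, second = rng.sample(NAME_POOL, 2)  # two distinct verbs
    return TEMPLATE.format(square_name=f"{first}_square",
                           circle_name=f"{second}_circle")

print(generate_library("student_0042"))
```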
By the course’s mid-point, students transition to pair programming, where two students share a keyboard. One writes the code while the second acts as the “navigator,” directing the big picture so the coding project achieves its goal. This models AI-assisted programming, in which the human serves as the “navigator” while the AI performs the coding, subject to ongoing review.
Building on this foundation, students work with ChatGPT, learning about AI-assisted source code generation and prompt engineering to obtain quality responses. As well as acquiring these skills, students learn about the strengths and weaknesses of AI. One cautionary example Dr. Collier uses is a Hangman game coded in the Python programming language. While the AI-generated program appears functional, it frequently declares incorrect game outcomes, demonstrating the need for rigorous verification.
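The verification point can be illustrated with a small sketch (not the actual AI-generated program from the course) of the kind of subtle bug involved: a plausible-looking win check that miscounts repeated letters and therefore declares the wrong outcome.

```python
# The player has found every distinct letter of the secret word.
secret = "banana"
guessed = {"b", "a", "n"}

# Plausible-looking but buggy check: it compares the number of guesses to
# the word's length, so words with repeated letters are never "won".
buggy_win = len(guessed) == len(secret)   # False, even though the word is solved

# Correct check: every distinct letter of the secret word has been guessed.
correct_win = set(secret) <= guessed      # True

print(buggy_win, correct_win)             # only careful testing exposes the flaw
```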
By the end of the course, students complete two take-home assignments using ChatGPT. One is an interactive game requiring them to produce both text and code. When submitting assignments, students must include transcripts of their interactions with ChatGPT, not just the final code and text. Evaluation is based in part on their ability to create and use prompts effectively to refine AI-generated responses.
A face-to-face tutorial (laboratory) assignment introduces students to the idea of the biases inherent in large language models and some of the ethical and legal considerations that can arise with their use.
Professor Collier emphasizes that AI will provide an answer to every question – even when the response is wrong. AI models attempt to sustain and control conversations. In the course, students need to gain the skills to challenge and direct AI vigorously to ensure accurate and complete results. This is as true in coding as in text responses. He advises students to view AI not as an “intelligence” but as a conversational model that, as he expresses it, “is capable of lying if it keeps the conversation going.”
During the first few weeks of the course, Dr. Collier deliberately restricts the use of AI, ensuring students develop fundamental skills independently, even those that could be performed by AI. By the end of the course, however, he wants students to be proficient in using AI effectively. Striking this balance is an ongoing challenge, as it is crucial to avoid portraying AI either as an adversary or as a universal solution.
The course is offered for the third time in the Winter 2025 session.
Results and Improvements
Student feedback has been largely positive, with many appreciating the structured approach to learning about fundamental skills as well as how best to work with AI. However, students found the initial assignments to be vague and consequently challenging. Professor Collier attributes this to his intentional efforts to minimize AI usage early in the course. In response, he refined assignment instructions and clarified learning objectives.
In the latest offerings of the course, Dr. Collier maintained the scaffolding and wrapper library assignments but reduced the number of assignments, making them more interconnected. The purpose is to give students a portfolio piece that recognizes and showcases their cumulative skill acquisition. By encouraging engagement with the tasks and a sense of accomplishment, he hopes to discourage illicit use of AI.
Potential
As a Teaching Mentor in the Department of Computer Science, Professor Collier serves as the first point of contact for new or contract instructors regarding issues such as academic misconduct, exam procedures and other concerns.
Professor Collier encourages both new and experienced educators to avoid being too tied to traditional approaches when responding to plagiarism and other student misconduct concerns, especially those involving the use of AI. Such approaches can mean reverting to personally supervised assessments or waiting for a ‘silver bullet’ solution to these issues.
Instead, he encourages showing students how AI can contribute to effective learning by using it for support and assistance. In his Computer Science course, this means combining fundamental skills core to the profession with an appreciation for AI’s capacities and how they can best be applied. This approach can help educators provide students with constructive introductions to, and expectations for, AI.
For Further Information
Dr. Robert Collier
Associate Professor, Teaching Stream
School of Computer Science
Carleton University
Ottawa, Ontario