How can I update assessments to deal with ChatGPT and other generative AI?

There is a constant stream of news and opinions about ChatGPT, a relatively new generative artificial intelligence (AI) that can produce paragraphs of human-sounding text in a matter of seconds in response to prompts. We’ve previously covered how it works and how it can be used to make learning, teaching, student support and assessment more effective and efficient, focusing on how teachers and students might use it to draft resources, ideas, and perspectives. Here, we focus on one of the most pressing concerns around ChatGPT and similar technologies: since every response it generates is unique, how do we update our assessment practices to take advantage of its capabilities whilst mitigating the academic integrity risks it presents?

What is our approach at Sydney?

AI tools like ChatGPT are still immature, but it is clear that they will become part of the way we work. It’s hard to think of a career that won’t be profoundly affected by them, including academia. To ensure the relevance of our curriculum and the authenticity of our assessment, it is thus imperative that we begin to bring these tools into the classroom and our assessments now. The approach at Sydney is that the verbatim use of such tools to produce text which a student then submits as their own is not appropriate, and the new Academic Integrity Policy 2022 reflects this. Given the current limitations in the AI’s responses, which can include incorrect information, out-of-date explanations, and even non-existent references, it’s also important that students are aware of the dangers of using them uncritically, just as they need to be when they use other resources and tools such as textbooks, websites, and even scholarly literature.

At the time of writing, use of ChatGPT is free, but given its current popularity and potential, it’s likely that the creators will soon introduce a monthly fee for a higher service tier. Even now, users need to sign up to take part in the current research stage, which the creators are using as part of the tool’s development. Requiring its use for assessment is therefore to be avoided.

Achievable things you can do this semester

Have an open conversation with students

Given the volume of social and mainstream media about ChatGPT, including the decision by the NSW Department of Education to ban it from its schools, it is most likely that your students will have already heard about it. They may have even tried it themselves. We’d recommend that, after you’ve had a chance to brush up on how it works and any caveats, you have an open and honest conversation about it with your cohort. The main points to bring across would be that:

  • it is designed to mimic human writing because it is trained on billions of words of human-written text from books and the internet, and so its responses may look realistic and believable;
  • it works like autocomplete, by mathematically predicting which word(s) have a higher probability of following other words, based on what it was trained on;
  • its output shouldn’t be trusted uncritically, since it is designed to mimic human writing rather than actually understand concepts, so we still need to know things and think for ourselves;
  • that said, it can make pretty good guesses when asked questions, because it has absorbed so much human-written text.
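The ‘autocomplete’ point above can be made concrete with a toy sketch. The example below is a deliberate oversimplification – ChatGPT uses a large neural network over long stretches of context, not simple word-pair counts – but it illustrates the core idea that the next word is chosen by probability, estimated from previously seen text:

```python
# Toy illustration only (NOT how ChatGPT is actually implemented):
# estimate which word most probably follows another, based on counts
# of word pairs (bigrams) in a tiny training corpus.
from collections import Counter, defaultdict

corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat chased the dog .").split()

# Count which word follows which.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the most probable next word after `word`, and its probability."""
    counts = following[word]
    best, n = counts.most_common(1)[0]
    return best, n / sum(counts.values())

word, prob = predict_next("the")
print(word, round(prob, 2))  # e.g. "cat 0.33" on this tiny corpus
```

A model trained on billions of words instead of a few sentences produces strikingly fluent text by the same basic principle, which is why its responses can look believable while containing no actual understanding.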

It’s important for our students to start building their ‘AI literacy’, just as we have been helping them to build their digital and other literacies. Our graduate qualities associated with written communication and information and digital literacy are already described in a way that embraces this. You could even consider demonstrating ChatGPT in class, live, asking it to respond to a prompt, perhaps from an assessment. Use this as a teaching moment to explain the points above, emphasising that it doesn’t replace learning but actually makes it even more important than ever.

Active learning through in-class writing

Effective learning and assessment are relational, active, and applied. The presence of these AIs does not change this, although it may provide additional impetus for changing supporting practices. Consider making small tweaks to your learning and teaching activities so that students are actively writing in class. For example, instead of using the majority of class time to deliver content and have students composing work without active support from you, consider flipping your class slightly so that pre-class work is provided for students to cover some content independently, and class time is used to draft or revise written work. This way, you can be there to answer questions that arise, live, and students can also discuss their work with peers. Build one or two 15-minute ‘writing sprints’ into a class session (like a compressed shut-up-and-write), and then ask whether a student is open to sharing their work on the screen at the front for you (and the class) to revise together. Yes, students may playfully provide an AI-generated response for critique, but this is still a teaching moment, and over time they should see the value of these sprints.

Bring in local context

Increasing the authenticity of assessments may also help. Asking students to bring in ideas, evidence, perspectives, and data from contemporary or personal events or contexts will make it more difficult (although not impossible) for them to just ask an AI to write their assignment. For example, ask students to include core references from your unit that you specify, or a perspective from their family, or an analysis of news from the past month, or an example from their local geographic area. This has the added advantage of increasing the relevance of your assessment, which fosters motivation. Of course, students could still use AI to generate a draft (and indeed, it can be quite good at reflective writing) and then manually inject these contemporary references, but they will need to engage with the AI-generated text in order to meaningfully connect it with their context – and learn in the process.

Use AI to personalise assessment tasks

More overt engagement with AI could come through you using it to generate resources or stimulus for students’ use in assessment. For example, ChatGPT is adept at forming descriptive prose. If you already have authentic assessments that ask students to perform a realistic task such as formulating a business plan, responding to patient complaints, or drafting a memo for government, then you could use ChatGPT to generate fictitious businesses, patients, and government agencies. We all know that case studies like these can be difficult to write; empowered by ChatGPT, you could actually draft and edit a unique case study for each group to use in your authentic assessment, which also makes it more difficult for groups to collude. Why not take it a step further and ask groups what their interests are? Then you can prompt ChatGPT to draft a case study based on those interests, truly individualising it for each group.

Update your exams

Across the University, we are slowly making good progress towards moving away from invigilated examinations, which in many cases are not authentic. If you are still running such mid-semester or final exams, you could consider how generative AI impacts upon the future of work in your discipline and design exam questions accordingly. For example, given AI is already used in industry to make computer programmers more productive, instead of asking students to write code in an exam, you may consider presenting AI-generated code and asking students to evaluate and fix it. As another example, since AI is increasingly being used to generate copy for news outlets, instead of having students write a whole article for a contemporary topic, you could present an AI-generated article and ask students to critique it and identify key areas of improvement. This way, even our invigilated exams could bring in elements of authentic practice from our disciplines.
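To illustrate the programming example, an exam item of this kind might present a short, plausible-looking but flawed function and ask students to evaluate and fix it. The snippet below is an invented stimulus (the bug is deliberate), paired with one possible model answer:

```python
# Hypothetical exam stimulus: "The following AI-generated function claims
# to return the average of the positive numbers in a list. Evaluate it,
# identify its defects, and fix them."

def average_positive(numbers):
    # The flawed, 'AI-generated' version presented to students.
    total = 0
    for n in numbers:
        if n > 0:
            total += n
    return total / len(numbers)   # bug: divides by ALL numbers, not just
                                  # the positives; also crashes on []

def average_positive_fixed(numbers):
    # One possible model answer.
    positives = [n for n in numbers if n > 0]
    if not positives:             # handle the no-positives case explicitly
        return 0.0
    return sum(positives) / len(positives)

print(average_positive([2, -4, 4]))        # 2.0 (wrong, but plausible-looking)
print(average_positive_fixed([2, -4, 4]))  # 3.0 (the correct average)
```

Questions like this assess exactly the skill graduates will need when working alongside AI: critically reading generated output rather than producing boilerplate from scratch.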

These are all small, realistic tweaks in our assessment that we can do in the next few weeks and months to engage productively with AI. What are some things with a medium-term horizon that we could consider?

Things to plan for a bit later on

Try different assessment types

There are some assessment types that are more resistant to AI and can allow students to develop and demonstrate understanding. Oral assessments rate highly on assessment security and, depending on the discipline, can provide an authentic, discursive context for students to dive deep into ideas and demonstrate their understanding to a knowledgeable examiner. Other forms of assessment are also less prone to direct impact by text-generating AI, such as asking students to produce podcasts and videos. Such multimodal assessment approaches have the benefit of helping students develop their communication and influence skills, and may generate authentic outputs for real-world audiences. Additionally, staged assessments, in which students submit drafts, receive feedback, and improve their work, are less prone to risk from generative AI. Although it’s possible that students may use AI to write their drafts, you will be able to follow multiple steps in their writing process as they incorporate your feedback into future iterations.

Working together with AI

On another level, you may want to directly incorporate students’ (voluntary) use of AI into their writing. For example, for a written assignment you might give students a choice to use AI or not. For those who choose to use AI, the assignment might involve generating drafts using AI and then, through tracked changes and comments, demonstrating how they have revised and improved upon the draft, and explaining why. Given AI’s role in the future of work, this might actually be a fairly authentic task, because graduates in many fields will need to use AI to stay productive. Helping students now to learn how to engage well with AI, and to learn crucial disciplinary skills and knowledge through meaningful engagement with it, will serve them well.

Tell me more!
