‘Tell me what you learned’: oral assessments and assurance of learning in the age of generative AI


Generative artificial intelligence (AI) has created new challenges for educators seeking to design assessments that provide assurance of learning and uphold academic integrity. In this context, oral assessments have re-emerged as a powerful tool and safeguard.

Over the past two years, we have been running various forms of oral assessment in lieu of traditional written assignments in the Business School’s Discipline of Work and Organisational Studies. These assessments have proven highly effective in motivating students to learn using authentic approaches that harness the latest digital technologies (‘engagement in learning’) while simultaneously ensuring that students have mastered the required disciplinary knowledge and skills (‘assessment of learning’). In our experience, oral assessments are consistent with the University of Sydney’s ‘Lane 1’ and ‘Lane 2’ approaches to assessment design and help to resolve many of the challenges associated with assurance of learning in the age of generative AI.

How does it work?

We provide two case studies of how we have successfully applied oral assessments in our units.

Case 1 – Presentation follow-ups to dive deeper

In WORK2205 (Strategic Human Resource Management) and WORK5002 (Foundations of Human Resource Management and Industrial Relations), students are assigned a case study describing a people management problem facing a fictional organisation (e.g., whether hybrid and remote workers should be required to return to the office, or how to design an organisational mental health program). Students are given a number of assigned readings and instructed to develop a brief presentation using only those readings to inform a set of recommendations.

The oral assessment consists of a 10–15-minute one-on-one session with their examiner, in which students present their recommendations and answer a series of follow-up questions. To ensure academic integrity across asynchronous sessions, examiners draw from a bank of questions to create a unique question-and-answer session for each student. For example, students are asked to explain how the empirical findings in an assigned reading support their recommendations, or to explain how a particular theory or framework relates to the case study scenario. Students are assessed on their mastery of the assigned readings, as evidenced in both the scripted presentation and the question-and-answer session, and their ability to explain how those readings relate to the case study scenario. All sessions are recorded for moderation purposes.

Case 2 – Group-based individual oral exploration

In WORK6010 (HR Data Insights), the oral assessment aims to assess students’ understanding of key concepts early in the semester. Students are asked to propose a specific Human Resources question they are interested in examining and must explain how they would use an analytical tool they have learned about in earlier weeks to help answer their question.

The oral is held in the presence of their group assessment teammates, with each student receiving five minutes to present their ideas to the examiner and answer two to three clarifying questions raised by the examiner in response to their presentation. At least one question will invite students to further explain or clarify a specific point that they mentioned, and another question will probe their understanding of a related concept or idea that the student did not mention. Students are assessed on how clearly they can speak about their ideas, respond to the questions asked, and demonstrate their understanding of and emerging ability to apply what they have learned to date.

Balancing authenticity, authentication, and student anxiety

In both cases, the balance of scripted presentation and question-and-answer session enables students who may feel anxious in this setting to order their thoughts ahead of time, while the question-and-answer session enables examiners to authenticate students’ mastery of the material they have just delivered and probe for deeper understanding. Our students are future business professionals who will likely spend much of their careers making fact-based presentations to clients and senior leaders. These oral assessments are authentic in both form and function, allowing students to hone their presentation skills and their ability to field impromptu questions. For examiners, this approach facilitates an assessment of disciplinary mastery, including students’ ability to recall key concepts and theories, and to apply those concepts and theories to real-world scenarios.

This approach empowers students to use digital tools responsibly. Many modern professionals use digital tools such as generative AI to prepare reports and presentations. Yet we know that such tools are not always reliable, and are even prone to ‘hallucinations’, creating nonsensical arguments and fabricating references. In our units, students may use a variety of digital tools (including generative AI) to help them prepare for their presentations, to analyse and synthesise available information and construct their arguments. However, the requirement that presentations be grounded in unit-related concepts and theories requires students to ‘fact check’ any information sourced online against each unit’s assigned content. In this approach, we effectively teach students how to cross-reference information sourced from a search engine or generative AI tool against high-quality, peer-reviewed research – a vital skill for any future professional.

Can it scale?

This assessment approach has now been trialled in two postgraduate units (WORK5002 and WORK6010) and one undergraduate unit (WORK2205). All three are core units, and typically see enrolments between 50 and 115 students. Scheduling one-on-one sessions with so many students is potentially the most challenging and time-consuming aspect of this approach, but this can easily be streamlined with digital tools like Calendly or by having students book their sessions during class time. The amount of time required to conduct and mark each assessment is broadly on par with the time previously spent marking written assignments.


We have found this form of assessment to be highly efficient: with a few exceptions where moderation is required, the allocated marking time is consistent across the cohort. Setting up an assessment like this may feel daunting, but in practice we have found it to be less labour intensive than providing extensive feedback on individual written assessments. Moreover, the interpersonal interaction between examiner and student heightens accountability on both sides; each party must be prepared and actively involved in the process. Our students learn that being well prepared is worth the effort, as the consequences of inadequate preparation when face-to-face with an examiner feel more tangible. Consequently, we have observed a decrease in the number of informal and formal appeals from students following the introduction of this assessment approach. In our experience, this approach saves time, increases student engagement and accountability, and is well worth the effort.

How do students feel about these oral assessments?

These oral assessments have proven highly popular with our students. Across all three units, student responses to Unit of Study Survey question 5 (“the assessment tasks challenged me to learn”) improved following the introduction of the oral assessment. Furthermore, students tell us they would like to see more of this type of assessment.

In 2023, students of WORK2205 were asked whether they would prefer to see the oral assessment replaced with an invigilated exam. Most students preferred to keep the oral assessment, describing it as “challenging but rewarding”, “beneficial for my future career” and “an interesting and different assignment for real-life applicability”. In the words of one student:

Being exposed to a 10-minute presentation can be daunting, but I think it is necessary to ensure you know what you’re getting yourself into as a student, and what to expect in your future career.

Oral assessments are an excellent way to develop students’ communication skills, and they offer particular benefits for students from non-English speaking backgrounds. First, there is less pressure on these students to demonstrate mastery of unit content using perfect grammar or syntax, as they need only convey their ideas conversationally. Second, this assessment approach enables the examiner to seek immediate clarification of concepts which may not have been conveyed clearly in the first instance; for example, if ideas expressed during the presentation are unclear, the examiner can use the question-and-answer session to probe and verify students’ understanding.

This approach also highlights each student’s individual voice and perspective, making it challenging for others to manipulate or replicate their work. This individualised evaluation process is a powerful deterrent against contract cheating and the misuse of generative AI. In addition to sending a clear message about the importance of academic integrity, oral assessments require on-the-spot thinking, and the ability to analyse and synthesise information. This puts a spotlight on students’ critical thinking skills, which are crucial to academic and professional success.  

Generative AI is here to stay. Oral assessments offer a powerful way to evaluate and assure students’ learning and contribute to a culture of academic and professional integrity. 

 

More from Meraiah Foley