Incorporating the use of AI tools in a student research project program


The University of Sydney Doctor of Medicine (MD) Program has a mandatory 14-week, individual student research project in year 3 of the 4-year postgraduate degree. This program has ~300 students per year, 250+ research supervisors, and 12 research coordinators across 10 locations in NSW.

Student projects can be in any area of medicine or health, and many project types are allowed, including prospective or retrospective studies, dataset analysis projects, clinical projects, medical education projects, laboratory projects, structured literature reviews, study protocol development, and product development projects. Data collected and analysed may be quantitative, qualitative or both – a very broad scope.

The MD project is scaffolded over the 14-week block by a series of milestone assessment tasks which step the students through their project to the final assessment task, a written scientific report. We were clear up front (May 2024, in line with University policy) that we allowed the use of AI tools for anything other than the generation of text, provided students declared this use.

Addressing the risks of AI

We knew that any large written task, like a final scientific report, is going to be subject to the use of AI tools. So our team got together, looked at each of the assessment tasks in turn, and considered what the impact of the AI tools (that we knew of at the time!) would be on the MD Project.

We tabulated all our assessment tasks and considered which AI tools could be used, and where, how and why. This revealed not only weaknesses in the tasks, but also the places where a research project would be advantaged by these tools, and where we thought researchers would be using them. The “risk” appraisal of our assessment tasks revealed that we had only one assessment we considered secure: the live question-and-answer session during the scientific oral presentation towards the end of the block. For the rest of the milestone tasks, we expected AI tools to be used.

The strategy we devised for the 2024 MD Project (May–Aug 2024) was:

  1. Accept that students will use these AI tools, and give clear advice on their use and its boundaries. Explain that we expect AI tools to be used in an ethical and responsible manner, and that students must declare and document their use of these tools. To this end we developed an AI use declaration they complete upon submission of their final report, which asks them to document which tools were used, how they were used, and a reflection on the outputs.
  2. Keep the milestone tasks through the project formative and designed to keep the project moving, so that students submit their reports on time. These are essentially progress reports, and the feedback on them focusses on the learning process and project management, as well as understanding of the research topic.
  3. Move to a tighter oral presentation, delivered in a more formal setting. This is challenging, though: 300 students at 15 min each equals 75 hrs of live oral presentations, plus marking and administration. The live questioning is the critical assessment element, and it needs to be equitable across the cohort, ensuring that the questions are fair, probing and relevant, and that only the student answers them.

Outcomes and Impacts

We were clear, realistic and honest with the students about using AI tools, and the aim was to create space for them to feel comfortable experimenting with these tools. Assessing 300 oral presentations is challenging and requires planning and resourcing; however, it is a valuable learning activity, the live questioning is a secure assessment task, and the mini-symposia are a great way to get students and academic staff together to showcase the research project work. We were also prompted to review the learning objectives around research skill development, and added professional skill development: teamwork, project management, time management, critical thinking and problem solving, and interpersonal skills (or, up-managing your supervisor/superiors!). This better articulates the skills we want and expect students to learn in the MD Project. We know you can’t do effective research without these skills, and they are critically important skills for future clinicians.

Tips and tricks for educators

  1. Do a realistic risk appraisal of all your assessment tasks. Consider all the AI tools you are aware of that could be used to complete the tasks, and test the tools yourself on your assessments. These tools change weekly, so be aware of and prepared for that.
  2. Appraise the learning risks alongside the benefits that AI tools will bring. Refocus on the fundamental learning objectives of the task.
  3. Remember that our students need to be our partners in this revolution; they are (and should be) adopting these AI tools.
  4. Train your students and all staff in effective, ethical and responsible use of these tools.

Acknowledgements: MD Projects team: Sally Middleton, Richmond Jeremy, Rajneesh Kaur; Medicine eLearning team: Daej Arab and Aylwin Sim.

Written by Joanne Hart
