“It doesn’t test what we want it to” – tweaking online exams to better assess students in pharmacy

A student taking an online exam on a computer, representing remote learning and technology use (image created with generative AI).

For graduates working in healthcare, being able to undertake difficult tasks safely at a moment’s notice is a core requirement. To meet this need, university pharmacy programs must be able to demonstrate that their graduates can prepare and supply compounded medications safely and accurately, in accordance with current legislation and guidelines (Australian Pharmacy Council, 2022). To do this, students need to be assessed on a range of skills, including determining the required formulation and the method to prepare it, preparing the product, labelling it, and completing the appropriate documentation (Pharmaceutical Society of Australia, 2016).

At the University of Sydney, pharmaceutical compounding skills are taught in the compulsory undergraduate unit PHAR3815 Pharmaceutical Skills and Dispensing A. In this unit, students develop the ability to write up methods for preparing different products and formulations, an essential skill for their future pharmacy practice. This was ordinarily assessed through a summative mid-semester quiz and final exam. However, the move to online exams due to COVID raised two key issues:

  • more students did not finish the online exam than when it was completed traditionally with pen and paper
  • fewer students attempted the pharmaceutical method write-up questions

To explore and rethink these issues from a critically reflective standpoint, the teaching team and I drew on three of Brookfield’s four lenses of critical reflective practice (Brookfield, 1995):

  • the self (our experiences as learners)
  • the students (what our students are saying)
  • the peers (what our teaching colleagues are saying)

Drawing on these lenses, we were able not only to think about the reasons for these issues but also to consider potential strategies to address them.

When we listened to our students, they told us that they felt they did not have enough time to complete all the exam questions, something that was only made worse by the move to online exams. Many students described themselves as “slow typers” and felt this was a significant disadvantage in an online exam. Given this, they judged that the best use of their time was to leave the method-writing question until last, as it involves many steps yet is worth the same number of marks as other question types (calculations and labels) that require less time and less information to be entered. The requirement to identify the purpose of each ingredient in their ingredients list was also challenging for students, provoking unnecessary exam stress as they wasted time searching for the information in their resources.

As a teaching team, we were disheartened to realise that our assessment meant students could pass the exam with large gaps in understanding and without demonstrating competence in the key skill of writing methods. We also identified information (e.g. the purpose of different ingredients used in a product) that was unnecessarily tested in multiple questions, using up time that students could have better spent on the key skills we wanted to assess.

To address these issues, we made the following changes to the exams:

  • students were provided with an additional 30 minutes in each exam, taking the total time to 100 minutes for the mid-semester quiz and 130 minutes for the final exam.
  • we gave students a suggested time of 2.5 minutes per mark, an increase from 1.5 minutes per mark in 2021, allowing them to better plan and manage their time.
  • we increased the marks for the method-writing questions to 8 (up from 5 in 2021), making them worth more than the label and calculation questions (worth 6 and 5 marks, respectively)
  • we removed the requirement for students to identify the purpose of ingredients in their ingredients list, as this is tested in another question where they write a label for the product (which includes the active ingredients and preservative).
  • to account for the extra exam time, we added a question that tests understanding by asking students to respond to common questions we receive as coordinators (e.g. why we do things a particular way in the lab, or why a common error is marked as incorrect under our grading rubric).

These changes had a positive impact on both the mid-semester quiz and final exam results. Compared to 2021, the average mark for both assessments was higher (62% vs 57% for the mid-semester quiz and 67% vs 58% for the final exam) and fewer students failed (25% vs 30% for the mid-semester quiz and 4% vs 23% for the final exam). Perhaps most significantly, the proportion of students who did not attempt or finish their answer to the method-writing questions fell, from 21% to 4% for the mid-semester quiz and from 23% to 5% for the final exam.

Examining this assessment through the lenses of our students, ourselves and our teaching colleagues allowed us not only to identify the ways in which our exams were failing to adequately test a key skill, but also to devise student-centred strategies to address this.
