Assessing Communication Skills in a Clinical Environment using iPads

 

[Image: Natalie Pollard holding an iPad. Photo: Tina Barclay 2017. All rights reserved.]

In the Faculty of Health Sciences we know how essential it is that students receive specific and timely feedback to help them improve their performance. This is particularly important before students enter a clinical environment on placement: we want to be assured that they can communicate effectively with patients and practise safely. So a Week 5 practical exam was set as a diagnostic assessment.

Natalie Pollard, an Associate Lecturer in Work Integrated Learning, manages all aspects of students’ diagnostic radiography clinical training in the Faculty. She has tried several assessment designs that aim to provide students with an authentic assessment of their communication skills and give them fast, rich feedback, without being too time-consuming for examiners to mark.

In 2015, the initial assessment of student communication skills took the form of a video exam. Eighty students each submitted a 4-minute video of themselves acting as a radiographer and obtaining a clinical history from a patient. Students were also asked to watch the video prior to submission, complete the marking rubric and write a one-page reflective piece about their communication skills. One of the problems with this assessment design was that students were able to formulate their own script of patient responses and practise the skill an unlimited number of times before submission. It was thus not an ‘authentic’ assessment of communication skills.

In 2016, examiners assessed students’ ‘live’ performance using a paper-based marking rubric. Students were allocated a 10-minute time slot and were provided with a patient request form, a clinical history form (containing prompts of questions to ask) and the marking rubric. During this time, students needed to establish a rapport with their patient in order to obtain their clinical history and record this information accurately on the clinical history form. Completed forms were submitted to the examiner at the end of the exam.

The disadvantage of this design was that it didn’t allow students to receive specific, individualised feedback: there was insufficient space on the form and, in any case, examiners reported they didn’t have the time to write detailed comments.

Subsequently, Natalie worked with the Faculty’s Educational Development Team to create an iPad marking solution. The team converted the entire exam into electronic format using eOSCE software, which could then be loaded onto iPads for repeated use across the Faculty.

An eOSCE is an electronic OSCE (objective structured clinical examination): a short, hands-on, real-world test used frequently in health sciences education. OSCEs are designed to test clinical skill performance and competence in skills such as communication and clinical examination. Because of their authentic approach to learning, they keep students engaged and help them understand the key factors that drive medical decision-making.

This semester, each examiner has their own iPad. When they open the eOSCE program, a list of the students they will be examining appears down one side, and clicking on a student’s name brings up the relevant rubric. The exam coordinator can add prompt questions for the examiner at the top of each section of the rubric to help maintain consistency between examiners. Each examiner reads the categories on the rubric and clicks on the mark the student has earned for each category. There is space to type in comments (which we found all examiners took the time to do, as this was faster than handwriting them), or they can use the voice recognition software embedded in the program to record their feedback directly into the iPad.

The exam coordinator has a central iPad into which the data for all students is loaded. The coordinator can see in real time how each student is performing and flag early on any students who have failed. Following the exam, students are emailed a PDF document containing all their results and the comments from their examiner.
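The workflow above is specific to the eOSCE software, but the underlying idea is simply a structured rubric captured per student and compiled into a feedback report. The sketch below is purely illustrative: the class names, rubric categories and pass threshold are hypothetical assumptions, not the eOSCE product’s data model. It shows how per-category marks and comments might be collected and rendered into the kind of results summary students receive.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class CategoryResult:
    """One rubric category as marked by an examiner (illustrative only)."""
    category: str      # rubric category, e.g. "Establishes rapport"
    mark: int          # mark the examiner selected for this category
    max_mark: int      # maximum available mark for the category
    comment: str = ""  # typed or dictated examiner feedback


@dataclass
class StudentResult:
    """All rubric results for one student, as a coordinator might collate them."""
    student_name: str
    examiner: str
    categories: List[CategoryResult] = field(default_factory=list)

    @property
    def total(self) -> int:
        return sum(c.mark for c in self.categories)

    @property
    def total_possible(self) -> int:
        return sum(c.max_mark for c in self.categories)

    def passed(self, pass_fraction: float = 0.5) -> bool:
        # Lets a coordinator flag failing students as results arrive.
        return self.total >= pass_fraction * self.total_possible

    def feedback_summary(self) -> str:
        # Plain-text stand-in for the PDF report emailed to each student.
        lines = [f"Results for {self.student_name} (examiner: {self.examiner})"]
        for c in self.categories:
            lines.append(f"- {c.category}: {c.mark}/{c.max_mark} -- {c.comment}")
        lines.append(f"Total: {self.total}/{self.total_possible}")
        return "\n".join(lines)


# Example usage with made-up rubric categories and marks.
result = StudentResult("A. Student", "N. Pollard", [
    CategoryResult("Establishes rapport", 4, 5, "Warm introduction; confirmed patient identity."),
    CategoryResult("Obtains clinical history", 3, 5, "Missed asking about previous imaging."),
])
print(result.feedback_summary())
print("Flag for follow-up:", not result.passed())
```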

Examiners who have used the system report that the software is easy to use and much less cumbersome than shuffling multiple pieces of paper, and exam organisers need less preparation time (for example, no printing of rubrics). Far fewer students ask for further feedback about their results. Scores on the Unit of Study Survey (USS) question about personalised feedback have improved because teachers can return marks much more quickly than with a paper-based system. And it is much easier to run on time on exam day!

 

Written by Tina Barclay
