#MacPFD14
Workshop Abstract

Asking the Right Q:

Evaluating student learning experiences and empowering course reform through course evaluations

💻 Delivered Virtually

📅 May 25, 2021

Presenters:
Bruce Wainman
for Danielle Brewer-Deluce, Anthony Saraco, and Noori Akhtar-Danesh

Objectives:
By the end of this session, participants will recognize the relevance of course evaluations for long-term, evidence-based reform, and understand how Q-methodology can be used to generate a rich, statistically interpretable source of actionable information.

The Problem:
One year ago, pandemic-related restrictions quickly forced courses online. Despite educators’ best intentions, the speed of the transition, coupled with limitations on time, resources, and technology, meant the student learning experience largely just happened. Content was delivered in whatever way was feasible, and students learned what they could, with the understanding that remediation and supplementary opportunities would surface once restrictions lifted. Now, a year later, with many courses continuing online and the return to face-to-face learning on the horizon, talk of creating permanently online or blended learning options for the future has emerged. The outstanding question, however, is how learning in our now-digital classrooms has been affected, and which aspects of this revised style warrant investment and repetition in future years.

The Gap:
With current course assessment methods (Likert scales and open-ended feedback), there is no clear way to know how changes in a course have impacted the student learning experience. Fortunately, in a recent series of studies, our group established Q-methodology as a robust alternative to traditional course evaluations. By using a statement-ranking system and combining qualitative and quantitative analysis methods, Q-methodology mitigates the issues of disparate feedback and averaged scores that make Likert-scale data difficult to interpret and act upon.

The Hook:
Applying sequential Q-methodology assessments to the 2019 and 2020 cohorts of a large undergraduate anatomy course, we found three statistically grouped pools of students with a generally positive, negative, or neutral disposition toward the course. Students in the 2019 and 2020 cohorts were distributed significantly differently across these factors (p = 0.006), such that a greater proportion of the 2020 cohort expressed dissatisfaction (58% vs. 37% in 2019). Dissatisfaction primarily surrounded course structure and assessment, two aspects that were significantly altered in response to the 2020 pandemic.
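As a rough illustration of the kind of comparison reported above, a chi-square test of independence can check whether cohort membership and disposition are related. This is only a sketch: the counts below are hypothetical (loosely scaled from the reported 37% vs. 58% dissatisfaction figures, assuming 100 students per cohort), and the abstract does not specify the exact test or cell counts the study used.

```python
import math

# Hypothetical 2x2 contingency table.
# Rows: cohorts (2019, 2020); columns: (dissatisfied, not dissatisfied).
counts = [
    [37, 63],  # 2019 cohort, assumed n = 100
    [58, 42],  # 2020 cohort, assumed n = 100
]

row_totals = [sum(row) for row in counts]
col_totals = [sum(col) for col in zip(*counts)]
n = sum(row_totals)

# Pearson chi-square statistic: sum of (observed - expected)^2 / expected.
chi2 = 0.0
for i, row in enumerate(counts):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed - expected) ** 2 / expected

# For a 2x2 table (df = 1), the p-value is erfc(sqrt(chi2 / 2)).
p = math.erfc(math.sqrt(chi2 / 2))
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```

With these invented counts the test yields p < 0.01, in the same ballpark as the study's reported p = 0.006, though the real analysis compared three factor groups rather than a binary split.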

The Takeaway:
Clear evidence for course reform relies upon knowing what worked and what didn’t. In this case, while a negative shift in disposition may seem unremarkable, Q-methodology uncovered key elements contributing to student dissatisfaction that are important to weigh as programs consider online learning as a long-term option.