By: Vasiliki Andreou
Medical trainees spend on average 2,080 hours in their workplace per calendar year!
This figure shows just how much postgraduate medical education relies on workplace learning and assessment. Workplace assessments have numerous benefits, but their implementation can encounter obstacles, including the lack of a clear definition of what needs to be assessed and time constraints.
The lack of standardization in assessment leads to considerable variability in how trainees are evaluated. This variability makes it challenging to compare assessment results across different clinical settings, limiting the usefulness of assessments in gauging trainee progress.
Another significant challenge for workplace assessments is time constraints. Assessors may struggle to find adequate time to complete assessments, particularly in a busy clinical environment where patient care is the top priority. Trainees may also find it difficult to balance assessment requirements with their clinical duties.
As medical educators, we were looking for a way forward. This is when we came up with the idea of taking a competency-based approach to assessment in the workplace. Competency-based medical education (CBME) can help overcome these challenges. By defining the competencies necessary for effective medical practice, CBME offers a standardized framework and helps ensure that all trainees meet the same high standards. Working with standardized lists of competencies can also make assessment moments more time-efficient.
To test this assumption, we put competency-based assessment in the workplace to the test. Based on the CanMEDS competency framework, we developed two tools for assessing trainees in their workplace (Figure 1). The study was implemented in the Flemish General Practitioner (GP) Training program and involved both trainees and trainers. Over a period of several months, participants used the two assessment tools to evaluate competencies and filled out questionnaires with both closed and open-ended questions to measure the study's outcomes.
Is it worth the hype?
We found that both trainees and trainers considered the competency-based assessment system useful in the workplace. They felt it saved time during busy clinical work. The standardized lists were easy to fill in and use, and they provided a solid basis for openly discussing negative feedback when needed. By comparing trainees' performance to a standard, trainers could explain why performance was good or poor. We also saw that both trainees and trainers became more aware of learning growth and development: after repeatedly using the competency-based instruments, they started realizing how trainees' performance had improved over time.
However, we also saw that trainees and trainers did not understand some competencies. They often complained about complex and unclear language. Some even said they had to discuss competencies with colleagues to comprehend what they needed to assess. Consequently, they tended to evaluate competencies they considered straightforward and to avoid those with more complex formulations.
Keep calm and carry on
We can draw some important conclusions from this study. First, trainees and trainers see the value of a competency-based assessment system. Such a system helps define what needs to be evaluated while taking into account the fast pace of the clinical workplace. Nevertheless, further work is needed to formulate medical competencies in simple, understandable, and close-to-practice language.
Moving forward, we should invest the time and effort necessary to design assessment tools that are easy to understand and closely aligned with the realities of clinical practice. By overcoming these bottlenecks and challenges of competency-based medical education assessment, we can help ensure the continued growth and success of our medical professionals.
About the author: Vasiliki Andreou is a PhD student at the Academic Centre of General Practice at KU Leuven, Belgium. She is interested in assessment methods in medical education. Her PhD focuses on implementing programmatic assessment in postgraduate medical curricula.
The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page.
Images created by Vasiliki Andreou