DID-ACT’s evaluation process for pilot implementations of the train-the-trainer courses

We are kicking off 2022 by building on our “Pilot Implementations on Clinical Reasoning”, held from June to December 2021. These build on the findings of our recently published “Evaluation and analysis of learner activities of the pilot implementations of the train-the-trainer course”.

Evaluation for Quality Control

To ensure the quality of our curriculum’s development, each pilot was accompanied by a questionnaire for participants and facilitators. We are using this feedback to sharpen the emphasis of the courses and create a clearer final product for our learners. These responses were coupled with monitoring of our chosen learning management system (LMS), Moodle, and virtual patient system, CASUS. DID-ACT’s six institutional partners took part in the evaluation by facilitating 9 pilot courses across Europe. In brief, approximately 100 teachers participated in the 5 clinical reasoning teaching topics in the train-the-trainer course pilots. Approximately half of the participants returned their evaluation questionnaires, alongside 12 responses from facilitators. For the results discussed below, the coding was double-checked and disagreements were resolved by consensus.

Survey tools for clinical reasoning curriculum assessment

In our pilots we decided to use survey-based tools to evaluate the train-the-trainer (TTT) and student courses. Our goal was to capture responses using fewer questions in a way that allowed comparison between piloted units. In the end, we used the Evaluation of Technology-Enhanced Learning Materials (ETELM). This tool, developed by David Cook and Rachel Ellaway, gave us a launch pad for making questionnaires our standard evaluation tool. This was attractive for many reasons, including ease of implementation in our learning unit template within our LMS.

Lessons learned around evaluations for our pilot implementation

Through iteration and collaboration with a psychologist, experienced educators, and researchers, we found the following pertinent for our project and beyond: 

  • Use consistent language; e.g. use either ‘course’ or ‘learning unit’, as pertinent to your project
  • Be mindful of the word ‘practice’, as it can be interpreted in many ways; in DID-ACT’s case, we changed “This learning unit will change my practice” to “This learning unit will improve my clinical reasoning”
  • Provide participants the option to write free-text responses in their native language, where the project allows
  • Avoid too many questions, which may overload participants
  • Ask about years in the profession rather than age; this yielded more succinct answers for what we needed.

TTT Pilot Implementation Survey Results

We set up our questionnaires using a scale of 1 (definitely disagree) to 7 (definitely agree). The average response was 5.8 when participants were asked whether the courses would improve their teaching of clinical reasoning. The pilots excelled in the selection of topics, small group discussions, the facilitators, and inter-professional learning opportunities. Growth was suggested in technical navigation of the LMS, assessment and feedback on the reasoning process, and content tailored to professions other than physicians.

Analysis of Pilot Implementations

The survey questionnaires were analyzed in Microsoft Excel, where we calculated descriptive statistics. For the open-ended responses, we performed a content analysis. Participant utterances were coded with the categories proposed in D3.2 (Didactical, Content, Technical, Interaction/Collaboration, Implementation/Time, Implementation/Facilitators), which we extended with three more categories (Content/Assessment, Overall, and Others). All data was processed anonymously, with each statement coded as positive, negative, or neutral.
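To illustrate the kind of analysis described above, here is a minimal sketch in Python. Note that the actual analysis was carried out in Microsoft Excel, and the sample responses and coded statements below are entirely hypothetical; only the category and valence labels come from the project.

```python
from statistics import mean, median, stdev
from collections import Counter

# Hypothetical Likert responses (1-7 scale) to a single questionnaire item
responses = [6, 7, 5, 6, 7, 4, 6, 5, 7, 6]

# Descriptive statistics, as computed per survey item
print(f"mean={mean(responses):.1f}, median={median(responses)}, sd={stdev(responses):.2f}")

# Hypothetical coded free-text statements: (category, valence)
coded = [
    ("Didactical", "positive"),
    ("Technical", "negative"),
    ("Content", "positive"),
    ("Implementation/Time", "neutral"),
    ("Technical", "negative"),
]

# Tally statements per category and per valence, mirroring the content analysis
by_category = Counter(cat for cat, _ in coded)
by_valence = Counter(val for _, val in coded)
print(by_category.most_common())
print(by_valence)
```

The same tallies can of course be produced with pivot tables in a spreadsheet; the sketch only shows the shape of the computation.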

Overall, the TTT pilot implementations were a success, as were our efforts in evaluating them. We will apply constructive feedback to other learning units as we continue to develop them. Alongside this, we will return to the original pilot implementations and amend what needs to be improved. You can read a more detailed overview of D5.2 Evaluation and analysis of learner activities during these TTT pilot implementations here.

How to teach synchronously in a virtual setting

  • You need a reliable camera, microphone, and virtual platform, and you should be familiar with its features, such as the whiteboard, chat, polling, breakout rooms, etc.
  • At the beginning, establish communication rules, e.g. whether participants should raise their (virtual) hand, use the chat, and/or just speak. We also recommend asking participants to turn on their cameras.
  • For small group work, breakout rooms work very well; just be clear about the tasks the groups should work on before dividing participants into groups.
  • For collaboration, integrated virtual whiteboards or other platforms such as Padlet are very useful. Just make sure before the session that you have everything set up and the links at hand, e.g. to post in the chat.
  • Allow a bit more time for starting the session and for group work, as there may be participants who are unfamiliar with the platform, or technical problems may occur.

How to motivate unprepared participants

  • Make clear that the asynchronous assignments are a core part of the course and that their content will not be repeated. Even if it is difficult, stick to that when starting the synchronous teaching session.
  • If you expect unprepared participants, you can start the session with a student-centered group exercise mixing prepared and unprepared students, to increase peer pressure and make them realize that being unprepared does not feel good.
  • Use the introductory or closing quizzes/tests so that participants can self-assess whether they have the required knowledge, and so you as a facilitator can see the level of knowledge and preparation of your participants.

How to involve participants with different levels of experience

  • To account for such different levels, we recommend making use of the asynchronous preparatory phases, which include introductory quizzes in which participants can self-assess their prior knowledge and you as a facilitator can assess the differences within your group. Participants with less prior experience can also be guided to additional preparatory resources.
  • Encourage participants to work in pairs or small groups when preparing, so that they can help and learn from each other. You could even facilitate this by dividing them into groups with different levels of experience.
  • Similarly, during the synchronous phases, we recommend forming groups with participants of different levels of experience and emphasizing the peer-support aspects of such group activities.
  • We also recommend starting with smaller groups and allowing more time than stated in the course outlines if you expect a heterogeneous level of experience. This way you can better manage the challenge.
  • Encourage your participants to ask questions, emphasizing that nobody knows everything and that asking questions is important for learning.
  • Especially in the train-the-trainer course, you may have to deal with over-confident participants, who, especially in an interprofessional setting, can dominate the group. This is a complex cultural challenge, but you could try to establish (and follow) communication rules at the beginning of a session.

How to address potential overlaps or redundancies

  • Identify what is already included and what is missing in your curriculum related to clinical reasoning outcomes, and compare it to the DID-ACT blueprint. Prioritize learning outcomes that are not yet covered but are regarded as important.
  • Identify activities, resources, or teaching sessions with similar learning outcomes that may need change anyway, because of low evaluation results or because teachers or students struggle with them. These could be suitable candidates for adding or replacing parts with DID-ACT activities.
  • Ask teachers and students about the overlaps and gaps they see in their teaching/learning of clinical reasoning and where they struggle. This could also be done in a reflection round after related teaching activities in the curriculum.
  • Although longitudinal integration is the ideal aim, we recommend starting small with a pilot implementation to gain experience and develop a showcase.

How to teach in an interprofessional setting

  • Allow enough time before the teaching for the organization and motivation/encouragement of stakeholders and participants.
  • Allow enough time and guidance during the course so that participants from the different professions can get to know each other and each other’s professions and discuss their different perspectives. This may mean budgeting extra time beyond the suggested duration of the learning unit.
  • There may be different understandings of clinical reasoning across the health professions, so we recommend making participants aware of this. You could, for example, use and adapt activities from the learning units on the health profession roles to facilitate this.
  • Courses in an interprofessional setting should not come too early in the curriculum (not before the professions have formed their own professional identity; however, this also depends on the aim of the course).
  • Make sure you have enough participants from different professions. If possible, the facilitator should divide the participants into smaller groups with an equal distribution of professions.
  • Similarly, you need an equal distribution of facilitators from the different professions.
  • Develop customized learning materials that consider the different professions. If needed, you can adapt the material and activities provided in the DID-ACT curriculum.

Further recommended reading:

van Diggele, C., Roberts, C., Burgess, A. et al. Interprofessional education: tips for design and implementation. BMC Med Educ 20, 455 (2020). (Link)
