Quality Criteria

On this page you’ll find the documented measurable quality criteria for DID-ACT project deliverables.


Quality Criteria for D1.1 (a) Report on specific needs, preoccupations of stakeholders, and barriers: web survey results

  • ≥ 35 interview participants (approx. 5 per partner)
  • ≥ 70 survey responses (approx. 20 per partner)
  • Interview guideline for implementation and analysis provided
  • Diverse range of participants (faculty, students, educators,…) for the interviews and survey
  • Survey, interview questions, and report reviewed and agreed upon by all partners
  • Pre-testing of survey (think aloud, piloting with partners)
  • Conducting as many interviews as necessary for a clear understanding of current practices and barriers
  • Scientific publication as a long-term quality criterion beyond this deliverable

Quality Criteria for D1.1 (b) Report on specific needs, preoccupations of stakeholders, and barriers: interviews

  • Interview guideline for implementation and analysis provided
  • Survey and interview questions reviewed and agreed upon by all partners
  • Diverse range of participants (faculty, students, educators,…) for the interviews and survey
  • Conducting as many interviews as necessary for a clear understanding of current practices and barriers
  • ≥ 35 interview participants

Quality Criteria for D1.2 Report on solutions for the needs described in D1.1 and consequences for the curriculum development process

  • Implementation of a design thinking workshop informed by the interview and survey results (deliverables D1.1a and D1.1b) according to best practices (had to be adapted to an online format).
  • Participation of all partners in the design thinking workshop and provision of an open atmosphere for discussion.
  • Prioritization of solutions by the partners regarding their importance and feasibility.
  • Provide and discuss solutions on an institutional and project level to ensure applicability also for non-partner institutions.
  • Discuss consequences for all identified barriers, even if no solutions can be provided by DID-ACT for all barriers.

Quality Criteria for D2.1 Learning objectives for the clinical reasoning curriculum in a database.

  • Consensus workshop to discuss CR frameworks, one for students and one for faculty.
  • Review CR frameworks in relation to the needs analysis and the literature
  • A graphical representation describing aspects and dimensions of the CR curriculum frameworks will be reviewed by two associate partners and an external expert in addition to project partners.

Quality Criteria for D2.2 Curricular framework with teaching/assessment methods for the student curriculum and the train-the-trainer course

  • Consensus workshop to discuss the clinical reasoning framework for the student curriculum and the train-the-trainer course.
  • Review clinical reasoning frameworks in relation to the needs analysis and the literature.
  • A tabular representation describing aspects and dimensions of the clinical reasoning curriculum frameworks will be reviewed by associate partners.
  • Constructive alignment serves as a guiding principle for the learning objectives and curricular framework development.

Quality Criteria for D2.3 Collection of available open educational resources (OER) suitable to be used in WP3 and 4

  • Review of each resource by at least one DID-ACT partner other than the one entering the resource, in order to ensure access and relevance.

Quality Criteria for D2.4 Publication of recommendations for learning objectives of a clinical reasoning curriculum

  • As a base for the recommendations we will analyze at least five different learning objectives catalogs from different healthcare professions and partner countries.
  • Consensus workshop in which different competences and perspectives in the DID-ACT team are used to provide input to the recommendations.

Quality criteria D3.1 Course outline and online course material

  • Tailored and adaptable to the educators’ needs
  • Easily accessible, applicable, and understandable
  • Based on the DID-ACT curricular framework (D2.2)
  • Reviewed by all partners and additionally by associate partners

Quality criteria D3.2 Pilot implementations Train-the-Trainer Course(s)

  • > 50 participants (grant proposal)
  • Thoroughly evaluated incl. direct feedback & learning analytics
  • Covering a wide range of train-the-trainer course topics that fit the partners’ faculty development programs
  • Piloting of at least two of the same learning units by 2–3 partners
  • Piloting of at least two courses in an inter-/multiprofessional setting

Quality criteria D3.3 Refinement of course(s)

  • Workshop on sharing experiences of pilots at the beginning of this deliverable
  • Considering all issues identified in the pilot implementation
  • In close cooperation with target group(s), partners, and associate partners and with repeated feedback rounds
  • Discussed and agreed upon by all partners

Quality criteria D3.4 Development of course certificate (In Review)

  • Meets the needs of the target group (refine after WP1)
  • Fulfills local and national requirements
  • Automatically created for end-user
  • DID-ACT layout

Quality criteria D4.1 Course outline and online course material

  • Tailored and adaptable to the learners’ and educators’ needs
  • Easily accessible, applicable, and understandable
  • Usage of OERs where applicable
  • Based on the DID-ACT curricular framework (D2.2)
  • Reviewed by all partners and additionally by associate partners

Quality criteria D4.2 Pilot implementations Student Course(s)

  • > 500 participants
  • Thoroughly evaluated incl. direct feedback & learning analytics
  • Running at partner institutions (except INS) + 3 other institutions
  • Covering a wide range of topics that are relevant for the partner curricula
  • Piloting of at least two of the same learning units by 2–3 partners
  • Piloting of at least two courses in an inter-/multiprofessional setting

Quality criteria D4.3 Refinement of course(s)

  • Workshop on sharing experiences of pilots at the beginning of this deliverable
  • Considering all issues identified in the pilot implementation
  • In close cooperation with target group(s), partners, and associate partners and with repeated feedback rounds
  • Discussed and agreed upon by all partners

Quality criteria D4.4 Plans for long-term curricular integration (In Review)

  • Each partner to provide a plan
  • Involvement of at least 3 associate partners or external stakeholders
  • Aligned with D7.3
  • Plans include description, milestones, stakeholders, potential costs

Quality Criteria for D5.1 Set of evaluation and analysis tools

  • Recommendations will be made for two separate course evaluation target groups: students and educators
  • We will select evaluation tools that measure outcomes at the reaction level (level 1) in the Kirkpatrick model
  • In addition, we will select evaluation tools that include self-assessment items to cover DID-ACT learning goals set in D2.1
  • The selection of evaluation items and tools will be informed by a literature review to promote re-use of validated items and tools
  • Data covered by the learning analytics models will be mapped against the learning goals set in D2.1 and learning methods from D2.2
  • The selected learning analytics tools should not require additional payment
  • Selected usability evaluation tool should meet standards in web ergonomics evaluation

Quality Criteria for D5.2 Evaluation of train-the-trainer course

  • Selection of evaluation tools
  • Each partner delivers an evaluation schedule
  • Collected data provided by each partner institution
  • Analysis of collected data based on state-of-the-art analytical methods
  • Report and interpretation of results reviewed and agreed by all partners

Quality Criteria for D5.3 Evaluation of pilot student curriculum

  • Selection of evaluation tools
  • Each partner delivers an evaluation schedule
  • Collected data uploaded from each partner institution
  • Analysis of collected data
  • Report and interpretation of results reviewed and agreed by all partners

Quality Criteria for D6.1 Planning and introduction into the agile project management

  • Face-to-face session at the kick-off meeting
  • Limited number of support questions
  • Feedback from team members

Quality Criteria for D6.2 Quality criteria for each deliverable (In Review)

  • For the defined criteria, please see the section for each deliverable on this page
  • Defined quality criteria fulfilled for all deliverables

Quality Criteria for D6.3 Monitoring of the working progress

  • Half-year reports on project outcomes, documenting whether deliverables were on track and any deviations from the project proposal
  • Report reviewed by all partners

Quality Criteria for D7.1 Social network analysis (SNA) & SM strategy

  • SNA: Includes all partners present at the kick-off meeting and the channels they are most active in
  • SNA: provides practical conclusions for the social media strategy
  • SNA: considers data privacy aspects and does not include or publish any personal, not publicly available information
  • SM: > 200 new followers by M36
  • SM: average of 4 posts/month across all active channels
  • SM: followers from > 200 institutions worldwide

Quality Criteria for D7.2 Website and learning management platform

  • Website: > 50,000 website hits
  • Website: > 5,000 unique visitors
  • Website: Apply rules for accessibility, usability, access for disabled persons
  • LMS: Apply rules for usability, accessibility (also for disabled persons)
  • LMS: Evaluation of ≥ 5 different LMS
  • LMS: Solution fulfills the defined requirements

Quality Criteria for D7.3 Guideline and support for integrating the train-the-trainer course and the student curriculum into non-partner healthcare institutions

  • > 50 downloads in M33 – 36 (grant proposal p.52)
  • ≥ 2 external reviews from associate partners or other stakeholders
  • Reviewed and agreed by all partners

Quality Criteria for D7.4 Dissemination events

  • > 200 participants in workshops, presentations during the project
  • Evaluation of dissemination events where possible: feedback from participants
  • ≥ 10 dissemination events / congress abstracts (national, international, non-partner countries)
  • At least 1 dissemination event per profession at interprofessional and/or nursing conferences (national/international)

Quality Criteria for D7.5 Developing Sustainability model (In Review)

  • Reviewed and agreed upon by all partners
  • Basic costs calculation for all activities
  • Defined future roles for all partners and associate partners

Quality Criteria for D8.1 Signed consortium agreements

  • Signed agreements from all partners
  • Signed amendments from all partners, when necessary

Quality Criteria for D8.2 Kick-Off Meeting

  • Kick-Off meeting in Augsburg

Quality Criteria for D8.3 Yearly Project Reports

  • Reviewed and agreed upon by all partners
