Although not the primary focus of this review, it is important to consider the effectiveness of interventions by examining their results. Evaluating the impact of interventions on behavior change (Kirkpatrick level 3) and system-level change (Kirkpatrick level 4) should be central to all intervention evaluations.[58] Demonstrating positive outcomes can motivate health professionals to adopt SDM-oriented behavior[59] and encourage policymakers to make systemic changes that support SDM implementation.[60]

Our study has both strengths and limitations. One strength is that we developed our own set of quality indicators, although these could not be validated because no existing standards were available. Nonetheless, the indicators were based on robust scientific evidence and theoretical foundations, providing a starting point for enhancing the comparability and reproducibility of study results. A major limitation of our study is that we did not contact authors for additional information about the interventions, which may have led to inaccuracies in judging whether the criteria of our framework were met. Nevertheless, our review underscores the critical need for improved intervention descriptions in published studies. Providing comprehensive details about interventions, including intervention materials, group size, training duration, and teaching strategies, facilitates the application of proven, high-quality, and effective interventions in various settings. Addressing these reporting inadequacies will contribute to the overall improvement of research quality in medical education.

Conclusion

The current literature lacks common standards for evaluating educational quality beyond outcome measurements. This gap has resulted in inconsistent study results, intervention heterogeneity, and variable reporting of outcomes.
Though yet to be validated, our evaluation framework addresses this gap by providing minimal standards based on scientific evidence and identifiable elements from existing checklists for describing educational interventions. Despite its simplicity compared with comprehensive checklists, our evaluation framework substantially improves on current practice and meets the expressed need for better training in SDM. However, when we applied our framework, only a limited number of studies met the educational quality requirements. Furthermore, we did not find a significant relationship between the criteria in our evaluation framework and the effectiveness of training interventions. The heterogeneity in study and intervention characteristics, training content that lacked a focus on SDM, and the variability in reporting of intervention details and outcomes may all have contributed to these findings.

Practice implications

Improving the quality of SDM training begins with developing and validating a robust educational evaluation framework. Such a framework should guide the design of future interventions, standardize reporting practices, and support the evaluation of intervention outcomes, thus enhancing the reproducibility of positive results. Our study's evaluation framework provides a foundational step towards this goal. Additionally, it is crucial for all SDM interventions to adopt a widely recognized definition of SDM when designing