
General discussion within the workplace environment might yield unintended results [7, 8]. Chapter 2 introduced additional confounding variables that made reliable outcome measurement challenging. Notably, many of the interventions did not adequately address the essential elements of SDM, a finding consistent with previous reviews of SDM teaching interventions [6, 9]. Another concerning finding was that only a minority of studies focusing on resident training met the essential training prerequisites proposed by our framework. This absence of key training elements likely reduced the effectiveness of the training, as the prerequisites for experiential and reflective learning were not adequately met. Furthermore, it became apparent that most outcomes relied on self-reported measures provided by the participants themselves, which adds to the challenge of objectively determining the effectiveness of SDM training. In addition, Chapter 2 revealed that studies generally described intervention details inadequately, which further complicates the evaluation of training efforts [10]. These poor study descriptions hamper the translation of promising interventions into different settings, thus limiting their potential impact.

Chapter 3 added to these concerns by revealing inadequacies in the tools used to evaluate SDM skills. In response, we designed, pilot-tested, and evaluated a tool to structurally collect patient feedback on residents’ SDM performance and consultation skills within Obstetrics and Gynecology. Even though patients are an integral part of a resident’s social environment, they are often overlooked in training and not routinely involved in efforts to teach or assess SDM skills. In this chapter, we deliberately selected patient feedback, systematically gathered after consultations with residents, as our intervention to bridge this gap in SDM training and assessment. After collecting this patient feedback, we interviewed the participating residents. To evaluate the outcomes of these interviews, we employed thematic content analysis guided by a framework rooted in reflexivity. Reflexivity, in this context, involves critically examining the assumptions that underlie one’s actions, considering the impact of those actions, and assessing what constitutes good clinical practice from a broader perspective [11]. This reflective approach closely aligns with the principles of reflective learning by Kolb, as discussed in Chapter 1 [3]. Residents recognized the importance of patient feedback but admitted they rarely sought it. We found no evidence that receiving patient feedback over time improved residents’ performance: patient ratings tended to stay the same, and residents were barely able to formulate any learning points from the feedback received. In other words, receiving patient feedback alone did not result in more reflective practice among the study participants in Chapter 3, nor did it lead to reflection-on-action or practice change. In our discussion section, we contemplated these findings and inferred that the absence of guided or facilitated reflection may have contributed to the limited development of reflective practice in this context [12].

In conclusion, there is a paucity of evidence in the literature regarding the effectiveness of training characteristics. When we examined and applied the existing evidence, we found that many
