
Application of the skills network approach to measure physician competence in shared decision making based on self-assessment

  • Levente Kriston ,

    Contributed equally to this work with: Levente Kriston, Lea Schumacher

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Validation, Visualization, Writing – original draft

    l.kriston@uke.de

    Affiliation Department of Medical Psychology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Lea Schumacher ,

    Contributed equally to this work with: Levente Kriston, Lea Schumacher

    Roles Data curation, Formal analysis, Investigation, Methodology, Visualization, Writing – original draft

    Affiliation Department of Medical Psychology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Pola Hahlweg,

    Roles Investigation, Validation, Writing – review & editing

    Affiliation Department of Medical Psychology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Martin Härter,

    Roles Funding acquisition, Investigation, Resources, Validation, Writing – review & editing

    Affiliation Department of Medical Psychology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Isabelle Scholl

    Roles Data curation, Investigation, Project administration, Supervision, Validation, Writing – review & editing

    Affiliation Department of Medical Psychology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

Correction

3 Nov 2023: Kriston L, Schumacher L, Hahlweg P, Härter M, Scholl I (2023) Correction: Application of the skills network approach to measure physician competence in shared decision making based on self-assessment. PLOS ONE 18(11): e0294211. https://doi.org/10.1371/journal.pone.0294211

Abstract

Several approaches to and definitions of ‘shared decision making’ (SDM) exist, which makes measurement challenging. Recently, a skills network approach was proposed that conceptualizes SDM competence as an organized network of interacting SDM skills. With this approach, it was possible to accurately predict observer-rated SDM competence of physicians from patients’ assessments of the physicians’ SDM skills. The aim of this study was to assess whether the skills network approach allows observer-rated SDM competence of physicians to be predicted from their self-reported SDM skills. We conducted a secondary analysis of data from an observational study in which outpatient care physicians rated their use of SDM skills during consultations with chronically ill adult patients using the physician version of the 9-item Shared Decision Making Questionnaire (SDM-Q-Doc). Based on the estimated association of each skill with all other skills, an SDM skills network was constructed for each physician. Network parameters were used to predict observer-rated SDM competence, which was determined from audio-recorded consultations using three widely used measures (OPTION-12, OPTION-5, Four Habits Coding Scheme). In our study, 28 physicians rated consultations with 308 patients. The skill ‘deliberating the decision’ was central in the population skills network averaged across physicians. The correlation between parameters of the skills networks and observer-rated competence ranged from 0.65 to 0.82 across analyses. The use and connectedness of the skill ‘eliciting treatment preference of the patient’ showed the strongest unique association with observer-rated competence. Thus, we found evidence that processing SDM skill ratings from the physicians’ perspective according to the skills network approach offers new theoretically and empirically grounded opportunities for the assessment of SDM competence. A feasible and robust measurement of SDM competence is essential for research on SDM and can be applied for evaluating SDM competence during medical education, for training evaluation, and for quality management purposes. [A plain language summary of the study is available at https://osf.io/3wy4v.]

Introduction

An essential patient-centered communication competence in health care delivery is the ability to support shared decision making (SDM) in medical consultations. SDM is frequently described as an interpersonal decision-making process with a strong emphasis on a balanced flow and exchange of information, values, preferences, power, and responsibility between the patient and the health care professional during medical consultations [1, 2]. SDM has been considered ethical in medical consultations because it ensures that patients are informed about the various treatment options and that patients’ preferences are valued in medical decision making [3]. This seems particularly important considering that physicians’ assumptions about their patients’ preferences often do not match the patients’ actual preferences, and patients tend to choose different treatment options when they are better informed [4]. Further, SDM may help to reduce the use of inappropriate tests and interventions when their benefits and drawbacks are clearly discussed [5] and could lead to better medication adherence [6]. Finally, as patients tend to choose more conservative options when asked, SDM might even reduce health care costs [7]. Thus, it is not surprising that major health care organizations have adopted the principles of SDM [8–10].

Although several definitions of SDM exist, it is rarely acknowledged explicitly that the same term can refer to ontologically very different concepts [11]. It is frequently unclear whether ‘SDM’ is used to denote observable attributes of the communication process in a medical encounter, the perception of these attributes by the patient or the physician, attitudes of the participating individuals, a specific method or technique that physicians can utilize, a general philosophy of shaping health care, or a scientific model of medical communication. Conceptual clarity is indispensable for the measurement of latent constructs [12]. From a competence-focused perspective, SDM competence can be defined as the physician’s ability to use specific behavioral skills in a way that supports building a consensus with the patient regarding the favored treatment among multiple viable options, in accordance with the patient’s preferences and values [13]. According to this approach, SDM competence requires physicians to organize a defined set of behavioral skills into a pattern or network that makes a patient-centered decision in the medical consultation more likely.

In a recent study, we found that modelling SDM competence as a network of skills can be used to predict physicians’ observer-rated SDM competence [13]. In that study, patients rated the degree to which certain SDM-related skills were shown by their physicians in routine medical consultations. These ratings were used to create an SDM skills network for each physician, which models how individual SDM skills are related to each other. Attributes of these networks, e.g., how strongly a skill was related to other skills, predicted observer-rated competence with high accuracy. Using this approach with skill ratings from sources other than patients would substantiate the validity of conceptualizing SDM competence as an organized network of behavioral skills. In the present study, we investigated whether processing physician-reported data on their SDM skills according to the skills network approach can be used to predict observer-rated SDM competence.

Materials and methods

Design and procedures

The design of the present study was based on a previous investigation [13]. We re-analyzed data from a study on measuring SDM, collected between August 2009 and September 2010 in Hamburg, Germany [14]. In that study, consultations between adult patients with chronic conditions who faced a treatment decision and physicians providing primary and specialty outpatient care were examined using ratings from patients, physicians, and external observers. The investigators aimed to include thirty physicians with written documentation of ten consultations and audio-recordings of three consultations each. The ethics committee of the state chamber of physicians in Hamburg approved the study protocol (record no. PV3180). All participants provided written informed consent. In the present analysis, we used data from the physicians and the external observers.

Measures

Basic demographic and clinical data on the participating patients and physicians were collected by administering written questionnaires.

Physician-reported data on SDM skills were collected with the physician version of the 9-item Shared Decision Making Questionnaire (SDM-Q-Doc), which was filled out after the respective consultations [15]. This measure requires physicians to rate the degree to which they showed nine behaviors in the consultation using a six-step Likert-type scale ranging from zero to five. The behaviors captured by the SDM-Q-Doc correspond to key SDM skills: focusing the decision, sharing the decision, presenting options, informing on options, supporting comprehension, eliciting preferences, deliberating the decision, selecting an option, and planning actions [13, 15].

Observer-rated SDM competence of the physicians was measured from the audio-recorded consultations with three widely used validated measures: the OPTION-12 [16, 17], the OPTION-5 [18, 19], and the Invest in the End subscale of the Four Habits Coding Scheme (4HCS) [20, 21]. We included all three measures in the present analysis because they capture SDM competence from different perspectives. The OPTION measures focus on decision making, while the 4HCS primarily assesses communication. We included the OPTION-5 in addition to the OPTION-12 because it has a stronger focus on patient preferences and is based on a revised model of SDM [18]. As empirical analysis has also shown, the OPTION-12, the OPTION-5, and the Invest in the End subscale of the 4HCS capture overlapping but notably distinct constructs [13].

Two independent raters, trained in pilot sessions and with rating manuals to achieve sufficient agreement, assessed each consultation. Inter-rater reliability of the averaged ratings of the physicians’ SDM competence ranged from 0.69 to 0.76 across instruments, indicating substantial agreement between raters [13]. Raters were blinded to the results of the assessments with the other measures. For analysis, we transformed all scores to range from 0 to 100, with higher values indicating a higher level of SDM competence. Each measure was averaged across consultations to obtain three observer-rated SDM competence scores for each physician. The validity of this method of estimating competence was supported by substantial physician-level variance of the three scores and moderate to high physician-level correlations between them [13]. This means that while SDM competence varied considerably between physicians, the three observer-rated measures indicated similar SDM competence for each individual physician. An overview of the design, measures, and analysis is displayed in Fig 1.
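
As a minimal illustration of this preprocessing step (not the authors’ actual code, which is linked in the Statistical analysis section), the sketch below rescales hypothetical raw observer scores to the 0–100 range and averages them per physician; the column names and raw score ranges are assumptions, not the actual instrument ranges.

```r
# Minimal sketch of the preprocessing described above: rescaling raw observer
# scores to 0-100 and averaging them per physician. Column names and raw score
# ranges are illustrative placeholders, not the actual instrument ranges.
rescale_0_100 <- function(x, min_raw, max_raw) {
  (x - min_raw) / (max_raw - min_raw) * 100
}

# obs: one row per audio-recorded consultation with raw observer ratings
obs <- data.frame(
  physician_id = c(1, 1, 1, 2, 2, 2),
  option12_raw = c(24, 30, 27, 12, 15, 18),  # assumed raw range 0-48
  option5_raw  = c(10, 12, 11,  5,  6,  7),  # assumed raw range 0-20
  hcs_raw      = c(14, 16, 15,  8,  9, 10)   # assumed raw subscale range 0-20
)

obs$option12 <- rescale_0_100(obs$option12_raw, 0, 48)
obs$option5  <- rescale_0_100(obs$option5_raw,  0, 20)
obs$hcs      <- rescale_0_100(obs$hcs_raw,      0, 20)

# One observer-rated SDM competence score per physician and measure
competence <- aggregate(cbind(option12, option5, hcs) ~ physician_id,
                        data = obs, FUN = mean)
```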

Fig 1. Overview of the research design, measures and analysis.

SDM, Shared Decision Making; SDM-Q-Doc, Shared Decision Making Questionnaire—physician version; 4HCS, Invest in the End subscale of the Four Habits Coding Scheme.

https://doi.org/10.1371/journal.pone.0282283.g001

Statistical analysis

The physicians’ self-rated data on their SDM skills were analyzed according to the skills network model of competence [13]. We assessed the associations between the nine SDM skills and constructed a skills network for each physician. These networks display individual SDM skills as nodes. The connections between nodes are called edges, which indicate how strongly individual SDM skills are related to each other.

For each SDM skill, a Bayesian multilevel linear regression was estimated with the skill as the outcome variable and all other skills as predictors, based on the physician-rated data from all consultations of all physicians. The intercept and slopes were allowed to vary between physicians, yielding estimates for each individual physician. Thus, the strength of the associations between individual SDM skills was expected to vary across physicians. Bayesian analysis requires the definition of a prior distribution for each estimated parameter. This prior distribution is updated during the analysis by combining it with the observed data to obtain a posterior distribution, which indicates how probable certain values of the estimated parameter are. We used weakly informative priors, reflecting that we had an approximate but not exact idea of the expected size of the statistical parameters before calculation (see S1 File). If more than two of the nine skills were missing for a consultation, data points from that consultation were excluded. One or two missing ratings per consultation were imputed using the expectation-maximization algorithm.
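
A minimal sketch of one such skill-level model in brms, assuming a data set sdm_data with one row per consultation, columns skill1 to skill9 for the SDM-Q-Doc items, and a physician identifier; the variable names and the prior shown are illustrative placeholders (the priors actually used are given in S1 File), and the authors’ full code is available at the OSF repository linked below.

```r
library(brms)

# Regression for one skill (here: item 1, 'focusing the decision') on the
# remaining eight skills; intercept and slopes vary by physician so that
# physician-specific coefficients can be extracted afterwards.
# Data set and variable names (sdm_data, skill1..skill9, physician_id) are
# assumptions for illustration.
fit_skill1 <- brm(
  skill1 ~ skill2 + skill3 + skill4 + skill5 + skill6 + skill7 + skill8 + skill9 +
    (1 + skill2 + skill3 + skill4 + skill5 + skill6 + skill7 + skill8 + skill9 | physician_id),
  data   = sdm_data,
  family = gaussian(),
  prior  = set_prior("normal(0, 1)", class = "b"),  # weakly informative (placeholder)
  chains = 4, iter = 20000, warmup = 12000, thin = 10,
  seed   = 1
)

# Physician-specific coefficients (population-level effects plus
# physician-level deviations): array of physician x statistic x parameter
phys_coefs <- coef(fit_skill1)$physician_id
```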

Based on the estimated coefficients from the multilevel regression models described above, a skills network was constructed for each physician. The regression estimates describing the direction and strength of the associations between the different skills for each physician were used as edge weights. When the 95% credible interval of a regression estimate included zero, the corresponding edge was excluded to avoid spurious associations. Nodes in the skills networks were placed using the Fruchterman-Reingold algorithm, so that, as far as possible in two dimensions, the distance between nodes reflects the strength of their association [22]. Consequently, skills that were strongly related were placed closer to each other in the networks. Three network parameters, namely activation, outstrength, and instrength, were calculated for each skill of each physician. Activation of a skill was defined as the mean of that skill across consultations, i.e., how strongly each physician indicated having used the skill across their consultations. Outstrength of a skill was calculated by summing the weights of the outgoing edges of that skill and indicates how strongly the skill influences other skills. Instrength was calculated by summing the weights of the ingoing edges of that skill, showing how strongly the skill is influenced by other skills. In addition to the physician-specific networks, we created a population network by averaging the network parameters across physicians. Thus, in addition to a network for each physician, a population network showing how skills are related on average across all physicians was also created. A more detailed description, including step-by-step instructions for the calculations, can be found elsewhere [13].
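
The following sketch illustrates how these network parameters and the plot can be obtained once the physician-specific coefficients have been arranged into a 9 x 9 weight matrix W (rows = predicting skill, columns = predicted skill, with edges whose 95% credible interval contained zero set to 0). W and the consultation-level rating matrix skill_ratings are assumed inputs, and summing signed rather than absolute edge weights is one possible reading of the description above.

```r
library(qgraph)

# Assumed inputs:
#   W             9 x 9 matrix, W[j, i] = coefficient of skill j in the model
#                 predicting skill i (non-credible edges already set to 0)
#   skill_ratings consultations x 9 matrix of this physician's item scores (0-5)
skills <- c("focusing", "sharing", "presenting", "informing", "comprehension",
            "preferences", "deliberating", "selecting", "planning")
dimnames(W) <- list(from = skills, to = skills)

# Network parameters as defined above
outstrength <- rowSums(W)               # summed weights of outgoing edges
instrength  <- colSums(W)               # summed weights of ingoing edges
activation  <- colMeans(skill_ratings)  # mean use of each skill across consultations

# Directed network plot; layout = "spring" is the Fruchterman-Reingold algorithm,
# and the pie around each node shows activation rescaled to [0, 1]
qgraph(W, layout = "spring", directed = TRUE,
       labels = 1:9, pie = activation / 5)
```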

Finally, we performed Bayesian linear regression analyses to test whether the network parameters of each physician can predict observer-rated SDM competence as measured with the OPTION-12, the OPTION-5, and the Invest in the End subscale of the 4HCS. By doing so, we tested whether characteristics of the skills networks predicted the SDM competence of individual physicians as rated by external observers. First, a confirmatory model with the activation, outstrength, and instrength of the skills ‘focusing the decision’, ‘eliciting preferences’, and ‘deliberating the decision’ as predictors was tested, since these skills were relevant in the previous analysis of patient-rated data [13]. We used informative priors with means and standard deviations estimated from the posterior distributions of the estimates observed in the analysis of the patient-reported data (see S1 File) [13]. Subsequently, we created an exploratory model to investigate whether ignoring previous results changes the conclusions substantively. For this, three Bayesian linear regression models were fitted for each observer-rated measure of SDM competence, with the activation, the instrength, and the outstrength of all skills as predictors, respectively. The network parameters of the skills that were significant predictors in this first step for at least one of the observer-rated measures were then used as predictors of the three observer-rated measures in the final exploratory model. Weakly informative priors were chosen for all exploratory analyses (see S1 File).
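
A sketch of the confirmatory prediction step, assuming a physician-level data frame phys_df with one row per physician containing the rescaled observer scores (e.g., option12) and the network parameters of the three pre-specified skills (e.g., act_preferences for the activation of ‘eliciting preferences’); the prior values shown are placeholders for the posterior summaries taken from the patient-based analysis (S1 File), not the actual numbers.

```r
library(brms)

# Informative priors: in the actual analysis, means and SDs were taken from the
# posterior of the patient-based study; the values below are placeholders only.
confirmatory_priors <- c(
  set_prior("normal(0.4, 0.2)", class = "b", coef = "act_preferences"),
  set_prior("normal(-0.1, 0.2)", class = "b", coef = "out_deliberating")
  # ... one prior per predictor, taken from the previous study's posteriors
)

fit_option12 <- brm(
  option12 ~ act_focusing + out_focusing + in_focusing +
    act_preferences + out_preferences + in_preferences +
    act_deliberating + out_deliberating + in_deliberating,
  data = phys_df, family = gaussian(),
  prior = confirmatory_priors,
  chains = 4, iter = 20000, warmup = 12000, thin = 10
)

# A coefficient is treated as 'significant' when its 95% credible interval
# excludes zero
fixef(fit_option12, probs = c(0.025, 0.975))
```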

All analyses were conducted in R version 4.0.4 [23]. Bayesian (multilevel) regression analyses were conducted with the package brms, utilizing Markov chain Monte Carlo sampling [24]. Networks were plotted using qgraph [25]. All regression models were run with four chains, a total of 20,000 iterations, a thinning rate of 10, and 12,000 burn-in simulations, resulting in a posterior sample of 2,000. For each model, the Gelman-Rubin potential scale reduction statistic [26] and traceplots were checked for convergence. We labeled a regression coefficient as statistically significant when its 95% credible interval did not include zero. The R code of all analyses is available at https://osf.io/z7368/.
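
As a sketch of the convergence checks and the predictive-accuracy summary mentioned here (assuming a fitted brms model such as fit_option12 from the sketch above; the authors’ actual code is at the OSF link):

```r
# Gelman-Rubin potential scale reduction statistic (values close to 1
# indicate convergence) and trace/density plots for visual inspection
max(rhat(fit_option12))
plot(fit_option12)

# Correlation between model-predicted and observed competence scores,
# the accuracy summary reported in the Results
pred <- colMeans(posterior_epred(fit_option12))
cor(pred, phys_df$option12)
```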

Results

Sample

In the original study, 33 physicians agreed to participate [14], of whom 28 provided self-assessments of their SDM skills in 326 consultations. Ratings of 18 consultations were excluded because they had more than two missing data points, resulting in data from 308 consultations being included in the analyses (on average 11 consultations per physician). Audio recordings were available from 24 physicians and 80 consultations (on average 3.3 consultations per physician).

Over 70 percent of the participating physicians (42.9 percent female, mean age 50.4 years) were specialized in family or internal medicine, and fewer than one in four had 20 or more years of professional experience (Table 1). The majority of the patients in the investigated consultations (60.3 percent female, mean age 54.2 years) were married, had a low to medium formal education, and were employed or retired (Table 2). Type 2 diabetes, chronic back pain, and depressive disorder each accounted for about one third of the patients’ diagnoses. The subsample of physicians and patients contributing audio-recorded consultations was comparable to the total sample.

Population network of SDM skills

The average skills network (Fig 2) showed that the skills ‘focusing the decision’ and ‘sharing the decision’ were, despite their strong reciprocal association, disconnected from the remaining network, suggesting that these skills were only related to each other. ‘Presenting options’, ‘informing on options’, ‘eliciting preferences’ and ‘selecting an option’ were strongly connected, with ‘deliberating the decision’ being in the center of this skill cluster, showing a high level of interrelatedness between these skills. The skills ‘supporting comprehension’ and ‘planning actions’ were more peripheral in the skills network, as they were only related to ‘informing on options’ and ‘selecting an option’, respectively.

Fig 2. Average skills network across physicians.

The width of the arrows represents the strength of the skills associations. The pie around each node indicates the extent of activation of each item. The labels refer to the following skills: 1. focusing the decision; 2. sharing the decision; 3. presenting options; 4. informing on options; 5. supporting comprehension; 6. eliciting preferences; 7. deliberating the decision; 8. selecting an option; 9. planning actions.

https://doi.org/10.1371/journal.pone.0282283.g002

On average, the skill ‘planning actions’ was most frequently used (Fig 3, panel A). ‘Presenting options’ had the strongest influence on other skills (Fig 3, panel B), and ‘informing on options’ was most strongly influenced by other skills (Fig 3, panel C). There was considerable variation between the physicians in their network structure and network parameters (Fig 3; skills networks of individual physicians can be seen in S1 Fig). Thus, how skills were related to each other differed between physicians.

Fig 3. Network parameters of the investigated skills.

Black dots represent the average score, and grey dots indicate estimates from each physician network. The labels refer to the following skills: 1. focusing the decision; 2. sharing the decision; 3. presenting options; 4. informing on options; 5. supporting comprehension; 6. eliciting preferences; 7. deliberating the decision; 8. selecting an option; 9. planning actions.

https://doi.org/10.1371/journal.pone.0282283.g003

Fig 4. Calibration plots for the confirmatory and exploratory prediction of observer-rated SDM competence.

Panels A, B and C show predicted and observed scores for the confirmatory model, panels D, E and F for the exploratory model. Black dots represent the physicians’ scores; smoothing (loess) curves for each outcome are displayed as grey lines. 4HCS, Four Habits Coding Scheme.

https://doi.org/10.1371/journal.pone.0282283.g004

Confirmatory prediction of observed SDM competence from skills networks

The skill ‘eliciting preferences’ played an important role in the prediction of observer-rated SDM competence in the confirmatory model: its activation was significantly positively related to SDM competence as rated with the OPTION-12 and the OPTION-5, and its outstrength was significantly positively related to SDM competence as rated with the 4HCS (Table 3). This means that how often this skill was used and how strongly it was associated with other skills predicted observer-rated SDM competence. Further, the outstrength of ‘deliberating the decision’ was significantly negatively associated with SDM competence as measured with the OPTION-12. This indicates that when a physician’s network showed ‘deliberating the decision’ strongly influencing other skills, the SDM competence of that physician was rated lower. The confirmatory model explained about half of the variance of observer-rated SDM competence, with correlations between predicted and observed values ranging from 0.65 to 0.75. Thus, skills network characteristics explained a considerable amount of variation in the observer-rated SDM competence of physicians. Predicted and observed values of the confirmatory models are depicted in Fig 4, panels A-C.

Table 3. Confirmatory prediction of observed SDM competence from network parameters.

https://doi.org/10.1371/journal.pone.0282283.t003

Exploratory prediction of observed SDM competence from skills networks

When observer-rated SDM competence was regressed on the activation, instrength, and outstrength of all skills, the skills ‘focusing the decision’, ‘presenting options’, ‘informing on options’, and ‘eliciting preferences’ were significantly related to at least one of the three observer measures (S1–S3 Tables). Results from the subsequent analysis, which included the activation, instrength, and outstrength of these four skills, are reported in Table 4. Only the activation of ‘eliciting preferences’ was significantly related to SDM competence as measured by the OPTION-5. Still, the model explained about half of the variance of each of the observer measures, with multiple correlation coefficients ranging from 0.69 to 0.82 (Table 4). Predicted and observed values of the exploratory models are displayed in Fig 4, panels D-F.

Table 4. Exploratory prediction of observed SDM competence from network parameters.

https://doi.org/10.1371/journal.pone.0282283.t004

Discussion

A wide range of empirical results suggests that physicians have a limited ability to assess their professional competences accurately [27]. This includes communication competences, where studies frequently show a lack of association between physicians’ self-assessments and external ratings by trained observers [28–30]. Here, we found encouraging evidence that physicians’ self-assessments of behavioral skills can be used to measure competence, even though this measurement is computationally more complex than using simple (averaged) global ratings as a direct measure of competence.

In the population network, the most central SDM skills were presenting options, informing on options, eliciting preferences, deliberating the decision, and selecting an option. Supporting comprehension and planning actions seem to be somewhat more peripheral skills, while focusing the decision and sharing the decision are (albeit strongly associated with each other) completely disconnected from the rest of the network. This architecture is strikingly similar to the structure of the population network of SDM skills based on patient-reported data [13], even though patient and physician assessments of the specific skills from the same consultation considerably disagreed in previous investigations [31, 32]. It should also be noted that, although we did not attempt to cluster skills in the present study explicitly, the identified structure of the SDM skills shows similarities with the categorization of the skills postulated by the three-talk model of SDM by Elwyn and colleagues [33]. These findings suggest that skills networks are able to capture a robust and replicable physician-level construct, which we hypothesize to be SDM competence.

The validity of interpreting the information contained in the network structure as an indicator of SDM competence was supported by its association with observer-rated data. In a confirmatory approach, we found that combining data-based inference with findings from the analysis of patient-reported data [13] (in the form of informative priors for the Bayesian analysis) produced strong predictions of observer-rated competence. In the spirit of a continuous Bayesian accumulation of evidence, the results of the confirmatory analysis can be considered to quantitatively synthesize the findings of the previously reported investigation using patient-reported data and the current study based on physicians’ self-assessment. Results of the exploratory analysis led to models with even stronger predictive accuracy. This indicates that skills networks based on physicians’ self-assessment of their SDM skills were highly predictive of their SDM competence as rated by external observers. In general, the findings support the hypothesis that patient- and physician-rated data may be used interchangeably for competence assessment if handled in the context of the network approach.

Both patients’ and physicians’ ratings of SDM, when processed according to the skills network approach, seem to yield an objective assessment of SDM competence that is strongly related to external assessments of this competence. This finding has various implications. From a theoretical perspective, it suggests a new definition of professional competence, which can be contrasted with and integrated into existing ones [34]. For the network science of psychological phenomena [35], it means a methodological extension and a new field of application. Lastly, for assessing professional SDM competence [36], it offers a new way of measurement based on physicians’ self-ratings. By applying the skills network model of SDM competence to physician-rated data, we provided a promising opportunity for a feasible assessment of SDM competence. Self-ratings are, in contrast to observer ratings, more easily applicable and less time-intensive, offering a genuine opportunity for application in routine practice.

Since measuring SDM competence with skills networks seems to offer a replicable and robust assessment of this professional skill (with high agreement between patient, physician, and observer assessments), the proposed method could be applied in areas where a feasible and robust assessment of SDM competence is greatly needed. First, research on SDM depends largely on valid measurement of SDM competence, for example to assess predictors and treatment outcomes for different levels of SDM competence. Considering that assessing competence by observation is very resource intensive, utilizing brief self-assessments increases the range of options for research projects. Second, the assessment of this competence is of central importance for evaluating the effectiveness of SDM trainings, including the education of health care professionals. Novel measures that do not require external judgment by qualified experts could contribute to a more comprehensive evaluation of interventions aiming to implement SDM. Finally, the network approach to SDM competence could be applied when assessing SDM as part of quality management in routine clinical care. Here, a robust assessment can be obtained from quite easily attainable patient or physician ratings of SDM. In this context, analysis of a continuous data stream from SDM surveys may enable monitoring of the SDM competence of individuals, teams, departments, or hospitals. Furthermore, a detailed analysis of the obtained skills networks could reveal specific and actionable targets (i.e., skills or skill connections) for improvement. Being able to create individual skills networks and to precisely pinpoint skills and skill connections that need to be improved could open the way to data-driven and individualized measurement, education, training, and monitoring of complex competences.

The current findings are limited by the restricted sample size and the considerable complexity of the statistical models relative to the sample size. These factors are likely to be partly responsible for the wide credible intervals of the estimated parameters. Due to this imprecision and to collinearity between network parameters, the influence of specific network parameters of individual skills could be investigated only to a limited extent. As network parameters were correlated with each other, it remains unclear how each individual network parameter relates to observer-rated SDM competence and which network parameters are most important for indicating SDM competence. Jointly, the network parameters showed high predictive accuracy for observer-rated SDM competence, and future studies need to assess which specific network parameters contribute most to this. Furthermore, since the approach has only been applied to data from a self-selected sample from outpatient care in Germany, generalizability to other contexts needs to be investigated in future studies. This should also include comparing results between various contexts and subgroups, for example defined by the primary specialty of the physician or the disease of the consulted patients, which was unfortunately not possible in the present study due to the limited sample size. Finally, results from the exploratory analyses need to be interpreted with due caution, as different model building procedures could have led to different results and the current results could not be cross-validated. Still, especially through the confirmatory testing and the replication of findings from previous analyses with patient data, the current study offered considerable support for the skills network approach to SDM competence. By applying a Bayesian framework, some of the previously mentioned weaknesses could be mitigated and problems such as multiple testing avoided. Future studies need to test this new approach with larger datasets to assess the relative importance of individual network parameters and skills.

Structuring clinical competences into a hierarchically organized categorical system is challenging, particularly in the interpersonal and communication domains [37]. “Choosing the right boundaries for a unit of analysis is a central problem in every science” [38], and this is particularly true for clinical skills and competences, which are strongly interrelated and frequently overlapping. Thus, it is not always clear how to narrow down the densely connected network of clinical skills into well definable and analyzable competences. Whether SDM is a sufficiently distinct concept from this perspective, i.e., whether it is operationally sufficiently closed in the environment of other skills and competences, should be empirically investigated in further studies by collecting data on a broader range of skills and competences for network analysis.

Conclusions

Our findings provide further support for conceptualizing and modeling physicians’ SDM competence as a network of SDM skills. This conceptualization suggests a new definition of professional competence, offers a methodological extension and a new field of application for network science and, most importantly, provides a new way of measuring professional competence based on self-rating of physicians. A robust measurement of SDM competence offers new opportunities for research, for evaluating learning success in education and training, and for monitoring SDM competence for quality management purposes in clinical routine care. In combination, these consistent theoretical, empirical, and practical implications have the potential to open up a new approach to professional competence in health care.

Supporting information

S1 File. Information on prior distributions.

https://doi.org/10.1371/journal.pone.0282283.s001

(PDF)

S1 Fig. Skills networks of individual physicians.

https://doi.org/10.1371/journal.pone.0282283.s002

(PDF)

S1 Table. Prediction of observer-rated shared decision making competence from the activation of all skills.

https://doi.org/10.1371/journal.pone.0282283.s003

(PDF)

S2 Table. Prediction of observer-rated shared decision making competence from the outstrength of all skills.

https://doi.org/10.1371/journal.pone.0282283.s004

(PDF)

S3 Table. Prediction of observer-rated shared decision making competence from the instrength of all skills.

https://doi.org/10.1371/journal.pone.0282283.s005

(PDF)

References

  1. Spatz ES, Krumholz HM, Moulton BW. Prime time for shared decision making. JAMA. 2017;317: 1309–1310. pmid:28384834
  2. Barry MJ, Edgman-Levitan S. Shared decision making—pinnacle of patient-centered care. N Engl J Med. 2012;366: 780–781. pmid:22375967
  3. Salzburg Global Seminar. Salzburg statement on shared decision making. BMJ. 2011;342: d1745. pmid:21427038
  4. Mulley AG, Trimble C, Elwyn G. Stop the silent misdiagnosis: patients’ preferences matter. BMJ. 2012;345: e6572. pmid:23137819
  5. Hoffmann TC, Légaré F, Simmons MB, McNamara K, McCaffery K, Trevena LJ, et al. Shared decision making: what do clinicians need to know and why should they bother? Med J Aust. 2014;201: 35–39. pmid:24999896
  6. Ratanawongsa N, Karter AJ, Parker MM, Lyles CR, Heisler M, Moffet HH, et al. Communication and medication refill adherence: the Diabetes Study of Northern California. JAMA Intern Med. 2013;173: 210–218. pmid:23277199
  7. Oshima Lee E, Emanuel EJ. Shared decision making to improve care and reduce costs. N Engl J Med. 2013;368: 6–8. pmid:23281971
  8. Nickel WK, Weinberger SE, Guze PA, Patient Partnership in Healthcare Committee of the American College of Physicians. Principles for patient and family partnership in care: an American College of Physicians position paper. Ann Intern Med. 2018;169: 796–799. pmid:30476985
  9. Carmona C, Crutwell J, Burnham M, Polak L. Shared decision-making: summary of NICE guidance. BMJ. 2021;373: n1430. pmid:34140279
  10. US Preventive Services Task Force. Collaboration and shared decision-making between patients and clinicians in preventive health care decisions and US Preventive Services Task Force recommendations. JAMA. 2022;327: 1171–1176. pmid:35315879
  11. Wollschläger D. Where is SDM at home? Putting theoretical constraints on the way shared decision making is measured. Z Evid Fortbild Qual Gesundhwes. 2012;106: 272–274. pmid:22749074
  12. Borsboom D, Mellenbergh GJ, van Heerden J. The theoretical status of latent variables. Psychol Rev. 2003;110: 203–219. pmid:12747522
  13. Kriston L, Hahlweg P, Härter M, Scholl I. A skills network approach to physicians’ competence in shared decision making. Health Expect. 2020;23: 1466–1476. pmid:32869476
  14. Scholl I, Kriston L, Dirmaier J, Härter M. Comparing the nine-item Shared Decision-Making Questionnaire to the OPTION Scale—an attempt to establish convergent validity. Health Expect. 2015;18: 137–150. pmid:23176071
  15. Scholl I, Kriston L, Dirmaier J, Buchholz A, Härter M. Development and psychometric properties of the Shared Decision Making Questionnaire–physician version (SDM-Q-Doc). Patient Educ Couns. 2012;88: 284–290. pmid:22480628
  16. Elwyn G, Hutchings H, Edwards A, Rapport F, Wensing M, Cheung W-Y, et al. The OPTION scale: measuring the extent that clinicians involve patients in decision-making tasks. Health Expect. 2005;8: 34–42. pmid:15713169
  17. Hirsch O, Keller H, Müller‐Engelmann M, Gutenbrunner MH, Krones T, Donner‐Banzhoff N. Reliability and validity of the German version of the OPTION scale. Health Expect. 2012;15: 379–388. pmid:21521432
  18. Barr PJ, O’Malley AJ, Tsulukidze M, Gionfriddo MR, Montori V, Elwyn G. The psychometric properties of Observer OPTION(5), an observer measure of shared decision making. Patient Educ Couns. 2015;98: 970–976. pmid:25956069
  19. Kölker M, Topp J, Elwyn G, Härter M, Scholl I. Psychometric properties of the German version of Observer OPTION5. BMC Health Serv Res. 2018;18. pmid:29386031
  20. Krupat E, Frankel R, Stein T, Irish J. The Four Habits Coding Scheme: validation of an instrument to assess clinicians’ communication behavior. Patient Educ Couns. 2006;62: 38–45. pmid:15964736
  21. Scholl I, Nicolai J, Pahlke S, Kriston L, Krupat E, Härter M. The German version of the Four Habits Coding Scheme—association between physicians’ communication and shared decision making skills in the medical encounter. Patient Educ Couns. 2014;94: 224–229. pmid:24286733
  22. Fruchterman TMJ, Reingold EM. Graph drawing by force-directed placement. Softw Pract Exp. 1991;21: 1129–1164.
  23. R Core Team. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing; 2014.
  24. Bürkner PC. Advanced Bayesian multilevel modeling with the R package brms. R J. 2018;10: 395–411.
  25. Epskamp S, Cramer AOJ, Waldorp LJ, Schmittmann VD, Borsboom D. qgraph: Network visualizations of relationships in psychometric data. J Stat Softw. 2012;48: 1–18.
  26. Gelman A, Rubin DB. Inference from iterative simulation using multiple sequences. Stat Sci. 1992;7: 457–472.
  27. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296: 1094–1102. pmid:16954489
  28. Burt J, Abel G, Elliott MN, Elmore N, Newbould J, Davey A, et al. The evaluation of physicians’ communication skills from multiple perspectives. Ann Fam Med. 2018;16: 330–337. pmid:29987081
  29. Pollak KI, Arnold RM, Jeffreys AS, Alexander SC, Olsen MK, Abernethy AP, et al. Oncologist communication about emotion during visits with patients with advanced cancer. J Clin Oncol. 2007;25: 5748–5752. pmid:18089870
  30. Gude T, Finset A, Anvik T, Bærheim A, Fasmer OB, Grimstad H, et al. Do medical students and young physicians assess reliably their self-efficacy regarding communication skills? A prospective study from end of medical school until end of internship. BMC Med Educ. 2017;17: 107. pmid:28666440
  31. Kriston L, Härter M, Scholl I. A latent variable framework for modeling dyadic measures in research on shared decision-making. Z Evid Fortbild Qual Gesundhwes. 2012;106: 253–263. pmid:22749072
  32. Röttele N, Schöpf-Lazzarino AC, Becker S, Körner M, Boeker M, Wirtz MA. Agreement of physician and patient ratings of communication in medical encounters: A systematic review and meta-analysis of interrater agreement. Patient Educ Couns. 2020;103: 1873–1882. pmid:32376141
  33. Elwyn G, Durand MA, Song J, Aarts J, Barr PJ, Berger Z, et al. A three-talk model for shared decision making: multistage consultation process. BMJ. 2017;359: j4891. pmid:29109079
  34. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287: 226–235. pmid:11779266
  35. Schmittmann VD, Cramer AOJ, Waldorp LJ, Epskamp S, Kievit RA, Borsboom D. Deconstructing the construct: a network perspective on psychological phenomena. New Ideas Psychol. 2013;31: 43–53.
  36. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65: S63–67. pmid:2400509
  37. Makoul G. Essential elements of communication in medical encounters: the Kalamazoo consensus statement. Acad Med. 2001;76: 390–393. pmid:11299158
  38. Hutchins E. Cognitive ecology. Topics Cogn Sci. 2010;2: 705–715. pmid:25164051