
Supporting our supervisors: sending out an SOS

Published online by Cambridge University Press:  14 October 2016

Derek L. Milne*
Affiliation:
School of Psychology, Newcastle University, Newcastle, UK
Robert Reiser
Affiliation:
Reiser Healthcare Consulting, Kentfield, CA, USA
*Author for correspondence: Dr D. L. Milne, School of Psychology, Ridley Building, Newcastle University, Newcastle NE1 7RU, UK (email: [email protected]).

Abstract

In this Introduction to the Special Issue of the Cognitive Behaviour Therapist on clinical supervision we start by highlighting the unmet and overdue need for coherent organizational systems to support, guide and develop clinical supervisors. We identify a seven-step, cyclical model that describes how such a system might work, with particular reference to CBT supervision. These steps start with conceptualization (e.g. definition of CBT supervision) and complete the problem-solving cycle with evaluation (e.g. corrective feedback). We provide an overview of typical research and development activity for each part of this model to illustrate how a sound supervision infrastructure might best be developed. The SOS model provides a systematic approach to indicate the organizational conditions under which CBT supervision might flourish.

Type
Special Issue: International Developments in Supporting and Developing CBT Supervisors
Copyright
Copyright © British Association for Behavioural and Cognitive Psychotherapies 2016 

Introduction

There are heartening signs of progress within the professional specialization of clinical supervision, progress that is evident across groups and nations (Watkins & Milne, 2014), but also many reasons for continuing concern. Supervision remains a paradoxically neglected activity, one that is regarded as essential within professional practice, yet one that has rarely been studied systematically. Even the basic models and measures of supervision remain a subject of fundamental debate, in that we await a scientifically informed consensus on many of the core aspects of best practice. To illustrate, the special supervision issue of Training and Education in Professional Psychology (2014) included significantly divergent definitions of a successful outcome in supervision. Within this neglected field, perhaps the least studied topic concerns the optimal organizational system for supporting and developing supervision. Dorsey et al. (2013) noted that a majority of dissemination programmes for evidence-based treatments in community settings had omitted ongoing support of supervisors, although Callahan et al. (2009) note ‘that supervisors may account for approximately 16% of the variance in outcome beyond that accounted for by the client's initial severity and the treating therapist's attributes’ (p. 75). In the United States, organizations that play a key role in developing and promulgating national evidence-based practice (EBP), such as the Veterans Administration, have no organized internal system to support supervised practices. Within the supervision literature, organizational systems have ‘rarely been investigated or discussed’ (Holloway, 2014, p. 612).

In the present conceptual review we redress this neglect by addressing the question: Under which organizational conditions does clinical supervision flourish? A reasonable assumption is that these conditions are similar to the ones affecting organizational development in general, with the implementation of EBP a topical instance. There is a well-recognized gulf between research and practice, severely limiting access to evidence-based care and triggering national efforts to improve implementation (McHugh & Barlow, 2010). Training is a popular option for reducing this gulf. While it represents part of an established approach to the successful implementation of new practices, it needs to be conducted as one element within a systematic approach to innovation that takes due account of organizational factors (Beidas & Kendall, 2010). These factors include staff shortages, high clinical loads, limited access to support or supervision, and poor management relationships (Moran et al. 2014). These authors reviewed 43 studies in which training and support interventions had been introduced to overcome such barriers, with a focus on the factors that contributed to success. Their thematic analysis suggested that the successful implementation of programmes to support supervision depended on several factors: a needs analysis prior to intervention, active involvement of stakeholders, training in supervision, strong leadership and organizational commitment, adequate resources (including time off or relief for practitioners), and regular programmatic evaluation.

Successful organizational development seems to be predicated in part on the flexibility with which an innovation is implemented: encouraging user judgement and discretion appears to boost uptake. For example, Stirman et al. (2013) studied the effect of modifications to standard CBT suggested by clinicians within community mental health settings (including tailoring interventions, integrating cognitive therapy into other approaches, and loosening of session structure; Stirman et al. 2013, p. 4) to proactively address barriers and so improve the uptake of EBPs. Stirman et al. noted that ‘More than half the clinicians in the study acknowledged that they drifted or departed from a cognitive therapy approach’ (p. 3), and these clinicians routinely identified client engagement as a key barrier. Hence, there is a need to modify and tailor training and supervision so as to anticipate and address such user-identified problems proactively. For example, supervision could incorporate vignettes featuring poorly engaged clients, and training could use role play to address common barriers likely to be encountered by practitioners who are modifying their usual approach to session structure. The more impressive and effective staff development innovations often provide personalized opportunities for supervisors to prepare for barriers and challenges. One example within training is having a ‘relapse-prevention’ module, including group problem-solving, realistic goal-setting, and the development of personal coping skills (Beidas & Kendall, 2010). Such ‘systems-contextual’ approaches are virtually absent from the supervision literature. In this deficient context, one needs to look more broadly at the implementation research literature to identify systematic approaches, ones that take into account organizational and contextual factors.

Although such studies are informative, the variables they identified were rarely modelled systematically or considered in relation to supervision. This means that we lack a clear and compelling sense of direction for supervisor support within mental health services. Where should we be heading, and what will it take to support our supervisors sympathetically and systematically, at the organizational level? These are the fundamental questions propelling this Special Issue of the Cognitive Behaviour Therapist. The present conceptual review therefore starts by outlining how we might think about a systematic approach to supporting our supervisors: the SOS model.

The SOS model

Definition of SOS

We define support for our supervisors (SOS) as an evidence-based, systematic, organizational process which ensures that supervisors receive the necessary leadership, support and development to fulfil their role effectively and with job satisfaction. It differs from informal, traditional, or organizationally flawed arrangements, such as ‘peer supervision’. The basic functions of our SOS model are to embed supervision within a normative, formative and restorative infrastructure. Respectively, these are achieved by evidence-based methods such as drawing on competency frameworks, undertaking ‘gold standard’ training in supervision, and providing supervision-of-supervision (or consultancy). The effectiveness of these various organizational interventions could be measured by instruments assessing such variables as the development of competence in supervisors (e.g. Milne & Reiser, 2014), their job satisfaction, or occupational burnout.

This model adopts a research and development (R&D) implementation strategy, which consists of two nested problem-solving cycles (Department of Health, 1994). In Figure 1, we have translated this original R&D framework into supervision-related terminology, and have proposed a supervision-specific extension of the model by incorporating the steps in a general problem-solving strategy (Hayes, 1989).

Fig. 1. The SOS model, a systematic way of thinking about supporting our supervisors.

The SOS model set out in Figure 1 has the supervisee at its heart, and depicts the experiential learning cycle, reflecting a formative emphasis on learning and professional development. The supervision cycle is nested in the middle of Figure 1, describing how supervision operates through a series of repeated needs assessments, goal-setting and evaluation cycles, borrowing from Kolb's (1984) experiential learning cycle. These cycles have already been described in some detail elsewhere (Milne, 2009). The novel aspect of Figure 1 is the outer ring, the ‘support our supervisors’ (SOS) cycle. This organizationally focused cycle adds a more explicit, evidence-based, CBT-consistent, and coherent dimension to existing systemic supervision models (e.g. Hawkins & Shohet, 2000; Holloway, 2014; Hoge et al. 2014). To conceive this SOS cycle, we adopted the basic tasks of R&D (‘conceptualization’, etc.), assimilating where helpful the organizational model and evidence-based approach to supervision outlined by Milne (2009). In addition to a primary formative function (i.e. developing supervisors’ expertise), Figure 1 also emphasizes the normative function of supervisor support, recognizing how organizations influence supervision through such means as R&D and effective leadership. Note in particular that, by means of the cited examples, we have added an emphasis on the organizations’ supportive function (e.g. drawing on reviews of the key research findings to best train supervisors in facilitating peer support groups). An organization that fulfils its restorative function helps supervisors ‘feel that they are supported, accepted, nurtured, acknowledged and validated’ (Milne, 2009, p. 185). That is, just as in most models of supervision, in the SOS model we envisage the need for normative, formative and restorative functions to be performed by supervisors’ organizations. Although, for reasons of space and simplicity, not all three functions are specified within Figure 1, we believe that only when all three are fulfilled can we expect supervisors to be properly developed, guided and supported. With respect to supervision, these influences were first described by Kadushin (1976) and popularized by Proctor (1988), and they remain key constructs for identifying modes of effective supervision (e.g. White & Winstanley, 2014). In Figure 1 we extend these three functions to the support of supervisors, believing them to be equally necessary conditions for a systematic support infrastructure.

We next describe and illustrate each of the seven SOS cycle tasks with some representative research and development activity or relevant theory.
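Purely as an illustrative aid (and not part of the model's formal presentation), the seven SOS tasks can be written down as an ordered, repeating sequence. The Python sketch below uses the task names from the section headings that follow; the helper function is a hypothetical convenience for the example, not an element of the model.

```python
# The seven tasks of the SOS cycle, in the order discussed in the sections below.
SOS_TASKS = [
    "conceptualization",   # How should we construe a supervision infrastructure?
    "operationalization",  # What are the key elements to measure?
    "experimentation",     # What kinds of research will best guide us?
    "interpretation",      # What conclusions can we draw?
    "dissemination",       # How should we train and develop supervisors?
    "utilization",         # How can we support the transfer of training?
    "evaluation",          # Which data provide the optimal corrective feedback?
]


def next_task(current: str) -> str:
    """Return the SOS task that follows `current`, wrapping from evaluation back to
    conceptualization to reflect the cyclical, problem-solving character of the model."""
    position = SOS_TASKS.index(current)
    return SOS_TASKS[(position + 1) % len(SOS_TASKS)]


if __name__ == "__main__":
    assert next_task("evaluation") == "conceptualization"
    print(" -> ".join(SOS_TASKS))
```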

SOS overview

Conceptualization: How should we construe a supervision infrastructure?

Definition of CBT supervision

A preliminary task in conceptualization is the clear definition of clinical supervision (Milne, 2007). Our definition of CBT supervision adds some specifics to the general definition provided in Milne (2007). According to authoritative procedural accounts (Padesky, 1996; Liese & Beck, 1997; Beck et al. 2008), CBT supervision is highly structured and directive (e.g. it entails detailed planning and problem-oriented agenda-setting), while prizing a collaborative relationship. The central objective of developing an ongoing cognitive case conceptualization, in order to identify appropriate evidence-based interventions, is achieved primarily through guided discovery, collaborative empiricism (e.g. behavioural experiments to test hypotheses about the supervisee's assumptions and to develop the most effective interventions), and traditional case-based discussion. These authors also encourage supervisors to supplement discussion with modelling and enactive methods (including educational role play, behavioural rehearsal, and corrective feedback). CBT supervision can also be distinguished by its emphasis on evidence-based principles and methods, a distinctive pedagogical style emphasizing experiential learning, the use of direct observation, and the use of reliable instruments to assess supervisee competence. A core feature of this approach is assessing client progress through ongoing clinical monitoring with empirically supported instruments (Reiser, 2014).

In terms of conceptualizing the infrastructure that embeds supervision, according to systemic accounts the relevant dimensions include an organization's mission, values, staffing practices, professional standards, decision-making processes, management system, work culture, and general structure (Holloway, 2014). The previous section offered the SOS model as our conceptualization of such dimensions. We discuss them sequentially below, and particular aspects are also detailed in the papers that follow in this Special Issue.

Operationalization: What are the key elements to measure within the supervision infrastructure?

Our selected conceptualization should indicate which variables to measure, such as factors within the organizational climate. For example, one study of supervision contrasted two different milieus for the care of dementia sufferers, a collective living (CL) unit and a traditional nursing home (Kihlgren & Hansebo, 2014). The CL unit staff received 1 month of training on dementia care before the unit opened, followed by supervision, support and monthly feedback during the 22-month study period. These interventions were designed to improve their care practices. Specifically, the style of caring in the CL unit engaged the residents in decisions about their unit, encouraged autonomy, and promoted dignity. These organizational changes were associated with a significantly smaller deterioration in intellectual functioning for the CL unit patients. Such CL unit changes therefore appear to be key elements, and should be measured if supervision is to be as effective as possible. Several general instruments are available for measuring such organizational factors, including ‘readiness for change’ tools (e.g. Helfrich et al. 2009; Shea et al. 2014). More specific to supervision are instruments such as the Clinical Learning Environment, Supervision and Nurse Teacher Evaluation Scale (CLES+T; Saarikoski, 2014) and the Manchester Clinical Supervision Scale (MCSS; White & Winstanley, 2014). These tools measure the ‘cultural environment’, the ‘leadership style’ of the unit manager, and the ‘resource’ aspect (i.e. the importance and availability of supervision in the workplace). At the micro-environmental level of the supervision group there are also instruments that tap the organizational system (e.g. Ogren & Sundin, 2009), but overall there are few supervision-related measurement tools, and even fewer that are psychometrically sound (Wheeler & Barkham, 2014).

But there are welcome exceptions. Godley et al. (2011) present a comprehensive implementation model that includes systematic measurement of training and supervisory effectiveness using standardized measures. To their credit, they implemented a fully manualized CBT treatment that was supported by initial therapist training workshops, follow-up supervisor training (including review of taped supervision sessions), certification of competent supervisors, bi-weekly telephone-based coaching and support for therapists, and ongoing fidelity monitoring of both therapists and supervisors.
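To make this measurement step concrete, the sketch below records illustrative ratings on the three organizational dimensions named above (cultural environment, leadership style and resource availability). The record type, the 1-5 scale and the equal-weight summary are our own assumptions for the example, not features of the CLES+T or MCSS.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class UnitClimateRating:
    """Hypothetical record of organizational-climate ratings for one unit.

    The 1-5 scale and the equal-weight summary are illustrative assumptions,
    not properties of any published instrument.
    """
    unit: str
    cultural_environment: float   # e.g. openness and supportiveness of the work culture
    leadership_style: float       # e.g. supportiveness of the unit manager
    resource_availability: float  # e.g. protected time for supervision

    def summary(self) -> float:
        """Simple unweighted mean across the three dimensions."""
        return mean([self.cultural_environment,
                     self.leadership_style,
                     self.resource_availability])


# Invented scores, loosely echoing the CL unit vs. traditional nursing home contrast above.
cl_unit = UnitClimateRating("collective living unit", 4.2, 4.5, 3.8)
nursing_home = UnitClimateRating("traditional nursing home", 3.1, 2.9, 2.5)
print(cl_unit.summary() > nursing_home.summary())  # True, in this invented example
```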

Experimentation: What kinds of research will best guide us?

In previous reviews of research on supervision (Milne, 2009), we have drawn upon the fidelity framework (Borrelli et al. 2005) to provide an organizing principle to capture the complexity entailed in evaluating supervision systemically. This framework captures the successive stages of the design, training, delivery, receipt and enactment of supervision (see Reiser & Milne, 2014, for a comprehensive description). It is designed to address the following types of question about supervision: ‘What is the right thing to do?’ (Design); ‘Has the right thing been done?’ (Training); ‘Has it been done right?’ (Delivery); ‘Did it result in the right outcome?’ (Receipt); and ‘Did it result in the right impact?’ (Enactment). The fidelity framework appears to be valuable in pinpointing problems in providing support for supervisors, and in this respect resembles the ‘organizational behaviour management’ approach to diagnosing organizational functioning. But such wide-ranging evaluations are rare: in our review of studies evaluating the outcomes of supervision, we determined that these fidelity framework variables had not been measured adequately in about 50% of the studies (Reiser & Milne, 2014). Therefore, a first step in improving implementation research on supervision will be to address fully the successive stages within the fidelity framework, as they apply to developing and implementing an organizational approach to supporting supervisors.
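The stage-and-question structure of the fidelity framework lends itself to a simple checklist. The sketch below maps each stage to its guiding question and flags the stages a given study left unmeasured; the audit helper is our own illustration rather than part of the framework itself.

```python
# The five successive stages of the fidelity framework and the question each addresses
# (after Borrelli et al. 2005, as applied to supervision by Reiser & Milne, 2014).
FIDELITY_STAGES = {
    "design": "What is the right thing to do?",
    "training": "Has the right thing been done?",
    "delivery": "Has it been done right?",
    "receipt": "Did it result in the right outcome?",
    "enactment": "Did it result in the right impact?",
}


def unmeasured_stages(measured):
    """Return the fidelity stages that a study did not measure, in framework order."""
    return [stage for stage in FIDELITY_STAGES if stage not in set(measured)]


# Example: a hypothetical study that only assessed design and delivery.
print(unmeasured_stages({"design", "delivery"}))  # ['training', 'receipt', 'enactment']
```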

In order to address fidelity and other research questions, the use of large-group randomized controlled trials (RCTs) is likely to be cumbersome, resource-intensive and premature. In addition, RCTs have become associated with failed top-down implementation efforts (e.g. clinical guidelines; Greene, 2014). Instead, we recommend ‘upstream’, qualitative, quasi-experimental studies of pragmatic issues, combined with rigorous small-sample research designs. These might include studies of the effectiveness of activities such as supervisor training, surveys to establish priorities and progress, and n = 1 designs to study supervision processes intensively (e.g. Milne et al. 2013). These types of design are also consistent with the underlying premises of the CBT model, involving the use of collaborative empiricism to develop specific, well-tailored interventions.
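As one hedged illustration of the kind of analysis such small-sample designs permit, the snippet below computes a simple level change and the percentage of non-overlapping data for a hypothetical AB (baseline then intervention) single-case series; the ratings are invented for illustration and are not taken from Milne et al. (2013).

```python
from statistics import mean


def level_change(baseline, intervention):
    """Difference between the mean intervention score and the mean baseline score."""
    return mean(intervention) - mean(baseline)


def percent_non_overlapping(baseline, intervention):
    """Percentage of intervention points exceeding the highest baseline point,
    a common (if crude) effect indicator in single-case (n = 1) designs."""
    ceiling = max(baseline)
    above = sum(1 for score in intervention if score > ceiling)
    return 100.0 * above / len(intervention)


# Invented weekly supervisor-competence ratings, before and after consultancy.
baseline = [2.1, 2.4, 2.2, 2.5]
after_consultancy = [2.9, 3.2, 3.1, 3.4, 3.3]

print(round(level_change(baseline, after_consultancy), 2))   # 0.88
print(percent_non_overlapping(baseline, after_consultancy))  # 100.0
```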

Interpretation: What conclusions can we draw, to guide the supervision infrastructure?

Such ‘small n’ designs offer high internal validity and seem well suited to our present knowledge base, being sufficient for drawing some important inferences about the effects of various kinds of institutional support on supervision. For example, Milne et al. (2013) compared standard CBT supervision with evidence-based clinical supervision, and were able to draw helpful conclusions about the effectiveness of consultation on the supervisor's performance. Other organizational factors may supplement or replace arrangements like consultancy. In a best-evidence synthesis of 24 supervision studies, Milne et al. (2011) noted that the contextual variable ‘administrative support’ was consistently identified as an important implementation variable. Beidas & Kendall (2010) also identified organizational support as a key factor in maintaining the ongoing transfer of learning for therapists, and others have noted how an organization's policy and culture greatly shape the implementation of EBP (e.g. Proctor et al. 2009).

It is clear that there is still a dearth of research focused on the impacts of organizational and contextual factors affecting the development of an enriched supervision culture. Based on the current undeveloped state of research in this domain, it is difficult to draw substantive, general conclusions. One can only assert that there is an ongoing need for consideration of systems-contextual factors in future supervision research. Local research and service evaluation efforts, grounded in the best-available research and theory, may therefore be the best short-term guide to developing such a culture. However, the point of this step in the SOS cycle is to ensure that any appropriate implications are duly drawn out, for the benefit of research and practice.

Dissemination: How should we train and develop supervisors?

Supervisor training is a vital mechanism within a systematic, evidence-based strengthening of supervision, and it is the aspect of support that has been most studied. Yet surprisingly little is known about how we might best train supervisors. Indeed, it appears that the majority of supervisors do not even receive training in supervision, leading to the oft-quoted lament that ‘something does not compute’ (Watkins, 1997, p. 604). In a recent ‘informal’ survey across 12 countries, Watkins & Wang (2014, p. 193) determined that to date only Australia had a mandatory system for approving supervisors; in the other 11 countries, experience as a clinician remained the only requirement for becoming a supervisor. Despite the advent of competence frameworks (Falender et al. 2004; Roth & Pilling, 2007; Olds & Hawkins, 2014), the content of training in these countries varied considerably (Fleming, 2012). By contrast, within an SOS infrastructure, relevant (needs-led) competencies would be combined with supervisor training methods that are based on expert consensus (Kaslow et al. 2004; Milne et al. 2009) and research findings (e.g. Godley et al. 2011). It is surely time to consider replacing supervisor training as something of a local art form with such evidence-based approaches.

Although there are now a few promising studies concerning the training of supervisors (Watkins & Wang, 2014), little is known about the kind of complementary support arrangements that should be in place to nurture supervisors and to foster the transfer of their training. In a systematic review of 24 successful supervision studies, Milne et al. (2008) noted that five types of contextual variables were thought to have moderated the effectiveness of supervision. These were the ‘general organizational context’ (e.g. administrative support and staff turnover); ‘intervention factors’ (e.g. the acceptability of supervision and the use of incentives); ‘research factors’ (e.g. reactivity to observation and unreliable observation tools); ‘learning factors’ (e.g. training based on educational needs); and ‘participant factors’ (e.g. clinical outcomes, supervisee anxiety). As this list indicates, some of these variables were judged by the study authors to be boosters to supervision, while others were deemed barriers. There are logical and ethical requirements to make such ‘boosting’ support arrangements, as reflected in the UK Department of Health policy of supporting supervisors with ‘multiple knowledge and learning sources, technical and other resources’ (e.g. Department of Health, 2001, p. ix). But according to surveys such infrastructure tends to be weak, undermining the provision of supervision (Harmse, 2001). In turn, this presents a barrier to accessing CBT (Stallard et al. 2007). It also represents a failure to systematize support for supervisors, an unacceptable strategic and moral oversight, leading to burnout and, in one study, to dissatisfaction with the arrangements among 82% of surveyed supervisors (Gabbay et al. 1999).

Utilization: How can we support and guide the transfer of supervision training?

Thankfully, there are also some highly informative accounts of implementing training (and related developments), which serve as useful guides to the trials and tribulations of innovation. For example, Lynch & Happell (2008) described an audit of the implementation of supervision in rural Australia. They highlighted the need for a strategic plan (incorporating the vision of the organization and the place of the clinical supervision innovation within it), guided by the innovation literature (e.g. establishing a system for training, supported by implementation ‘champions’).

Two aspects of the Lynch & Happell (2008) training innovation intended to increase utilization were consultancy and supervision-of-supervision. In addition to the traditional focus on addressing organizational barriers and boosters, consultants are now being viewed as key agents within more active and prospective implementation strategies (Nadeem et al. 2013). Drawing on their experiences as consultants and informed by a review of the literature, Nadeem et al. (2013) defined some core functions of implementation consultancy (e.g. accessing additional resources, encouraging engagement, facilitating the supervisory infrastructure, and fine-tuning). Supervision-of-supervision overlaps with consultancy in both form and function, though there is little by way of research. From a review, Milne (2009) concluded that ‘the supervision of supervisors is the most deficient area in the whole enterprise of clinical supervision’ (p. 186). Yet according to a survey by Wheeler & King (2000), the majority of supervisors reported receiving supervision of their supervision. In a rare study, Milne et al. (2013) utilized a supervision manual, incorporating supervision guidelines, to develop a supervisor's fidelity to CBT supervision. An account of the supervision-of-supervision procedure was provided in Milne & Reiser (2014).

Evaluation: Which data will provide the optimal corrective feedback?

Any healthy system requires information on progress in order to adapt, in a form which empowers corrective action. In terms of supervision there have been particular challenges in developing specific feedback instruments (Wheeler & Barkham, 2014), such as the diverse definitions of supervision (Milne, 2007) and the divergence over defining supervisory success. Even when a measurement system is in place, questions may remain concerning the most valuable feedback data. Regarding supervisor training, researchers have tended to apply educational evaluation criteria, such as acceptability, learning, transfer and clinical benefit (Kirkpatrick, 1967). Within clinical service evaluations, practical solutions to measurement problems have included the clinical outcome monitoring approach within IAPT (Richards, 2014). A broader approach to the evaluation of a system such as SOS is that proposed by Proctor et al. (2011), who helpfully delineated the following taxonomy of implementation outcomes:

  • acceptability (e.g. satisfaction with the simplicity and credibility of an innovation),

  • adoption (uptake and utilization),

  • appropriateness (perceived fit or compatibility),

  • feasibility (actual fit or suitability),

  • fidelity (adherence and service quality),

  • costs (cost-effectiveness),

  • innovation penetration (e.g. improved service and clinical outcomes),

  • innovation sustainability (durable, incorporated, routinized).

We believe that to be optimal, feedback should encompass such outcome dimensions, possibly best studied in a stepwise manner, as per the fidelity framework mentioned earlier.
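Purely as an illustration of how such feedback might be collated within an SOS cycle, the sketch below represents Proctor et al.’s (2011) eight implementation outcomes as a simple record and flags the weakest dimensions as candidates for corrective action; the 0-10 rating scale and the threshold are our own assumptions.

```python
from dataclasses import dataclass, fields


@dataclass
class ImplementationOutcomes:
    """Ratings (on an assumed 0-10 scale) for Proctor et al.'s (2011) outcome taxonomy."""
    acceptability: float
    adoption: float
    appropriateness: float
    feasibility: float
    fidelity: float
    costs: float
    penetration: float
    sustainability: float

    def weakest(self, threshold: float = 5.0):
        """Return the outcome dimensions scoring below the (assumed) threshold,
        i.e. candidates for corrective action in the next pass around the SOS cycle."""
        return [f.name for f in fields(self) if getattr(self, f.name) < threshold]


# Invented example: an SOS programme that is well accepted but poorly sustained.
review = ImplementationOutcomes(8, 7, 7, 6, 6, 5, 4, 3)
print(review.weakest())  # ['penetration', 'sustainability']
```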

Conclusions

We believe that the SOS model represents a novel, CBT-friendly, systematic and feasible approach to supervisor support and development, with firm roots in EBP. It has the capacity to indicate the organizational conditions under which clinical supervision might flourish and the potential to integrate these conditions within an infrastructure that might be expected to vigorously support and robustly develop supervision. The SOS model also indicates how progress might be monitored.

But implementing the SOS model requires extensive organizational development (‘innovation’), a highly challenging process that is perhaps under-emphasized above. Furthermore, in deriving from a centrally disseminated model of research and development (Department of Health, 2001), our model is ironically at risk of rejection, because (among other things) it has not been presented as sufficiently adaptable to local needs. We recognize that in certain contexts some of the SOS model's tasks should carry much greater weight (e.g. attention to restorative interventions, at the expense of training); and that the different tasks may at times be best addressed in a different order. Assuming such intelligent application, we believe that the SOS model's tasks are all necessary for optimal results.

Declaration of Interest

None.

Learning objectives

By the end of this paper the reader should be able to:

  1. List six variables associated with supporting the provision of clinical supervision within large healthcare organizations.

  2. Summarize the SOS model.

  3. Provide an illustration for one of the tasks within the SOS model.

References

Recommended follow-up reading

Rotheram-Borus, MJ, Swendeman, D, Chorpita, BF (2012). Disruptive innovations for designing and diffusing evidence-based interventions. American Psychologist 67, 463–476.

References

Beck, J, Sarnat, JE, Barenstein, V (2008). Psychotherapy-based approaches to supervision. In: Casebook for Clinical Supervision: A Competency-Based Approach (ed. Falender, C. & Shafranske, E.), pp. 57–96. Washington, DC: American Psychological Association.
Beidas, RS, Kendall, PC (2010). Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clinical Psychology: Science & Practice 17, 1–30.
Borrelli, B, Sepinwall, D, Ernst, D, Bellg, AJ, Czajkowski, S, Greger, R, DeFrancesco, C, Levesque, C, Sharp, DL, Ogedegbe, G, Resnick, B, Orwig, D (2005). A new tool to assess treatment fidelity and evaluation of treatment fidelity across 10 years of health behaviour research. Journal of Consulting and Clinical Psychology 73, 852–860.
Callahan, JL, Almstrom, CM, Swift, JK, Heath, CJ (2009). Exploring the contribution of supervisors to intervention outcomes. Training & Education in Professional Psychology 3, 72–77.
Department of Health (1994). Research and development in the new NHS. London: Department of Health.
Department of Health (2001). Working together – learning together: a framework for life-long learning for the NHS. London: Department of Health.
Dorsey, S, Pullmann, MD, Deblinger, E, Berliner, L, Kerns, SE, Thompson, K, Unutzer, J, Weiss, JR, Garland, AF (2013). Improving practice in community-based settings: a randomized trial of supervision – study protocol. Implementation Science 8: 89, 1–11.
Falender, C, Cornish, JAE, Goodyear, R, Hatcher, R, Kaslow, NJ, Leventhal, G, Shafranske, E, Sigmon, ST, Stoltenberg, C, Grus, C (2004). Defining competencies in psychology supervision: a consensus statement. Journal of Clinical Psychology 60, 771–785.
Fleming, I (2012). Developments in supervisor training. In: Supervision and Clinical Psychology, 2nd edn (ed. Fleming, I. & Steen, L.), pp. 77–95. London: Routledge.
Gabbay, MB, Kiemle, G, Maguire, C (1999). Clinical supervision for clinical psychologists: existing provision and unmet needs. Clinical Psychology & Psychotherapy 6, 404–412.
Godley, SH, Garner, BR, Smith, JE, Meyers, RJ, Godley, MD (2011). A large-scale dissemination and implementation model. Clinical Psychology: Science and Practice 18, 67–83.
Greene, LR (2014). Dissemination or dialogue? American Psychologist 69, 708–709.
Harmse, AD (2001). Support systems for supervisors in the social work profession. Dissertation Abstracts International, A: The Humanities and Social Sciences 61, 3351.
Hawkins, P, Shohet, R (2000). Supervision in the Helping Professions. Milton Keynes: Open University Press.
Hayes, JR (1989). The Complete Problem-Solver. Hillsdale, NJ: Erlbaum.
Helfrich, CD, Li, Y, Sharp, ND, Sales, AE (2009). Organizational readiness to change assessment (ORCA): development of an instrument based on the Promoting Action on Research Implementation in Health Services (PARIHS) framework. Implementation Science 4, 38.
Hoge, MA, Migdole, S, Cannata, E, Powell, DJ (2014). Strengthening supervision in systems of care: exemplary practices in empirically-supported treatments. Clinical Social Work Journal 42, 171–181.
Holloway, EL (2014). Supervisory roles within systems of practice. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 598–621. Chichester: Wiley-Blackwell.
Kadushin, A (1976). Supervision in Social Work. New York: Columbia University Press.
Kaslow, NJ, Borden, KA, Grus, C, Collins, FL, Campbell Forrest, L, Illfelder-Kaye, J, Nelson, PD, et al. (2004). Competencies conference: future directions in education and credentialing in professional psychology. Journal of Clinical Psychology 60, 699–712.
Kihlgren, M, Hansebo, G (2014). Organizational change and supervision. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 155–176. Chichester: Wiley.
Kirkpatrick, DL (1967). Evaluation of training. In: Training and Development Handbook (ed. Craig, R. L. & Bittel, L. R.), pp. 87–112. New York: McGraw-Hill.
Kolb, DA (1984). Experiential Learning. Englewood Cliffs, NJ: Prentice-Hall.
Liese, BS, Beck, JS (1997). Cognitive therapy supervision. In: Handbook of Psychotherapy Supervision (ed. Watkins, C. E.), pp. 114–133. New York: Wiley.
Lynch, L, Happell, B (2008). Implementation of clinical supervision in action: Part 2: implementation and beyond. International Journal of Mental Health Nursing 17, 65–72.
McHugh, RK, Barlow, DH (2010). The dissemination and implementation of evidence-based psychological treatments: a review of current efforts. American Psychologist 65, 73–84.
Milne, DL (2007). An empirical definition of clinical supervision. British Journal of Clinical Psychology 46, 437–447.
Milne, DL (2009). Evidence-based Clinical Supervision. Chichester: BPS Blackwell.
Milne, DL, Aylott, H, Fitzpatrick, H, Ellis, MV (2008). How does clinical supervision work? Using a best evidence synthesis approach to construct a basic model of supervision. The Clinical Supervisor 27, 170–190.
Milne, DL, Reiser, RP (2014). SAGE. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 402–415. Chichester: Wiley-Blackwell.
Milne, DL, Reiser, RP, Cliffe, T (2013). An n = 1 evaluation of enhanced CBT supervision. Behavioural & Cognitive Psychotherapy 41, 210–220.
Milne, DL, Scaife, J, Cliffe, T (2009). How should we train effective supervisors? A British consensus on facilitating experiential learning. Clinical Psychology Forum 203, 7–12.
Milne, DL, Sheikh, AI, Pattison, S, Wilkinson, A (2011). Evidence-based training for clinical supervisors: a systematic review of 11 controlled studies. The Clinical Supervisor 30, 53–71.
Moran, AM, Coyle, J, Boxall, D, Nancrow, SA, Young, J (2014). Supervision, support and mentoring interventions for health practitioners in rural and remote contexts: an integrative review and thematic synthesis of the literature to identify mechanisms for successful outcomes. Human Resources for Health 12, 10.
Nadeem, E, Gleacher, A, Beidas, RS (2013). Consultation as an implementation strategy for evidence-based practices across multiple contexts: unpacking the black box. Administration & Policy in Mental Health & Mental Health Services Research 40, 439–450.
Ogren, M-L, Sundin, E (2009). Group supervision in psychotherapy: main results from a Swedish project on psychotherapy supervision in a group format. British Journal of Guidance & Counselling 37, 129–139.
Olds, K, Hawkins, R (2014). Precursors to measuring outcomes in clinical supervision: a thematic analysis. Training & Education in Professional Psychology 8, 158–164.
Padesky, CA (1996). Developing cognitive therapist competency: teaching and supervision models. In: Frontiers of Cognitive Therapy (ed. Salkovskis, P. M.), pp. 266–292. London: Guilford Press.
Proctor, B (1988). A cooperative exercise in accountability. In: Enabling and Ensuring (ed. Marken, M. & Payne, M.), pp. 21–34. Leicester: National Youth Bureau and Council for Education and Training in Youth and Community Work.
Proctor, EK, Landsverk, J, Aarons, G, Chambers, D, Glisson, C, Mittman, B (2009). Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research 36, 24–34.
Proctor, E, Silmere, H, Raghavan, R, Hovmand, P, Aarons, G, Bunger, A, Griffey, R, Hensley, M (2011). Outcomes of implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration & Policy in Mental Health & Mental Health Services Research 38, 65–76.
Reiser, RP (2014). Supervising cognitive and behavioural therapies. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 493–517. Chichester: Wiley-Blackwell.
Reiser, RP, Milne, DL (2014). A systematic review and reformulation of outcome evaluation in clinical supervision: applying the fidelity framework. Training & Education in Professional Psychology 8, 149–157.
Richards, DA (2014). Clinical case management supervision: using clinical outcome monitoring and therapy progress feedback to drive supervision. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 518–529. Chichester: Wiley-Blackwell.
Roth, AD, Pilling, S (2007). A competence framework for the supervision of psychological therapies (http://www.ucl.ac.uk/CORE/).
Saarikoski, M (2014). The supervision scale. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 416–430. Chichester: Wiley-Blackwell.
Shea, CM, Jacobs, SR, Esserman, DA, Bruce, K, Weiner, BJ (2014). Organizational readiness for implementing change: a psychometric assessment of a new measure. Implementation Science. Published online: 10 January 2014. doi:10.1186/1748-5908-9-7.
Stallard, P, Udwin, O, Goddard, M, Hibbert, S (2007). The availability of cognitive behaviour therapy within specialist child and adolescent mental health services (CAMHS): a national survey. Behavioural and Cognitive Psychotherapy 35, 501–505.
Stirman, SW, Calloway, A, Toder, K, Miller, CJ, DeVito, AK, Meisel, SN, Xhezo, R, Evans, AC, Beck, AT, Crits-Christoph, P (2013). Community mental health provider modifications to cognitive therapy: implications for sustainability. Psychiatric Services 64, 1056–1059.
Watkins, CE (1997). Some concluding thoughts about psychotherapy supervision. In: The Handbook of Psychotherapy Supervision (ed. Watkins, C. E.), pp. 603–616. New York: Wiley.
Watkins, CE, Milne, DL (2014). Clinical supervision at the international crossroads: current status and future directions. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 673–696. Chichester: Wiley-Blackwell.
Watkins, CE, Wang, CDC (2014). On the education of clinical supervisors. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 177–203. Chichester: Wiley-Blackwell.
Wheeler, S, Barkham, M (2014). A core evaluation battery for supervision. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 367–385. Chichester: Wiley.
Wheeler, S, King, D (2000). Do counselling supervisors want or need to have their supervision supervised? An exploratory study. British Journal of Guidance and Counselling 28, 279–290.
White, E, Winstanley, J (2014). The Manchester Clinical Supervision Scale, MCSS-26. In: The Wiley International Handbook of Clinical Supervision (ed. Watkins, C. E. & Milne, D. L.), pp. 386–401. Chichester: Wiley.
