Policy to implementation: evidence-based practice in community mental health – study protocol

STUDY PROTOCOL | Open Access

Rinad S Beidas1*, Gregory Aarons2, Frances Barg3, Arthur Evans1,4, Trevor Hadley1, Kimberly Hoagwood5, Steven Marcus6, Sonja Schoenwald7, Lucia Walsh1 and David S Mandell1

Abstract

Background: Evidence-based treatments (EBTs) are not widely available in community mental health settings. In response to the call for implementation of evidence-based treatments in the United States, states and counties have mandated behavioral health reform through policies and other initiatives. Evaluations of the impact of these policies on implementation are rare. A systems transformation about to occur in Philadelphia, Pennsylvania, offers an important opportunity to prospectively study implementation in response to a policy mandate.

Methods/design: Using a prospective sequential mixed-methods design, with observations at multiple points in time, we will investigate the responses of staff from 30 community mental health clinics to a policy from the Department of Behavioral Health encouraging and incentivizing providers to implement evidence-based treatments to treat youth with mental health problems. Study participants will be 30 executive directors, 30 clinical directors, and 240 therapists. Data will be collected prior to the policy implementation, and then at two and four years following policy implementation. Quantitative data will include measures of intervention implementation and potential moderators of implementation (i.e., organizational- and leader-level variables) and will be collected from executive directors, clinical directors, and therapists.
Measures include self-reported therapist fidelity to evidence-based treatment techniques as measured by the Therapist Procedures Checklist-Revised, organizational variables as measured by the Organizational Social Context Measurement System and the Implementation Climate Assessment, leader variables as measured by the Multifactor Leadership Questionnaire, attitudes towards EBTs as measured by the Evidence-Based Practice Attitude Scale, and knowledge of EBTs as measured by the Knowledge of Evidence-Based Services Questionnaire. Qualitative data will include semi-structured interviews with a subset of the sample to assess the implementation experience of high-, average-, and low-performing agencies. Mixed methods will be integrated through comparing and contrasting results from the two methods for each of the primary hypotheses in this study.

Discussion: Findings from the proposed research will inform both future policy mandates around implementation and the support required for the success of these policies, with the ultimate goal of improving the quality of treatment provided to youth in the public sector.

Keywords: Evidence-based practice, Community mental health, Policy, Implementation, Fidelity, Organizational variables

* Correspondence: rbeidas@upenn.edu
1 Department of Psychiatry, University of Pennsylvania Perelman School of Medicine, 3535 Market Street, 3rd floor, Philadelphia, PA 19104, USA. Full list of author information is available at the end of the article.

© 2013 Beidas et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Beidas et al. Implementation Science 2013, 8:38. http://www.implementationscience.com/content/8/1/38

Background

Evidence-based treatments (EBTs) are treatments that have been evaluated scientifically and show evidence of efficacy [1]. Despite well-established evidence of EBTs for youth with psychosocial difficulties [1], it takes up to 17 years for these treatments to make their way into community settings [2]. In response to the call for implementation of EBTs [3], systems have mandated behavioral health reform [4] through policies and other initiatives. Evaluations of the impact of these policies on implementation are rare [5]. While policies may be important drivers of implementation, they are likely necessary but not sufficient. In particular, organization- and leader-level variables may moderate the relationship between policy and implementation.

A burgeoning literature has applied evidence from organizational theory to mental health service organizations [6] and found that specific organizational-level constructs influence adoption and sustainability of new practices. Constructs of particular interest include organizational culture, organizational climate, and implementation climate. Organizational culture is defined as shared beliefs and expectations of a work environment, whereas organizational climate is defined as shared perceptions about the work environment's impact on worker well-being [7]. Organizational climate has been associated with both implementation and youth outcomes [8]. Even more compelling, interventions that improve organizational climate can improve implementation of EBTs in the community [9]. Distinct from organizational climate, implementation climate is defined as staff beliefs regarding the degree to which an innovation is expected, rewarded and supported by their organization [10].
Little empirical measurement of implementation climate has been conducted in mental health services research [11], but research from other disciplines suggests that it is highly predictive of implementation [12].

Leadership may also drive implementation of EBTs, although few studies have examined its effects. One model of effective leadership [13] comprises five factors: individual consideration (consideration for each employee's contributions and needs), intellectual stimulation (potential to stimulate employee thinking), inspirational motivation (potential to inspire and motivate employees), idealized influence attributed (ability to instill pride in employees), and idealized influence behavior (ability to instill values, beliefs, purpose, and mission in employees) [13]. Preliminary research on the associations among leadership and organizational variables has found that high-quality leadership is important in times of system change and may reduce poor organizational climate and subsequent staff turnover [4]. High-quality leadership is also associated with better staff attitudes towards adopting EBTs [14]. It is therefore critical to investigate whether high-quality leadership and characteristics of leaders (e.g., attitudes) predict more successful implementation of child EBTs in the face of a policy mandate.

Systems transformation in Philadelphia

The City of Philadelphia's Department of Behavioral Health and Intellectual DisAbility Services (DBHIDS) is committed to transforming their public system into one that is evidence-based for both adults and children. The behavioral health care of Medicaid-enrolled individuals within Philadelphia is managed through Community Behavioral Health (CBH), a quasi-governmental administrative service organization. Since 2007, DBHIDS has engaged in pilot EBT implementation projects in the public mental health system.
In 2012, the Commissioner of DBHIDS (AE) assembled the Evidence-Based Practice & Innovation Center (EPIC), a task force of expert academics and leaders at DBHIDS, to develop a coordinated approach and centralized infrastructure that supports providers in implementing, utilizing, and sustaining EBTs. The contributions of EPIC will be phased. The first phase entails compiling lessons learned from pilot EBT implementation projects, engaging community stakeholders, and selecting an implementation framework to guide the building of the infrastructure. Once established, EPIC will provide support in a number of areas, including: system-wide promotion of evidence-based principles, building of provider capacity for EBTs, operational support, developing an infrastructure for training and ongoing support, and potentially implementation of financing models to promote sustainability (e.g., enhanced rates for implementation of EBTs). Currently, EPIC is in the first phase; completion of the process and infrastructure is anticipated in the next fiscal year. Based on the activities of EPIC, a recommendation will be made by the regulating body, DBHIDS, on implementation of EBTs; we operationally define this recommendation as a policy mandate.

The systems transformation about to occur in Philadelphia offers a rare and important opportunity to prospectively study implementation in response to a policy mandate from inception to implementation. The objective of the proposed research is to observe how community mental health providers (CMHPs) respond to a system-level policy designed to increase implementation of EBTs for youth and adults with mental health difficulties, and to investigate whether organizational and leadership characteristics moderate the association between policy and implementation.
Specifically, the overall objectives of the study are to answer the following questions within CMHPs: Does a policy mandate impact implementation of EBTs in community mental health? Do organizational- and leader-level variables moderate the relationship between policy and implementation of EBTs? What factors characterize the differences among providers with low, average, and high implementation?

Conceptual framework and causal model

The proposed research activities are based on the conceptual model of EBT implementation in public service sectors proposed by Aarons and colleagues [15]. This four-phase multi-level ecological model of the implementation process for EBTs in public sector settings is both a process and explanatory framework. The process steps include exploration, preparation, implementation, and sustainment (the EPIS model). Within each phase, particular contextual variables are relevant to the outer (external to the provider at the service system level) or inner (internal to the provider implementing an EBT) context. We will prospectively measure a subset of variables from the EPIS model to examine their association with implementation effectiveness in CMHPs serving youth (Figure 1). The current study will assess: the impact of an outer context change (i.e., policy) on implementation; and how inner context variables, organizational and leader characteristics, moderate the relationship between policy and implementation.

All three aims draw on the following causal model. Policy, an outer context variable, is defined as a recommendation and support made by a regulating body to promote implementation of EBTs. In this causal model, policy is directly related to the dependent variable, implementation of EBTs [16]. Inner context variables, specifically organizational- and leader-level variables, are hypothesized to moderate this association.
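The moderation hypothesis in this causal model corresponds to a regression with an interaction term. The sketch below is purely illustrative and is not part of the protocol's analysis plan: the variable names, effect sizes, and simulated data are invented, and it ignores the nesting of clinicians within agencies (which a mixed-effects model would handle). It shows only how a non-zero policy-by-climate interaction coefficient operationalizes "inner context variables moderate the policy effect."

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data for illustration only (all values invented).
n = 1000
policy = rng.integers(0, 2, size=n).astype(float)   # 0 = pre-mandate, 1 = post-mandate
climate = rng.normal(0.0, 1.0, size=n)              # standardized implementation climate

# Generative model: policy raises fidelity, and a more favorable climate
# amplifies the policy effect -- the moderation of interest (b3 != 0).
b0, b1, b2, b3 = 0.5, 1.0, 0.3, 0.8
fidelity = (b0 + b1 * policy + b2 * climate
            + b3 * policy * climate
            + rng.normal(0.0, 0.5, size=n))

# Fit the moderated regression by ordinary least squares.
X = np.column_stack([np.ones(n), policy, climate, policy * climate])
beta, *_ = np.linalg.lstsq(X, fidelity, rcond=None)

print(dict(zip(["b0", "b1", "b2", "b3"], np.round(beta, 2))))
```

A test of the recovered interaction coefficient (beta[3]) against zero is the moderation test; in the actual study design, with clinicians nested in roughly 30 agencies observed at three time points, a multilevel model with agency random effects would be the appropriate estimator.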
In public health, the impact of policy is well documented: many studies have shown that seatbelt usage can prevent injury and death, and policies enacted to require seatbelt usage have resulted in reduced mortality and injury [17]. However, little is known about how policy impacts and interacts with organizational characteristics to affect provider behavior change [16]. We hypothesize that a policy mandate made by a city regulating agency (DBHIDS) may have a powerful impact on provider behavior change.

Methods/design

Aim 1: to evaluate child-serving CMHPs' response to a system-wide policy mandating implementation of EBTs

Aim 1 tests the causal relationship: Does a system-level policy impact implementation of EBTs in child-serving CMHPs?

Participants

There are over 100 CMHPs in Philadelphia that provide outpatient services to youth (information provided through personal communication, Community Behavioral Health, 2012). We will enroll at least 30 CMHPs; enrolled CMHPs will serve a combined total of at least 80% of youth receiving publicly-funded mental health services in Philadelphia. In each agency, we will recruit the executive director, the clinical director, and 80% of the clinicians (estimated 8 to 10 per agency). This will produce a total sample of 30 provider organizations, 30 executive directors, 30 clinical directors, and 240 to 300 clinicians.

Figure 1. Causal model. Outer context: system-level policy from the Department of Behavioral Health. Inner context: organizational variables (organizational culture, organizational climate, implementation climate) and leader-level variables (transformational leadership, attitudes), which are hypothesized to moderate the effect of policy on implementation of evidence-based treatments (i.e., fidelity to evidence-based treatment techniques and training in evidence-based treatments).
Procedure

We will prospectively measure CMHP response to the policy generated from the DBHIDS task force. Response is operationally defined as implementation of child EBTs. In order to have multiple indicators of implementation [18], we have defined implementation in two ways. The primary outcome is clinician fidelity to techniques used in child EBTs. The secondary outcome is more proximal to the policy change and reflects the reach of EBTs at the clinician level: the number of clinicians trained in a specific EBT at any given data collection point. We will measure these outcome variables three times over five years in enrolled CMHPs.

Measures

Dependent variable: implementation

In Aim 1, the implementation outcomes include fidelity to EBT techniques and number of clinicians trained in EBTs.

Fidelity

We selected fidelity, 'the degree to which an intervention was implemented as it was prescribed' [18], as the primary implementation outcome given its documented association with youth outcomes [19]. A number of other implementation outcomes could have been selected, such as acceptability, feasibility and adoption [18]. Ultimately, we decided to focus on fidelity because it is most proximal to youth outcomes, the desired end goal of implementing EBTs. Fidelity will be measured three times using self-reported clinician fidelity to EBTs and brief observation.

Therapist Procedures Checklist-Revised (TPC-R)

The TPC-R [20] is a 62-item psychometrically validated self-report clinician technique checklist that assesses components of EBTs used in session that cut across the most widely used modalities (cognitive, behavioral, family and psychodynamic). Factor analysis has confirmed the four-factor structure, test-retest reliability is strong, and the instrument is sensitive to within-therapist changes in technique use across child clients [20].
Therapy Procedures Observational Coding System – Strategies (TPOCS-S) [21]

Because self-reported fidelity often does not match actual behavior [22], and to avoid demand characteristics in reporting the use of EBTs, we will use brief observation to corroborate clinician self-report. We will randomly select 10% of therapy sessions in one week from a subset of the clinicians enrolled (n = 120) for observation. We will use the TPOCS-S to code for the presence or absence of EBT techniques and the intensity with which therapists use these strategies in session. The TPOCS-S is an observational measure of youth psychotherapy procedures. It shows good inter-rater reliability, and its five subscales (i.e., Behavioral, Cognitive, Psychodynamic, Client-Centered, Family) show good internal consistency and validity [21].

One of the challenges to measuring fidelity in multiple agencies is that agencies may elect to receive training in and implement different EBTs based on the populations that they serve. Therefore, it is necessary either to use different fidelity measures, many of which are not validated, across agencies based on which EBT they implement, or to use a general validated measure that allows for identification of common elements across EBTs. We selected the TPC-R because it is a psychometrically validated general fidelity measure that identifies fidelity to techniques (e.g., cognitive restructuring) used by the clinician that are non-specific to a particular treatment. We also elected to include an observational measure of practice to ensure that self-report is accurate.

Training

Our secondary implementation outcome comprises a numerical count of clinicians trained in EBTs. We will gather this information by asking clinicians to complete a brief survey regarding their training in the EBTs selected by the task force to be implemented.
We will also provide a list of EBTs and ask whether clinicians have been trained in any of the modalities, or have used them with one or more clients in the past year.

Aim 2: to examine organization- and leader-level variables as moderators of implementation of EBTs

Recent research suggests that organizational- [8] and leader-level variables may be important proximal predictors of implementation of EBTs. Because an outer context policy is a distal predictor of implementation, inner context variables likely play an important role in implementation success. We will examine organizational- and leadership-level variables as moderators of the association between policy and implementation.

Participants

See Aim 1. To measure organizational-level constructs such as climate and culture, 80% of clinicians from each CMHP will complete the measures described below. We will also collect relevant information from executive and clinical directors.

Procedure

In addition to the information gathered in Aim 1, we will prospectively measure organizational- and leader-level variables in CMHPs. Organizational variables include organizational culture, organizational climate, and implementation climate. Leader-level variables include transformational leadership, leader knowledge of EBTs, and attitudes toward EBTs. We will collect leadership data on both the executive director and the clinical director.

Measures

Organizational climate and culture

Organizational Social Context Measurement System (OSC)

The OSC [6] is a 105-item measure of the social context of mental health and social services organizations. The OSC measures organizational culture, organizational climate, and work attitudes. We considered a number of measures that assess organizational variables (e.g., Organizational Readiness for Change [23], Organizational Readiness for Change Assessment [24]).
However, the OSC is the gold standard in public sector settings in the United States and measures organizational culture and climate, two variables that are critical in our causal model. The OSC has national norms and can be used to create organizational profiles that are associated with organizational functioning. Further, the OSC has strong psychometric properties, including confirmation of the measurement model, acceptable to high reliability of responses, moderate to high within-system agreement, and significant between-system differences [25].

Implementation climate

Implementation Climate Assessment (ICA)

The ICA [26] is a 57-item scale that measures implementation climate and assesses the following constructs: educational support for EBTs, agency focus on EBTs, program focus on EBTs, agency recruitment of staff for EBTs, program selection of staff for EBTs, recognition for EBT use, rewards for EBT use, staff acceptance of EBTs, and supporting staff use of EBTs. Initial psychometrics are strong, with good face validity and alphas in the .88 to .94 range, suggesting adequate reliability [26]. No other measures exist in mental health services to measure implementation climate, a construct first identified as an important predictor of implementation in the business literature [12].

Leadership

Multifactor Leadership Questionnaire (MLQ)

The MLQ assesses transformational leadership in organizations and asks individuals to report on the extent to which the executive and clinical directors engage in specific leadership behaviors. The MLQ will be administered separately for each of the leaders (i.e., executive and clinical directors); therapists will report on their leaders, and leaders will also report on their own behavior. The MLQ [13] is a widely used measure that is validated across national and international settings and industries, including public sector services.
The MLQ is the gold-standard tool for measuring transformational leadership in the organizational literature, and psychometric analyses have confirmed the factor structure of the measurement model [27].

Attitudes

Evidence-Based Practice Attitude Scale (EBPAS)

The EBPAS [28] is a well-validated, 15-item self-report questionnaire that assesses constructs related to implementation of EBTs: appeal, requirements, openness and divergence. We selected the EBPAS because it is one of the most widely used measures in implementation science, and its psychometrics are very strong. The EBPAS demonstrates good internal consistency, with subscale alphas ranging from .59 to .90 [29], and its validity is supported by its relationship with both therapist-level attributes and organizational characteristics [30].

Knowledge

Knowledge of Evidence-Based Services Questionnaire (KEBSQ)

The KEBSQ [31] is a 40-item self-report instrument that measures knowledge of the common elements of EBTs. We selected it because it is the only knowledge questionnaire with psychometric data suggesting its reliability and validity in assessing knowledge of common elements of EBTs, specifically temporal stability, discriminative validity, and sensitivity to training [31].

The OSC, ICA, and MLQ will be aggregated across clinicians from each agency to create organizational-level constructs if exploration of the data supports this (i.e., concordance between reporters).

Dependent variables: implementation (fidelity and training)

See Aim 1.

Aim 3: to qualitatively delineate the implementation process for a subset of CMHPs

Through Aims 1 and 2, we will quantitatively estimate response to a policy on implementation of EBTs and moderators of implementation.
Activities under Aim 3 will result in qualitative data from a subset of agencies to understand key informants' perspectives on the implementation process, and will generate information about the mechanisms through which organizational- and leader-level variables may drive implementation [15,32]. We will use qualitative methods to expand and more deeply understand quantitative findings from Aims 1 and 2. We will use a purposive sampling strategy [33] to identify individuals who will participate in semi-structured interviews at two CMHPs that are high-performing, two that are average-performing, and two that are low-performing. The interviews will explore the views and perspectives of executive directors and clinicians regarding their experience with the