Abstract
Background: Practice facilitation (PF) is a promising but relatively new intervention supporting data-driven practice change. There is a need to better detail research-based facilitation methods, which must balance intervention fidelity and time restrictions with the flexibility the intervention requires. As part of a multi-level 4-armed cluster randomized clinical trial (RCT), 32 rural primary care practices received PF for 1 year. We evaluated the feasibility of having facilitators guide practices to perform 4 key driver domain activities, implemented as Plan-Do-Study-Act (PDSA) cycles, to better understand facilitation “exposure.” We describe the intervention and activity length so that our experiences may be useful to other PF research efforts.
Methods: Thirty-two practices serving rural patients involved in the Southeastern Collaboration to Improve Blood Pressure Control engaged with a facilitator to develop and implement PDSAs nested within key drivers of change domains. The number of months practices worked on the activities deemed most likely to be sustained was captured, along with practice satisfaction data.
Results: All practices engaged in at least 4 domain-level activities, and 59% of the PDSAs were active for at least 3 months. There was variation by domain in the average length of the PDSA activities. Ninety-seven percent (31 of 32) of practices would recommend similarly structured facilitation services to other primary care practices, and 84% (27 of 32) noted substantive changes in their care processes.
Conclusion: In this trial, it was feasible for PFs to engage practices in at least 4 Key Driver quality improvement activities within 1 year, which will inform PF methods and protocol development in future trials.
Introduction
Background
African American adults bear a significant burden of hypertension (HTN) and uncontrolled HTN.1,2 Research teams and funding agencies continue to work to understand how to improve outcomes for this population and to identify resources to support patients working to control their blood pressure and the primary care clinical staff who serve them.
Practice facilitation is a promising approach to redesigning care in which practice facilitators (PFs) guide primary care practice staff and providers to understand how to implement data-driven quality improvement (QI) strategies.3 Berta et al. describe practice facilitation as a concerted, social process that focuses on evidence-informed practice change and incorporates aspects of project management, leadership, relationship building, and communication.4 Practice facilitation interventions have been associated with improved care processes and outcomes for patients with asthma,5 diabetes,6 elevated risk for cardiovascular disease,7 and hypertension.8 Facilitated interventions have resulted in improved general testing behaviors for cancer,9 enhanced adoption of screening guidelines among practices,10,11 and improved workflows that support care for patients using opioids.12 Unlike traditional consulting models, PFs guide practices to become their own agents of change.13
Although the evidence for the impact of practice facilitation is building, it is still a relatively new strategy, and questions remain. For example, which methods of practice facilitation work in which settings? How can practices' exposure to PFs, or to the interventions that PFs guide, be captured? A recent meta-analysis of 43 practice facilitation interventions and their associations with practices' adoption of evidence-based guidelines noted a relationship between the “intensity” of facilitation (defined as the number of contacts and the length of time spent in meetings that included facilitators and practice stakeholders) and guideline adoption, but no relationship between the duration of the overall PF intervention and guideline adoption.14 More recent trials have captured the amount of time PFs spend in direct contact with practice stakeholders but have not included these time exposures as variables in their analyses.15,7
A basic tenet of PF activities is to use the Plan, Do, Study, Act, or “PDSA,” framework.16 PDSAs are a series of steps for gaining valuable learning and knowledge and for continuously improving an existing or newly implemented process.17 A PDSA cycle is an iterative 4-step management tool used in a variety of businesses to organize improvement activities and is rooted in the work of Shewhart, Deming, and others.16 It is the action step used to carry out small changes and rapidly understand whether those changes result in improved outcomes. In the “Plan” stage, goals and outcomes are defined; in the “Do” stage, the plan is carried out and data are collected; the “Study” stage evaluates whether outcomes improved; and the “Act” stage moves the process forward by making small, data-driven changes.18
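To make the 4 stages concrete, the minimal sketch below shows one hypothetical way a single PDSA cycle could be recorded; the field names and the blood pressure example are illustrative assumptions and do not come from the study's actual data collection forms.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical record for one PDSA cycle; field names are illustrative only.
@dataclass
class PDSACycle:
    aim: str                                  # Plan: the goal and expected outcome
    change_tested: str                        # Do: the small change actually carried out
    observations: List[str] = field(default_factory=list)  # Do: data collected along the way
    outcome_improved: Optional[bool] = None   # Study: did the outcome improve?
    next_step: str = ""                       # Act: adopt, adapt, or abandon the change

cycle = PDSACycle(
    aim="Repeat BP measurement after 5 minutes of rest when the first reading is elevated",
    change_tested="Medical assistants re-check BP before the clinician enters the room",
)
cycle.observations.append("Week 1: 12 of 15 elevated readings were repeated")  # Do
cycle.outcome_improved = True                                                  # Study
cycle.next_step = "Adopt the workflow and spread it to all rooming staff"      # Act
```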
Although there is no unified framework for evaluating the implementation of PDSA cycles, QI enthusiasts and researchers continue to rely on them to drive change in diverse organizations. In a systematic review, Taylor et al19 analyzed several key parameters to assess the efficacy of PDSA cycles, including the duration of both “single isolated cycles” and iterative chains of PDSAs. There is no set recommended length for PDSA cycles, which, by their very nature, can be brief. They can also be iterative, involving continuous small adjustments to processes until a group determines that the cycles are complete or should be abandoned. In Taylor's work, the authors noted a marked range in the duration of individual PDSA cycles. For instance, in 1 case, 3 cycles were completed in a single day, while in another example, 1 cycle took 16 months to complete.19
Our aim in this article is to share our experiences with PF in the Southeastern Collaboration (SEC) to Improve Blood Pressure Control cluster-randomized trial and to evaluate whether it was feasible for PFs to guide the implementation of at least 4 Key Driver Domain activities using the PDSA framework within a 1-year interval in all 32 primary care practices randomized to receive practice facilitation services. As a central tenet of PF is its highly flexible, customized approach, it was not clear whether our goal of at least 1 QI activity in each of the 4 domains within 1 year was feasible.
Herein we describe the SEC's practice facilitation intervention and its assessment framework in hopes that some of our approaches may be useful to others studying practice facilitation.
Methods
Setting
Details of the SEC cluster randomized trial are described elsewhere.20 Briefly, the aim of the trial was to understand whether a practice- or patient-level intervention, alone or in combination, delivered in primarily rural primary care and community settings in North Carolina and Alabama, could improve blood pressure control among African American adults with uncontrolled hypertension. Both interventions aimed to address social determinants of health at the practice and patient level, in addition to a traditional focus on standardizing care, engaging in data-driven practice change, using team-based interventions, and supporting patients in self-management activities. The practice-level intervention was practice facilitation, and the patient-level intervention was a telephone-delivered peer coaching self-management intervention. A total of 69 practices serving rural-dwelling citizens were engaged in the trial. In this 4-armed trial, practices were randomized to receive (1) 1 year of practice-level facilitation, (2) patient-level peer coaching for 25 recruited patients per practice, (3) both interventions together, or (4) enhanced usual care resources that were provided to practices in all study arms, which included access to online educational and self-management tools,21 home blood pressure cuffs for the 25 recruited subjects, and other patient-facing educational materials. PFs led the practices through change activities intended to improve hypertension control among their patients. Importantly, PFs were trained to guide practices to select at least 1 QI activity in each of 4 domains of a Key Drivers of Implementation framework, which supports moving people beyond knowledge acquisition to actual implementation in practice (Figure 1).22 As the study focused on enhanced HTN control, domain-level activities targeted HTN control (Figure 1 and Table 1 for HTN examples). In this project we partnered with leadership and seasoned practice facilitators from the North Carolina Area Health Education Center's Practice Support Program, who endorsed the use of PDSAs and the Model for Improvement to guide practices in using data to drive change; this informed our selection of this QI framework for the feasibility analysis.
The PF Intervention and 4 Key Drivers of Implementation
Thirty-two practices were randomized to receive the PF intervention and engaged with a facilitator via onsite and remote communication (video conference, e-mail, phone), through which they co-developed and implemented Key Driver domain-level activities using PDSA cycles and methods. Activities targeted enhanced hypertension control for all patients served. PFs continuously guided practices to use QI methods as part of their SEC study engagement and to continue using them indefinitely as other challenges arise.
The 4 domains of the Key Drivers of Change framework are (1) Clinical Information Systems (CIS), where data from a practice's own patients and practice experiences are used to drive change; (2) Standardized Care Processes (SCP), where care is standardized to assure that evidence-informed care is being delivered to all patients with a condition of focus and at all visits where appropriate; (3) Optimized Team Care (OTC), where clinical teams work together to share workloads and patient care tasks such that the burden of QI work does not fall on just a few; and (4) Self-Management Support (SMS), where clinical staff work with patients to take action to enhance their health and health outcomes by engaging in activities that are performed outside of the office setting.
PF Training
We hired 2 experienced PFs and 4 individuals with clinical, public health, or quality improvement experience for the study. Six PFs completed the 13-week PF Certificate Program at Millard Fillmore College of the State University of New York at Buffalo (Table 1). Priority topics included learning the basics of QI and data collection/measurement, understanding the chronic disease model,23 using tools to help practices map workflows, implementing PDSA cycles, and optimizing the use of care teams. The program concluded with a 40-hour field experience in which trainees received 1-on-1 mentoring by experienced PFs serving practices in Eastern North Carolina. As practice change requires staff and provider behavior change, the PFs were formally trained in motivational interviewing and engaged in role-playing experiences. PFs used an HTN study-specific implementation guide adapted from prior PF interventions in North Carolina and other publicly available resources.24 The guide's contents reinforced how to abstract practice-level HTN control data and how to capture and score, monthly, the breadth and depth of involvement of practice staff engaged in Key Driver domain-level activities using an ordinal scale, the Key Driver Implementation Scale (KDIS) (Table 2).
Intervention Rollout
Within 2 weeks of a practice's randomization to the PF arm, the assigned PF met with the practice staff and providers to review the study objectives, explain how they would be guided through a collaborative process of developing and implementing PDSAs that fit within the Key Driver framework, and emphasize that activities needed to target BP control among their entire patient populations, not just the patients recruited into the study.
Based on prior reports of suboptimal measurement techniques in community practices,8,25 facilitators were encouraged to engage practices early in the intervention period in Standardized Care Processes domain activities that addressed appropriate BP measurement. PFs also understood the need to engage practices in Clinical Information Systems PDSA cycles to help practices understand their HTN control data and determine whether change activities resulted in improved HTN control. Practices could engage in as many PDSA cycles as they wished, as long as at least 1 PDSA cycle was completed in each of the 4 Key Driver domains.
PFs met with their practices in person at least once monthly. Each practice selected a Practice Champion from among their staff who served as the point of contact for the PF and study staff. Champions were selected for their interest in BP control and level of enthusiasm for the study.
Data Collection
PFs' Monthly Assessments of Each Practice Using the Key Driver Implementation Scale (KDIS)
The PFs were asked to score each of their practices monthly using the KDIS, adapted for the SEC trial. The KDIS is an assessment instrument with 3 to 5 response options per domain that captures progress in each domain. PF-rated KDIS assessments have been correlated with practice-level outcomes (Figure 2).26,27 Early in the intervention period, discussions during the twice-monthly PF calls served to calibrate the PFs to approach and score their assessments similarly. The KDIS served the dual purpose of evaluating practice progress across the 4 domains and allowing the PF to view progress over time.
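As a sketch of how such monthly ratings might be organized, the example below assumes one integer score per domain per month; the domain abbreviations, score range, and function name are illustrative assumptions, and the actual KDIS response options are those defined in Table 2.

```python
from collections import defaultdict
from typing import Dict

# Illustrative record-keeping for monthly KDIS ratings (hypothetical structure).
DOMAINS = ("CIS", "SCP", "OTC", "SMS")

# kdis_scores[practice_id][month] -> {domain: ordinal score}
kdis_scores: Dict[str, Dict[int, Dict[str, int]]] = defaultdict(dict)

def record_kdis(practice_id: str, month: int, scores: Dict[str, int]) -> None:
    """Store one facilitator's monthly KDIS ratings for a single practice."""
    missing = set(DOMAINS) - set(scores)
    if missing:
        raise ValueError(f"Missing domain scores: {sorted(missing)}")
    kdis_scores[practice_id][month] = dict(scores)

# Example: ratings for one practice in intervention month 3.
record_kdis("practice_07", month=3, scores={"CIS": 2, "SCP": 3, "OTC": 1, "SMS": 2})
```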
Key Driver Domain of PDSA Cycles and Cycle Length
The PFs documented all PDSA activities, nesting each within its corresponding Key Driver domain, and recorded the start and end dates of each PDSA cycle along with the type of activity, how it related to BP control, and field notes about challenges encountered. Near the end of each practice's 12-month intervention period, PFs were asked to identify, in partnership with the practice champions, the PDSA activity in each Key Driver domain that they deemed most engaging and most likely to be included as part of practice behaviors beyond the end of the PF intervention. We identified these as the “best” PDSAs in the study database.
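As a minimal sketch of how cycle duration in months could be derived from these start and end dates, the example below uses a hypothetical record structure; the field names and function are not the study database schema.

```python
from datetime import date

def months_active(start: date, end: date) -> int:
    """Number of whole calendar months between a PDSA cycle's start and end dates."""
    return (end.year - start.year) * 12 + (end.month - start.month)

# Hypothetical "best" PDSA record for one practice and Key Driver domain.
best_pdsa = {
    "domain": "CIS",
    "activity": "Monthly report of patients with uncontrolled HTN",
    "start": date(2019, 2, 1),
    "end": date(2019, 8, 15),
}

duration = months_active(best_pdsa["start"], best_pdsa["end"])
print(f"{best_pdsa['domain']} cycle active for {duration} months")  # prints 6 months
```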
Results
Of the 32 practices, 14 were affiliated with our study's North Carolina sites (5 with the University of North Carolina and 9 with East Carolina University) and 18 with the Alabama (University of Alabama-Birmingham [UAB]) site. Fifteen of the 32 were community health centers, the mean number of full-time providers was 3.7, just under a quarter of the patients were Medicaid beneficiaries, and 58% were African American (Table 3). Most practices had some prior experience performing QI projects, and one-third had received patient-centered medical home recognition (Table 1). These descriptive data demonstrate the diversity of the practices in our cohort, which ranged from very small practices where a provider may be onsite only a few days per week to larger community health centers offering dental, social work, and other services along with traditional primary care.
PF Practice Case Load
The number of practices that received facilitation services per PF ranged from 5 to 11. The monthly caseload of practices ranged from 1 to 7 (Figure 3). The final workforce over the length of the study included 4 PFs. For UAB, 1 PF received training in case a substitute was needed (it was not), and 1 PF left the project early in the intervention phase and was replaced (shown in the first 2 rows as PF1a and PF1b in Figure 3).
PDSA Cycles in Each of 4 Key Driver Domains
All 32 practices successfully implemented at least 4 PDSA activities, including 1 from each Key Driver domain. Examples of PDSA activities grouped by domain are summarized in Table 4.
PDSA Activity Duration
There were 128 “best” PDSA cycles available for the analysis of PDSA cycle duration (see Figure 4, PDSA lengths by Key Driver). Overall, 76 (59.4%) PDSA cycles were maintained for at least 3 months. Within Key Driver domains, the proportion of activities maintained for at least 3 months was 72% for Clinical Information Systems, 66% for Self-Management Support, 53% for Standardized Care Processes, and 47% for Optimized Team Care. In addition, the spread of durations varied among the Key Driver PDSAs: some CIS and SMS activities were maintained over longer periods, whereas OTC and SCP activity durations showed relatively less variation (Figures 4 and 5).
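For readers who wish to replicate this kind of tabulation, the sketch below computes the share of cycles maintained for at least 3 months, overall and by Key Driver domain; the records shown are made-up placeholders, not study data.

```python
from collections import defaultdict
from typing import Dict, List

# Placeholder (domain, months active) pairs standing in for the 128 "best" cycles.
records = [
    ("CIS", 6), ("CIS", 2), ("SMS", 4), ("SMS", 1),
    ("SCP", 5), ("SCP", 2), ("OTC", 3), ("OTC", 1),
]

THRESHOLD = 3  # months a cycle must be active to count as "maintained"

by_domain: Dict[str, List[int]] = defaultdict(list)
for domain, months in records:
    by_domain[domain].append(months)

overall = sum(m >= THRESHOLD for _, m in records) / len(records)
print(f"Overall maintained >= {THRESHOLD} months: {overall:.1%}")
for domain in sorted(by_domain):
    months_list = by_domain[domain]
    share = sum(m >= THRESHOLD for m in months_list) / len(months_list)
    print(f"{domain}: {share:.1%} of {len(months_list)} cycles maintained")
```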
Discussion
In this article, we describe the SEC PF intervention and the framework devised to assess PF activities as they engaged their practices in Key Driver domain level PDSA activities. We demonstrated that at least 4 QI activities around improving BP control, 1 in each domain of our Key Driver framework, could be completed by each practice over a 1-year period. The more engaging and durable of these QI activities varied in their duration; 59% were sustained for 3 or more months. As practice facilitation, by design, is purposefully flexible and adaptive to practice needs, we feel this is an important step in identifying ways to capture “exposure” to facilitated interventions in research trials performed in real-world settings.
We observed differences in the duration of PDSA cycles by domain. We were not surprised to observe a longer duration of activities in the Clinical Information Systems domain. In our ongoing and past trials involving facilitation services, we note that practices often need significant time and guidance to learn how to access and report reliable patient- and population-level data, especially practices new to electronic health records or reporting requirements.25,26,8,19,27 In contrast, engagement in activities in the Standardized Care Processes domain required less time; in general, these are activities more familiar to clinicians, with long-standing practice guidelines emphasizing evidence-informed care, as reflected in the fact that 84% of practices reported prior experience with QI. The team-based care domain was a newer concept for many of our SEC practices but was deemed easier to accomplish in shorter timeframes than the other Key Driver PDSA cycles, perhaps because these activities involved small groups of staff within the confines of their own practices. In contrast, Self-Management Support activities, which rely on the inclusion of patients and may unfold over time or over several office visits (for example, documenting home BPs or working toward self-care goals), required more time. Such issues of locus of control, among other factors, can influence the length of time needed to develop, implement, and complete improvement activities. Future facilitation trials may find these observations helpful as they design their interventions.
Regarding prior literature, authors have shared descriptive information about how long some aspects of PDSA cycles lasted, but few have compared the relative duration of PDSA cycles in the larger context of Key Drivers of Change. We are not aware of prior reports that include data on the duration of time spent on various types of PDSA cycles. In fact, we are not aware that prior authors have attempted to estimate the amount of time practices are exposed to PFs or the impact of facilitation services on the additional time practices spend on improvement activities when their PF is not directly involved.18,28 The systematic review by Taylor et al28 includes multiple manuscripts that provide some information on “temporal duration of cycles,” but the details are limited. For instance, in 1 study, a practice implemented a process for performing diabetic foot examinations by dedicating 2 weeks of staff time to doing so,29 while in another study, the authors described a “Do” phase of a PDSA cycle that lasted 2 months,30 without further quantification. Other authors noted how often subsequent PDSA cycles occurred to reassess how well a practice maintained a desired behavior,31 how long an effort's initial PDSA cycle took,32,31 or how long an entire project lasted,33 but none attempted to ascertain whether a certain number of improvement activities could be completed within a set period or explored whether certain activities could be maintained over time, as we did in our study. Taylor calls for the research community to continue to explore the extent to which PDSA methods are successfully deployed so that conclusions can be better drawn from such studies going forward.
It is noteworthy that, although we have yet to analyze the BP outcome data in the parent trial and thus do not yet know whether BP control was enhanced in any of the 4 study arms, our practice satisfaction survey data, obtained within 1 month of the end of the PF intervention period, demonstrate that 31 of 32 practices (97%) receiving facilitation services would recommend PF to other similar practices. In addition, 27 of 32 practices (84%) reported making substantive changes in the management of patients with hypertension as a result of engaging with their PF, which may provide some confidence that, even without outcome data, other practices may be interested in and able to implement similar changes.
Strengths and Limitations
Strengths of our study include the engagement of rural primary care practices in the Southeast, an area with a high burden of chronic diseases. The PF intervention was developed from a foundation of prior work, and a highly experienced PF (MM) led the ongoing calls throughout the intervention period, providing invaluable advice. Our findings need to be interpreted in the context of some limitations, 1 of which is that the 12-month intervention period effectively truncated PDSA cycles initiated near the end of the year, which would bias toward observing shorter PDSA cycle durations. Although 3 of the 4 PFs did not feel that this was a factor in their work, as they started most PDSAs early in their engagement with their practices, 1 facilitator shared that she often waited until closer to the last quarter to start the Self-Management Support PDSA work; thus, the data for that specific domain could be biased toward shorter lengths than the other domain PDSAs. Other limitations include that our method for capturing time spent engaged in an activity was based on report rather than direct observation. Future studies could consider logs or other methods to more precisely assess the time practice staff spent on a specific PDSA cycle. We also included only the most engaging (known as the “best”) activity in each Key Driver domain at each practice; it was beyond the project's scope to exhaustively inventory every QI activity undertaken by the practices over the intervention year. Lastly, we acknowledge that our use of the term “PDSAs” may not align with their original intent or with how the QI community views PDSAs. As they are by design rapid-cycle processes that build on each other, we recommend that, going forward, we call them “Key Driver Domain-Specific Improvement Activities.” However, as we used the PDSA term throughout our study and with our practice, research, and community stakeholders, we agreed to use this taxonomy in this and other SEC study scholarly products.
In addition, for the research community, it is worth noting that our PFs shared that, ideally, they would have had 18 months to engage practices in our intervention. They felt that the initial 3 to 6 months with practices, many of which were new to QI and research, might be most productive if dedicated to relationship building and formative processes, such as working with practices to understand how, and why, they should engage in data-driven QI and practice-based research before a steadfast focus on implementing PDSA cycles. Such foundational work could affect the length and other aspects of how PDSA cycles are subsequently implemented. Despite the need to accomplish this work in a year, feedback from practice leadership, collected 1 month after the intervention ended, overwhelmingly supported that the experience was positive and that practices would recommend facilitation to other primary care practices. In addition, several of our experienced PFs noted that this method of focusing on PDSA length and completing at least 4 domain-level activities within a year could be replicated in other practices and when working to improve outcomes for other health conditions. The clear focus on performing activities by domain was seen as organizing, standardizing, and clearly articulating expectations for practices in this study, in contrast to other facilitated projects in which time frames and expectations were less well defined.
Conclusions
We demonstrated the feasibility of implementing and sustaining practice-based QI activities that map to 1 of 4 domains of Key Drivers of Change over a 1-year period. The PFs were able to assess the number of months each practice spent on each QI activity. Although the method warrants further validation, our preliminary findings suggest there is variation in the length of activities at the Key Driver Domain level. Our experience supports that it is feasible to use this method and the Key Driver framework in the setting of a pragmatic trial to capture exposure to PF services. This method may be particularly important to consider as other research teams develop study protocols and research strategies that include PF interventions that, by nature, must allow for the flexibility required to engage clinical practices in trials.
We fully appreciate that our objective was to understand whether it was feasible to implement Key Driver activities within this 1-year period, and we invite others to add to the evidence regarding the impact of practice facilitation and to learn from our experiences. Through this collaboration, we can continue to explore various methods that may help us better understand the dose, context, and structure of how facilitation services are implemented in and among clinical settings and, ultimately, to analyze how such factors are related to key study outcomes.
Acknowledgments
We wholeheartedly thank Muna Anabtawi, Alyssa Adams, and Paula Lipman for their support during the development of this manuscript; they each brought their own individual and essential expertise to the work.
Notes
This article was externally peer reviewed.
Funding: This work was supported by the National Heart, Lung, and Blood Institute 4UH3HL130691-04 and National Institutes of Health, through NC TraCS Grant Award Number UL1TR0024899. Clinicaltrials.gov# NCT02866669.
Conflict of interest: The authors have no conflicts of interest to disclose.
To see this article online, please go to: http://jabfm.org/content/34/5/991.full.
- Received for publication April 1, 2021.
- Revision received June 15, 2021.
- Accepted for publication June 15, 2021.