Abstract
Purpose: This project evaluated the real-world usability and usefulness of a revised version of the published Agency for Healthcare Research and Quality “Improving Your Office Testing Process” toolkit, which is designed to help primary care practices standardize and systematize laboratory testing processes.
Methods: We used a multiple case study approach to evaluate toolkit implementation in 2 primary care practices with existing quality improvement (QI) infrastructure. We collected qualitative data at baseline, midpoint (3 to 4 weeks), and follow-up (7 to 8 weeks postimplementation). Data included key informant interviews and practice site observations. Nineteen clinicians and staff participated in the interviews. Thematic analysis was used to summarize (1) how practices used the toolkit for guiding lab testing process improvement (usefulness), and (2) ease of use and practice experience with using the toolkit (usability).
Results: The toolkit was perceived as easy to use and easy to follow step by step. Two components of the toolkit were particularly useful: guidance on data gathering to inform quality improvement and tools for effective practice-patient communication. The toolkit's practice and patient assessments facilitated practice-specific insights into the lab processes considered most harmful to patients and informed improvement activities.
Conclusion: The usability and usefulness of the toolkit were related to characteristics of the toolkit itself (adaptability, simplicity, design quality and packaging, and guidance in planning) and to practice processes (presence of practice champions and implementation teams). In 2 practices in which laboratory testing process improvement was a high priority and well-established QI infrastructure existed, the toolkit was easy to use with little technical assistance.
- Health Services Research
- Implementation
- Patient Safety
- Primary Health Care
- Qualitative Research
- Quality Improvement
- United States Agency for Healthcare Research and Quality
Family and internal medicine clinicians order laboratory tests for nearly one-third of patient encounters,1 and an estimated 15% to 54% of medical errors in primary care are attributed to laboratory testing processes.1,2 Laboratory testing errors are more likely than other ambulatory errors to be associated with an increased risk of harm.3 A lack of systematic, standardized laboratory testing processes is noted as an underlying contributor to these patient safety concerns.4 Small- to medium-sized primary care practices frequently lack formal, standardized, and efficient procedures in the overall testing process.5,6 To address this unmet need, the Agency for Healthcare Research and Quality (AHRQ) “Improving Your Office Testing Process” toolkit (henceforth, “the toolkit”) was developed to be a comprehensive, evidence-based set of quality improvement tools for practices to standardize and systematize lab testing processes known to improve patient safety.7
The toolkit, developed by Eder and colleagues,7 consists of instructions to help practices (1) assess patient safety processes in need of improvement and (2) plan, implement, and evaluate changes in workflows, roles, documentation, and patient communication materials. Before the pilot study described in this article, we engaged stakeholders (clinicians, lab staff, patients, and the toolkit creators) in improving the original version of the toolkit. Key modifications included both the look and feel of the toolkit (eg, parallel construction and page format across sections and consistent terminology) and framing of the content (eg, explaining the purpose of each step in the toolkit to facilitate selection of relevant components).8 The need remained to examine the extent to which the revised toolkit (Table 1) was practical and feasible for use in real-world primary care settings and suitable for widespread implementation and dissemination.5 The objective of this article is to report the evaluation results from a pilot implementation of the revised toolkit.
The evaluation questions were (1) to what extent was the revised toolkit easy to use in real-world primary care settings? Specifically, what were practice experiences with implementation of the toolkit, including barriers and facilitators to its use? and (2) to what extent was the toolkit useful for guiding lab testing process improvements? What changes, if any, to lab testing processes were made based on toolkit guidance?
Methods
Design Overview and Framework
The toolkit implementation evaluation design was a multiple case study approach, suitable for understanding phenomena in “real-life context.”9 Two practices pilot tested the use of the revised toolkit with minimal to no assistance from the evaluators. The evaluation framework was the Technology Acceptance Model (TAM).10 According to the TAM, perceived ease of use (usability) and perceived usefulness of a system are the key drivers of attitudes, intentions, and actual use of a system. TAM constructs and definitions informed the qualitative interview guide topics (described below). The toolkit primarily provides guidance on the quality improvement (QI) process as it applies to a practice's laboratory testing process. The term toolkit implementation refers to how practices followed the toolkit's steps and tools to prioritize, plan, and act on planned lab testing process changes.
Setting and Participants
Two primary care practices participated between April and August 2017, 1 from each of 2 practice-based research networks (PBRNs): the Building InvestiGative practices for better Health Outcomes Research Network and the American Academy of Family Physicians (AAFP) National Research Network. The practice eligibility criteria were primary care specialty (internal or family medicine), more than 4 clinicians, ability to prioritize participation during the project period, and interest in improving laboratory testing processes. Recruitment began with an email from PBRN staff to member practices, containing a brief project description and inviting them to contact the project staff. Twenty-one practices expressed interest, and 2 were selected based on eligibility and ability to prioritize the project during the required timeline. These practices included a large (>10 clinicians) family medicine (FM) residency practice and a medium-sized (5 to 10 clinicians) general internal medicine (GIM) practice. Each practice identified a primary practice contact and convened an implementation team of 4 to 5 practice personnel (clinicians, practice managers, and laboratory and medical support staff). Practices received $2,500 for their participation.
Procedures
Table 2 summarizes the evaluation events, data collection schedule, participants, and duration of each activity at each practice. The evaluation team included a health services researcher (BK), a qualitative researcher (DF), and a research assistant (PF), with input from an interdisciplinary group of researchers and clinicians.
Instruments and Evaluation Procedures
We developed baseline, midpoint, and follow-up semistructured interview guides to evaluate toolkit usability and usefulness based on the TAM. Baseline interview guide topics included motivation to participate, priorities for improvement, anticipated barriers to toolkit implementation, current lab testing process steps, and current QI activities. Midpoint and follow-up topics included overall experiences with improving lab testing processes using the toolkit; changes made to lab testing practices (toolkit usefulness); whether toolkit elements were helpful or not helpful, easy or difficult to read, understand, and follow, efficient or inefficient, and feasible to use without external resources, along with adaptations made (usability); and suggested toolkit changes.
A practice representative completed a practice characteristics survey. Following the initial site visit and without assistance from the evaluation team, practices used the toolkit in any way they chose (ie, in any order, using or not using any components, and changing components) to guide process improvement of a self-identified laboratory testing concern. The one exception was that the evaluation team provided each practice with a current-state process map within 2 weeks of the baseline site visit. The evaluation team conducted a half-day follow-up site visit. On discovering that process changes were planned but had not yet occurred at follow-up, the evaluation team contacted the practice contact again 1 month later to assess progress toward implementing the planned changes. Key informant interviews were conducted by experienced qualitative researchers (BK and DF).11 Interviews were recorded and professionally transcribed verbatim.
This protocol was approved as exempt by the Colorado Multiple Institutional Review Board and the AAFP institutional review board.
Analysis
Evaluators maintained detailed notes in the semistructured interview guide template during all interviews, discussed high-level themes immediately following interviews, and prepared joint memo forms documenting themes and observations. Using a case-based matrix approach, 1 team member (PF) organized notes into relevant sections of a matrix.12 Evaluators reviewed the content of the matrix and then met to confirm that laboratory testing process descriptions were accurate and to identify toolkit usability and usefulness themes within and across dimensions of the matrix (ie, where practices reported similar vs different experiences) using a cross-case synthesis method.13 Audio recordings and transcripts were used to verify accuracy and to select quotations to contextualize key themes. After preparing practice descriptions and summaries of identified themes, the practices reviewed the findings to ensure conclusions were consistent with their experience.
Results
Results include descriptions of practice characteristics and context (Table 3), toolkit implementation processes, and perceived toolkit usability and usefulness.
General Toolkit Implementation Process
The implementation teams in each practice followed similar steps: assessment, planning, implementation, and reassessment. Each practice convened meetings with its implementation team, reviewed the toolkit, and selected assessments to use from the toolkit. The teams then met approximately weekly during the implementation period to interpret the data generated from the baseline assessments, select relevant portions of the toolkit, and brainstorm possible solutions.
Time to Implement the Toolkit
The toolkit implementation period was expected to last 6 to 8 weeks. At follow-up, the GIM practice reported this timeline was adequate (although it had not actually implemented planned changes within the 8-week time frame), whereas the FM residency practice reported it would need an additional 4 weeks to implement planned changes. Practices reported the timeline was affected by implementation team medical leave, staff and provider turnover, and challenges with scheduling meetings, in addition to the time required to gather data as instructed by the toolkit.
Toolkit Usability
Overall, both practices reported a positive experience with the toolkit, which they viewed as a helpful, easy-to-use, step-by-step guide. Practice members believed that breaking down the lab testing process into multiple steps, as depicted in the Improvement Process section of the toolkit, was key to making the project feel manageable.
“Making improvements in that bubble is sort of an insurmountable task. So breaking it down by the steps and identifying potential gaps in a particular step is great.” (Practice manager, GIM practice)
Both practices appreciated that they could select components most relevant to their practice and could adapt existing templates. For instance, both practices skipped the Assessing Office Readiness section. The toolkit provided a starting point for assessment and planning materials, which helped expedite the QI process.
“It's been nice…to just make copies of something and not have to worry about recreating the wheel.” (Faculty, FM residency)
Both practices needed to make modifications to at least 1 component of the toolkit to fit their practice (eg, changing the font size and changing response options to match practice standards), but this was not perceived as burdensome.
The practices described the toolkit as appropriate for team members both with and without QI experience. Those new to QI tended to follow the guide in order. Those with mature QI skills tended to quickly scan the toolkit and select the pieces perceived as relevant.
“It's been nice because…we've got people at different comfort levels with quality improvement process. This is nice for all-comers. Someone like <the lab manager> who has done a lot of QI stuff here, this is probably nothing that is news to her.” (Faculty, FM residency)
Facilitators
Practices reported a supportive practice environment that allowed them to allocate time, leverage existing QI resources and standing meetings, and motivate team members to work on the project. Both practices perceived the project as a high priority, partly because it shared goals with other practice initiatives. Furthermore, both practices experienced regular inefficiencies and frustrations with laboratory testing, from confusing interfaces for ordering tests to trouble communicating effectively with patients. Medical assistants volunteered to work on the project because they faced these frustrations daily, even if they had already met employer expectations for participating in QI projects that year. Leaders and managers were perceived as supportive of the project; they permitted the allocation of administrative time for the implementation team to meet, gather assessment data, and design and implement changes, as well as time to present the project at all-hands practice meetings.
Barriers
Although use of the toolkit itself was perceived as fairly efficient, practices felt it was hard to maintain momentum in the midst of normal busy practice activities. Both practices experienced periodic delays due to unexpected medical leave and team turnover, as well as normal staffing challenges due to limited availability of medical assistants. It was especially challenging to keep a QI project top of mind in the residency practice, given that the project spanned the annual resident turnover and residents' cycling through rotations. Finally, the anticipated project timeline was too short: although we expected the project to take 6 to 8 weeks, in both practices this was only enough time to conduct baseline assessments, brainstorm improvements, and plan their implementation. The time required to implement the toolkit and make process changes was largely attributed to waiting between meetings, waiting for responses from others in the practice completing assessments, and time out of the office, rather than to the work itself requiring significant time or effort.
Toolkit Usefulness and Opportunities for Improvement
Both practices reported that the most useful components of the toolkit were the patient assessments and handouts, the “Assessing Your Testing Process Survey” (completed by practice clinicians and staff), and the brainstorming tools to aid in selecting aspects of a laboratory testing process to target for improvement. These tools provided the practices with the data necessary to identify the step in their lab testing process in greatest need of improvement. Before the project started, each practice had specific ideas about which parts of its lab testing process it wanted to improve; following the assessments, both practices refocused on different aspects of the testing process. Beyond the value of the assessments, both practices felt the toolkit fell short in guiding the design and implementation of practice changes; instead, they relied on their own creativity, insights from others in the practice, and past QI experience to design improved processes. They felt case stories from other practices and other evidence-based guidance on planning and implementation would improve toolkit usefulness. Practices reported that the current-state process map activity led by the evaluation team (intended as data collection rather than intervention) was useful and recommended adding process mapping guidance to the toolkit; the final revised version of the toolkit reflects this addition. Table 4 summarizes the 2 practice case stories, highlighting which tools they used, how they adapted the tools, and what changes the tools informed.
Discussion
The AHRQ “Improving Your Office Testing Process” toolkit was perceived as useful and usable by the 2 practices that tested it. The step-by-step nature of the toolkit led to the perception that laboratory testing process improvement is manageable. The pilot practices demonstrated that not all components of the toolkit need to be used to make meaningful changes; rather, practices were able to select the tools most relevant to their purposes and adapt them as needed. Notably, the toolkit is not prescriptive in describing an ideal laboratory testing process; primary care practices may have many variations of the process. Instead, the toolkit helps practices objectively and systematically assess their existing process and address potential pitfalls in patient safety and clinic efficiency.
Both practices used the toolkit with little assistance from the evaluation team, aside from the process mapping activity (since added to the toolkit), suggesting that extensive external technical assistance may not be needed in similar settings. The need for little technical assistance is a promising result, given that a previous study found that practices were generally unable to implement a laboratory testing toolkit without external assistance.5 The participating practices may have found the toolkit easy to use because of their supportive practice environments for QI and their priorities around laboratory testing process improvement. Practices that require foundational work on QI may need assistance and training in essential competencies for effective teamwork to deliver safer, patient-centered care.14–17
Although the active time and effort spent using the toolkit were not perceived as a burden, the expected 6- to 8-week implementation period was too short to both plan and implement changes in the laboratory testing process. This is consistent with another toolkit implementation study, which found that a 6-week implementation period may have been insufficient.5 Thus, about 8 to 12 weeks of active effort may be a more reasonable expectation, depending on the scope of the changes to be made and the practice's readiness to make them.
Although there may be other guides for improving office safety, they do not typically cover the entire process from clinical decision to return of results. For example, the Safety Assurance Factors for Electronic Health Record Resilience Guide on lab safety focuses on the electronic health record components of the process.18 We have not found another primary care lab safety guide produced with a practice- and patient-engaged approach. Patient and practice stakeholder input may have helped create a more useful and usable toolkit.
The results of this study reflect known factors associated with implementation of practice change, such as those described by the Consolidated Framework for Implementation Research (CFIR).19 In particular, the toolkit was implemented in 2 practices with a supportive “implementation climate,” defined in CFIR as “The absorptive capacity for change, shared receptivity of involved individuals to an intervention and the extent to which use of that intervention will be rewarded, supported, and expected within their organization.” Specifically, the frustrations with existing laboratory testing processes that drove clinicians and staff to volunteer for the implementation teams reflect 2 key elements of implementation climate: tension for change, “The degree to which stakeholders perceive the current situation as intolerable or needing change,” as well as relative priority, “Individuals' shared perception of the importance of the implementation within the organization.” Furthermore, both practices exhibited alignment with the CFIR construct of readiness for implementation, “Tangible and immediate indicators of organizational commitment to its decision to implement an intervention,” in terms of leadership engagement and available resources.
A number of other CFIR factors appear to have supported the toolkit implementation, some related to practice processes (presence of practice champions and formally appointed implementation leaders) and some related to characteristics of the toolkit itself (adaptability of the toolkit, toolkit simplicity and ease of use, design quality and packaging, and guidance in planning). Finally, although the evaluation team attempted to limit their influence on, or facilitation of, the toolkit implementation, there was likely some effect of the evaluation on the practices' attention to and thoughtfulness toward the project (reflecting both the external change agents and the reflecting and evaluating CFIR constructs).
Limitations
Results may not be generalizable to all practice types and contexts, given that this study was conducted in only 2 practices, both of which had existing QI knowledge and capability. This evaluation did not assess effectiveness for improving patient safety outcomes or quantify the cost and resources required for implementation. Due to the short implementation period, it was not feasible to assess the long-term effects or sustainability of toolkit use or of the QI initiatives it triggered. We did not compare the toolkit with other QI methods.
Conclusions
Primary care practices often lack the means to assess and implement process improvement; toolkits are 1 possible way of supporting practice-improvement efforts. Despite some beliefs that toolkits are not used and not useful, our study demonstrated that in practices with established QI infrastructure, a toolkit that is informed by stakeholders, well designed, and relevant to practice priorities is a welcome resource. Implementing QI processes with this lab safety toolkit required little technical assistance, and the toolkit was perceived as usable and useful for informing laboratory process improvement. Future research should evaluate the toolkit in a larger, more diverse set of practices; its impact on patient safety outcomes; and the interrelationship between use of the toolkit and other QI efforts, such as the patient-centered medical home. The final revised toolkit can be found at https://www.ahrq.gov/professionals/quality-patient-safety/hais/tools/ambulatory-care/labtesting-toolkit.html.
Acknowledgments
The authors thank all participants of this project and acknowledge the AAFP National Research Network members and staff for providing essential support for this project.
Notes
This article was externally peer reviewed.
Funding: This work was supported by Contract: HHSP233201500025I, Task Order: HHSP23337004T (Adapting and Implementing Patient Safety Practices in Ambulatory Care).
Conflict of interest: none declared.
To see this article online, please go to: http://jabfm.org/content/32/2/136.full.
- Received for publication April 6, 2018.
- Revision received November 3, 2018.
- Accepted for publication November 6, 2018.