Research Article | About Practice-Based Research Networks

Preventing the Voltage Drop: Keeping Practice-based Research Network (PBRN) Practices Engaged in Studies

Barbara P. Yawn, Allen Dietrich, Deborah Graham, Susan Bertram, Marge Kurland, Suzanne Madison, Dawn Littlefield, Brian Manning, Craig Smail and Wilson Pace
The Journal of the American Board of Family Medicine January 2014, 27 (1) 123-135; DOI: https://doi.org/10.3122/jabfm.2014.01.130026
Barbara P. Yawn, MD, MSc; Allen Dietrich, MD, MPH; Deborah Graham, MSPH; Susan Bertram, BSN, MSN; Marge Kurland, BSN; Suzanne Madison, MPH; Dawn Littlefield, CSC; Brian Manning, MPH; Craig Smail, MPH; Wilson Pace, MD

Author affiliations: the Department of Research, Olmsted Medical Center, Rochester, MN (BPW, SB, MK, SM, DL); the National Research Network, American Academy of Family Physicians, Leawood, KS (DG, BM, CS, WP); and the Dartmouth Medical Center, Dartmouth, NH (AD).

Abstract

Introduction: Practice-based research continues to evolve and has become a major methodology for many pragmatic studies. While early practice-based research network projects were usually short term, current studies often introduce or compare practice innovations that require long-term evaluation. That change requires practice sites to remain engaged in research work for up to 5 years, a span long enough for a significant “voltage drop,” or decline in active participation, to occur.

Methods: Over the past 15 years we have developed and adapted several strategies to facilitate and encourage the continued active engagement of practices in practice-based research network studies of up to 5 years' duration. The concepts, details, evaluation, and results (when available) of the strategies are described.

Results: Eight strategies that enhance practice sites' attention to enrollment, data collection, and continued use of the implemented practice change are described.

Conclusion: The loss of momentum, or “voltage drop,” that happens in longer-term practice-based research network studies can be addressed using multiple strategies.

Keywords: Methods, Practice-based Research Network, Program Effectiveness, Randomized Controlled Clinical Trials, Training

Practice-based research network (PBRN) research has become a valuable method of identifying and answering questions relevant to primary care patients and practices. As PBRN research has matured, studies have evolved from simple, short-term observational studies to longer-duration interventional studies that involve all members of the practice and enrolled patients.1 PBRN methodology literature also has progressed from reports of how to begin PBRNs2–6 to publications relevant to conducting longer-term and more complex studies in established PBRNs.7–12

Practices usually begin a study with significant enthusiasm and willingness to learn and fulfill the study requirements.13,14 As time progresses, the reality of everyday life in a busy practice may usurp the time and effort needed for continued study participation. We have observed this phenomenon in our own studies, and we have labeled this participation decline as “voltage drop” (A. Dietrich, MD, personal communication).

Regular study visits to local or regional PBRN practices by practice enhancement assistants, practice enhancement and research coordinators, or regional study team members may reduce the loss of study momentum.15,16 Such activities seem to improve practice outcomes17–19 but are not feasible for large, geographically dispersed studies that work to achieve a nationally representative sample of practices.

Over the past 10 years of working with practices to complete pragmatic trials3 we have identified several methods to facilitate initial engagement by the practices and to maintain the initial high level of enthusiasm and willingness to participate over the long study period. This report describes these strategies and their impact in order to support other researchers in implementing and enhancing their work in PBRNs. Other PBRNs have been engaged in similar translational and practice change implementation studies, so the strategies presented here may not be unique to our work, but to date they have not been published in the PBRN literature.

Methods

The strategies described in this report come from information collected over the past 10 years in 4 large, federally funded studies, each lasting up to 5 years (Table 1). We believe that these 4 long-term studies are most representative of the difficulties we have encountered in sustaining the practices' interest and commitment over several years in projects that introduced a significant practice change. Despite the variety of topics and varying study designs, many of the same implementation strategies and support techniques proved to be useful across studies. Each of the studies introduced a practice change: using spirometry to evaluate patients with asthma and chronic obstructive pulmonary disease20; introducing screening, diagnosis, and management support for postpartum depression21–24; using new asthma tools (asthma APGAR tools) in managing children and adults with asthma25; and assessing the effect of 2 classes of long-acting bronchodilators on exacerbation rates in black adults with asthma.

Table 1. The Four PBRN Studies Included in this Study

We included 96 practices in the 4 large pragmatic studies mentioned above. Of the total, 87 were family medicine practices, 5 were pulmonology practices, and 4 were pediatric practices. In each study except the spirometry study, we lost 1 to 4 practices because of a variety of problems, including dissolution of practices, physician illness with prolonged disability, and inability to recruit patients (Table 1). The practices included 576 physicians and 52 other clinicians (nurse practitioners and physician assistants), and each practice included one or more other practice study leaders, usually a practice nurse, social worker, or medical assistant.

While no specific evaluation activities were designed to rigorously test the value of any of the strategies, formal and informal evaluation strategies were used across all studies. Following central training, each attendee was asked to use a Likert scale to answer questions about the perceived value of the training, timing of the training, specific content related to study procedures, and information on the practice change. At the close of each of the studies, qualitative data were collected using semistructured interviews with the practice leaders and, in some cases, one or more other members of the practice teams. The interview included questions about the support provided by the central site; sessions to help solve implementation issues in the local practice; time constraints; which, if any, parts of the study intervention were now a “routine” part of the practice; and which elements would be continued and which elements were not feasible. The interviews also included open-ended questions allowing the practice leaders to make any other comments or suggestions they desired. Data from the interviews were analyzed by members of the research group who were familiar with qualitative research. Informal data sources included anecdotes and comments during case reviews, and calls by the liaison and principal investigator with the site leaders during the course of the studies.

Results

We report 8 general strategies (Table 2) that our team identified as useful for maintaining engagement in PBRN studies of up to 5 years' duration. Each strategy is designed to help prevent or overcome practice study fatigue and disengagement. Table 2 expands on the characteristics of each of the strategies beyond the comments presented here.

Table 2. Summary of Elements of the Eight Methods

1. Centralized Training

Garnering sufficient time from busy practice staff for education about study details is difficult. Trying to train an entire practice about all the details of any study is unrealistic. To overcome these problems, we identified 2 study champions or site leaders at each practice: one a physician and the other a member of the nursing staff. These 2 people became the study “leads” and attended the centralized training.

Centralized training was usually done over a weekend in conjunction with another primary care meeting or at a “vacation” site to enhance interest in attendance. The sessions were designed to introduce study background and need, study procedures such as enrollment and informed consent, and the practice change or intervention; the format allowed interaction among site leaders and between site leaders and the central staff of the study. Funding for the central training was included in the budget for each study.

Training sessions were led by different members of the central team to allow the site champions to meet everyone on the central team. While the first 3 to 4 hours were primarily didactic, the rest of the sessions were interactive, practicing informed consent or use of the intervention tools.

When appropriate, randomization assignments were announced at the training session so that support could be provided to the “usual care” or control group, explaining their important role and reminding them that they would move to the intervention arm within 18 months. The control practices often realized quickly that they had a smaller initial study burden and would receive the benefits of any successful interventions after the study intervention was complete.

At the close of the training session, each site was provided with a condensed slide set prepared for the site leaders to use to educate the rest of their practice members. Asking the site leaders to lead the presentation ensured that they understood the project sufficiently well to present the project to their colleagues and supported their identification as the site study leaders and champions.

All materials presented during the central training sessions were provided to each site in a binder that included all slides and handouts from the central training. The materials supported study refreshers between practices and central study team members on an as-needed basis.

Immediately after its completion, the centralized training received reviews of “very good” to “excellent” for meeting expectations, understanding of the intervention, and clear, readable slides. When presenting to their local practices, the practice leaders repeatedly commented that they were able to do this only because they had attended the centralized training. Site leaders also commented on the importance and value of interacting with other PBRN practices; for many sites this was the first PBRN study in which they had participated. One of the most common requests was to have a second meeting of practice leaders at the mid-point or close to the end of the study. We interpreted these requests as an endorsement of this type of training. The centralized training engages the attendees and helps them gain an in-depth understanding of the research question and methods for the study.

2. Practice Liaisons

Each practice was assigned a central site study liaison, with each liaison working with 4 to 10 practices, depending on the study complexity and other duties of the liaison. The liaison maintained weekly to biweekly contact with the practice leaders and was the first line of communication between the central site and the practice.

The relationship between the liaisons and the practice leaders was initiated at the centralized training and was instrumental to continued practice engagement. The liaison worked to develop a long-term personal relationship with the practice to allow easy and honest communication between the practices and the central study site. The liaison was usually the first to notice signs of slowed enrollment or failure to use the intervention, as suggested by delays in scheduling or completing regular liaison site contacts. By brainstorming with the site leaders, the liaison, with the assistance of the central study investigators, was able to find reasons for slowed participation, disengagement, or “voltage drop” and to develop strategies to solve the identified problems.

The importance and value of the site liaisons was a recurrent theme in the exit interviews for each of the completed studies. In all interviews, the practice leaders knew their liaisons by name, often commented that they kept the liaison's phone number immediately available, and enjoyed having the long-term relationship.

3. Frequently Asked Questions (FAQs)

Frequently asked questions (FAQs) began as just that—responses to a question asked by one site but with answers shared with all sites. They evolved into a method to communicate weekly with sites without phone calls that could be disruptive to the practice. The FAQs were sent to each participating practice on a specific day of the week, using a consistent format with the study logo at the top of each FAQ. FAQ communications evolved from faxes to E-mails, which allowed the use of color. Site champions were asked to print the FAQs and post them in an area where they could be seen by all staff members. Examples of FAQs are shown in Figures 1 and 2.

Figure 1. Example of frequently asked questions.

Figure 2. Example of frequently asked questions.

FAQ topics varied from updates on study implementation details to reminders of proper techniques for specific study procedures such as signing informed consents or completing enrollment logs. Studies with both intervention and usual care arms often required arm-specific FAQs at least part of the time, for example, when discussing intervention-specific topics. To maintain interest and prevent the FAQs from being seen as nagging, FAQs also were used to recognize and congratulate sites for their efforts or to celebrate national and faith-based holidays, recognizing that sometimes the sites needed a break from our educational endeavors.

Acknowledgment that sites were aware of receiving the FAQs is illustrated in a comment we received following a blizzard and power outage at the central site that prevented sending of the FAQ on the assigned day: “We did not get our FAQ—are you there and OK?” Evidence of reading the FAQ content came when we reported some of the study practices' experiences with tornadoes, floods, earthquakes, and hurricanes. Following reports of these occurrences in the weekly FAQs, other practices responded with offers of help and emergency supplies.

As we became more creative with our FAQ graphics and used more color, sites responded with requests to use less color and simpler pictures to conserve color printer ink, demonstrating that the sites were printing out the FAQs. FAQs prompted calls from practices asking the liaisons to clarify improvements requested on the FAQ, such as making sure the “witness” signs the required lines on the informed consent. These anecdotal findings confirm that the FAQs generated interactions between the central study staff and the practices.

During exit interviews for the 3 studies that have been completed, the practice leaders reported that FAQs were read “almost every week and saved in the notebook” provided by the central site for FAQ collection. The practice leaders further reported that both the “funny” FAQs and the serious ones were useful and kept them thinking about the study and their roles of “identifying and enrolling patients” and “facilitating the intervention.” FAQs that included case reviews generated discussion among the physician leaders across practices. Examples included an asthma case that generated comments from 50% of the 29 physician practice leaders and a postpartum depression case that generated responses from 30% of the physicians and 70% of the nursing practice leaders.

4. Incentives

Many types of incentives can be used in PBRN studies. Using the FAQs to acknowledge sites that meet their individual monthly enrollment goals can be an incentive. Simple contests that query the site leaders about protocol details or even current medical events and reward correct answers with boxes of microwave popcorn or gift cards of nominal value to local restaurants are easy to accomplish. Most incentives are provided to the practice as a whole to reemphasize that it is the whole practice and not just the champions who are participating in the study.

Response rates to the simple contests have been high. In one contest, all the practice leaders plus another 5 practice members answered (a 110% response rate). When enrollment slows, incentives that challenge sites to reach their individual monthly enrollment goals have increased enrollment (in 13 of the 16 instances in which such FAQs were sent). Incentives of recognition and praise—for example, an FAQ naming all the “winning” sites in an enrollment challenge—were also well accepted. The incentives were in addition to the site payments for study participation.

5. Case Reviews

Case reviews evolved from FAQ discussions of specific patient encounters that proved to be of general interest to all practice sites. Case reviews allowed the central site to better understand how the practice incorporated the intervention and gave the sites an opportunity to ask an expert about difficult cases. Case reviews were limited to the intervention sites.

Each case review was completed by conference phone call between the study's principal investigator (PI), the site's liaison, and all the members at the site who were available to attend. The case was presented by the enrolled patient's physician, resident, or nurse. The presenters from the site chose their own presentation formats, although we developed a written outline for anyone requesting it. Most cases required 3 to 5 minutes for the initial presentation, with another 10 to 15 minutes for discussion. The questions that followed the presentation included how the study intervention did or did not affect the patient's care, what might be the next steps in care, and other management issues pertinent to the specific patient, such as immunization status or adherence issues. The participation of the central PI allowed the site practice members to query about anything from how the investigator would have used the intervention tools to further diagnostic testing or alternative therapeutic strategies.

Case reviews were not an instant success: many practices were reluctant to participate in the first case reviews. However, almost every practice had someone brave enough to have their patient care discussed, and participation increased as the practices learned that the case reviews were intended to be learning sessions rather than judgmental sessions.

At the first case review it was often just the study PI and the presenter interacting. But once the interaction was shown to be positive and truly interactive, such as asking the presenter what they thought was good and what could be improved in the care of the patient, others often joined the discussion. The discussions were intended to focus on the practice or systems changes and tools introduced during the study. Asking for additional suggestions often revealed other aspects of patient care, such as clarifying requirements for pneumococcal immunizations in people with asthma or querying about yearly influenza immunization for pregnant and postpartum women.

After the first 1 or 2 sessions, case reviews were well attended, especially in residency programs, and the practice staff asked questions about other complex cases. The PI, central site staff, and practice staff usually learned at least one new thing from these interactions.

One of the best indications that the case reviews were valued was the decreasing resistance to scheduling future case reviews. It was common at the end of the second or third case review to have the site staff recommend the date, time, and patient for the next case. The practices' queries regarding the care of nonenrolled patients also suggest that the practices valued the time and collaborative attitude that these sessions exemplified.

6. Appreciation

In health care settings, expressions of appreciation may be uncommon. Many members of the study practices reported that they heard little praise but bore the brunt of many complaints. We applied several methods for expressing our appreciation for the site's participation in the studies. At the central training session, each site was provided with a framed certificate of appreciation for its participation in the study. These plaques can be displayed in the office waiting room and are often as simple as a framed page designed and printed at the central site. To facilitate acknowledgment of the work of the practice in their local community, we provided each site with tailored press releases to explain the study and the local clinic's role. These were designed for the practice leaders to forward to local newspapers or other media outlets.

The study liaison for each site provided encouragement, support, and praise during each interaction with the site leaders. All interactions with the sites ended with the liaison or investigator thanking the practice leaders or other practice staff for their continued work.

Sites have repeatedly told us they displayed their study participation certificates in their offices or front lobbies. Certificates from previous studies are frequently encountered during site visits. Five sites from the postpartum depression study sent us copies of newspaper articles that resulted from press releases. Accompanying the articles were the following comments: “Our patients really liked seeing us in the newspaper” and “some of our subspecialty colleagues were impressed with what we are doing.” The central site received many E-mails, cards, and notes from the practices, usually addressed to the practice liaison, thanking the central site for the small boxes of cookies or candy and the holiday greetings sent in appreciation of the practices' work.

7. Institutional Review Board Submission Support

Institutional review board (IRB) submission support from the central study team was very important since many PBRN practices in our studies had limited experience with IRB submissions. The idea of submitting the protocol application to the local IRB or sending a yearly report to the IRB of record can be overwhelming for a staff not accustomed to preparing such reports. Learning to complete IRB reports can take time away from other important study activities such as enrolling patients or supporting an intervention. Having one person on the central team assigned to manage the IRB submissions freed the sites of this concern and assured the central team that there would be no lapse in the ability to enroll patients.

A proxy outcome of the effectiveness of the IRB support provided to the practices was the number of IRB refusals, queries, and delays that our studies experienced. In our submissions and yearly reports to >55 IRBs for periods up to 5 years each (>200 interactions), we have had 3 initial protocol approvals delayed by 30 to 60 days, 2 yearly reports that resulted in temporary holds on enrollment, and only one IRB request for additional data on a patient outcome. Considering the large number of IRBs and the requirements for up to 5 years of approval from each of the IRBs, the number of delays and requests for additional information has been minimal, and we believe this was the direct result of the support provided by the central study team during initial and yearly submissions of the reports.

8. Meeting the Academic Needs of Site Personnel

Many family physicians report the need for academic stimulation in their regular practices. For some physicians, such as residents and physicians seeking recertification, the requirements are more specific.

For example, all family medicine residents are required to complete some type of scholarly activity during their training. In 2 of our studies we have been able to provide the support and data for residents to complete research projects for this requirement that are complementary and parallel to the main study.

All physicians are required to complete training for maintenance of board certification through their appropriate medical specialty board. We developed a Maintenance of Certification program, sponsored by the American Academy of Pediatrics, based on one study's intervention, expanding its quality improvement focus. The cost of the program was minimal, and it provided physicians with Maintenance of Certification credit for 3 years. This program has been directly responsible for activating 2 of the pediatricians who were not practice leaders to incorporate the study intervention into their daily practices and to add further modifications that they believe enhance the intervention in their practices.

One of our studies has been able to support the doctoral dissertation of the practice's nonphysician leader. We have previously published other benefits of participation in PBRN studies as reported by the practices.1

Discussion

While each of our individual strategies seems to have had a favorable impact on re-energizing and supporting continued activity in one or more of our studies, we have used the combined strategies in a multicomponent approach. Waning interest is a potential problem for many types of practice activities and has been described most often in quality improvement activities.26–33 Over periods of 18 to 60 months, as required by our studies, current medical practices are likely to be faced with many external requirements for change, including the introduction or update of electronic medical records systems, payment restructuring, clinical systems changes, and staff turnover. Designing and implementing methods to maintain interest and enthusiasm for research requirements and practice changes seemed to prevent some of the “voltage drop” for the practices in the 4 studies discussed here.

The importance of building relationships between central team members and the practices by using a practice facilitator, practice enhancement assistant, or practice enhancement and research coordinator has been reported by others.15,17,18,34,35 While we agree that the ability to send facilitators directly into the practices can be very helpful, in widely dispersed practices, such as those in our national studies, the contact must be continued using distance communication formats such as the telephone, E-mails, and faxes. We also have more recently used video communication with free or low-cost shared services such as Skype.

Our attention to the learning and supportive environment was explicitly begun during our 2002 spirometry study funded by the National Heart, Lung, and Blood Institute, and it might be considered similar to the concept of the formal “learning collaborative” introduced by the Institute for Healthcare Improvement (IHI) in 2003. However, unlike the early design of IHI's learning collaborative, we found it more feasible to have a single face-to-face meeting and to work with each site at a distance after that.36,37 Our system perhaps fits more closely with the Collaborative Networked Learning first described in 1987 by Findley,38 which is designed to occur via electronic dialogue between learners and experts sharing a common purpose—in our case, implementation of practice changes and study-related activities. Our strategies of centralized training, regular contact, and interactive discussion with the site leaders, as well as the FAQs and case reviews, could be considered elements of collaborative learning. We consciously chose not to include some of the elements required by the IHI learning collaborative concept, such as the large time and personnel commitments. The IHI learning collaborative can require several months to work out a solution for an identified problem. Our approach with busy practices used much shorter time frames to determine how best to introduce practice tools and systems that had been previously developed and tested. The focus was on flexibility in implementing the tools and system changes rather than developing those tools and systems.39

Quality improvement initiatives encounter a similar "voltage drop." The initial response to declining participation in quality improvement work was to declare that "what gets measured gets done" and to develop multiple metrics for assessing quality processes.40,41 Similar goals can be accomplished in research settings with audit and feedback.42 While some of our studies, such as the Asthma Tools study, used practice-specific audit and feedback, we also consider the case reviews a form of assessing what is actually being done in the practice, followed by an interactive discussion between the practices and the study staff.

Few other specific strategies can be found in the medical literature.26–33 The RE-AIM (Reach, Efficacy, Adoption, Implementation, and Maintenance) framework clearly denotes the importance of "maintenance," which might be equated with continued participation in our research projects.43 Suggestions for supporting maintenance often include recommendations for continuing to support and empower those doing the quality improvement processes, but without methods for operationalizing those recommendations. Our strategies incorporated support and empowerment while focusing heavily on implementing and sustaining the elements necessary to complete the studies.

The challenge of working with multiple IRBs has been discussed frequently.8 However, little has been published about IRB continuing reviews and the need to identify a central team member to oversee this work. With many IRBs required for each study, the need for an organized system becomes obvious. While some practices assisted with the IRB submissions, all were happy to have central support. Although this may be considered usual procedure for any PBRN study, it removes some of the time demand from the site and allows greater time for the core study activities.

Incentives and mini contests were used to heighten awareness of the need for ongoing patient enrollment into the studies and often provided a simple, fun break from everyday practice routines. Response to the incentives and mini contests was high, and practices often sent a note or picture showing the entire practice enjoying the incentive or prize.1,10,44

To support health professionals' need to regularly complete continuing education requirements,45 we worked to provide continuing medical education credit, through the American Academy of Family Physicians, for attendance at the in-person centralized training sessions. Most recently, we added an opportunity for pediatricians to complete a module required for maintenance of certification, and we assisted family medicine residents in completing their required academic activities during their final 2 years of training.46,47 Supporting continuing medical education and academic activities has been favorably received by residency directors, residents, and practice physicians involved in our PBRN studies, and we believe this has contributed to the physicians' willingness to sustain their involvement with the studies.

Conclusion

PBRN, translational, and implementation research requires unique methods to assure success, especially as studies become more complex and of longer duration. Each practice is a unique research partner whose strengths should be highlighted and enhanced. The practices are often inexperienced partners who require special attention in identifying and overcoming barriers to consistently completing all the requirements of a study over a sustained time period, especially during the mid-study lag that we have labeled a "voltage drop." The strategies we have outlined here were successful in sustaining a high level of practice involvement in our studies, which is essential for high-quality projects producing valid results. We believe that PBRN research will continue to be extremely valuable in transforming medical practice, and the use of our strategies can contribute to the overall success of the PBRN concept.

Notes

  • This article was externally peer reviewed.

  • Funding: none.

  • Conflict of interest: none declared.

  • Received for publication January 14, 2013.
  • Revision received July 19, 2013.
  • Accepted for publication August 12, 2013.

References

  1. Yawn BP, Pace W, Dietrich A, et al. Practice benefit from participating in a practice-based research network study of postpartum depression: a National Research Network (NRN) report. J Am Board Fam Med 2010;23:455–64.
  2. Green LA, Hickner J. A short history of primary care practice-based research networks: from concept to essential research laboratories. J Am Board Fam Med 2006;19:1–10.
  3. Woolf SH. The meaning of translational research and why it matters. JAMA 2008;299:211–3.
  4. Hayes H, Parchman ML, Howard R. A logic model framework for evaluation and planning in a primary care practice-based research network (PBRN). J Am Board Fam Med 2011;24:576–82.
  5. DeVoe JE, Gold R, Spofford M, et al. Developing a network of community health centers with a common electronic health record: description of the Safety Net West Practice-based Research Network (SNW-PBRN). J Am Board Fam Med 2011;24:597–604.
  6. DeVoe JE, Likumahuwa S, Eiff MP, et al. Lessons learned and challenges ahead: report from the OCHIN Safety Net West practice-based research network (PBRN). J Am Board Fam Med 2012;25:560–4.
  7. Mold JW, Lipman PD, Durako SJ. Coordinating centers and multi-practice-based research network (PBRN) research. J Am Board Fam Med 2012;25:577–81.
  8. Yawn BP, Graham DG, Bertram SL, et al. Practice-based research network studies and institutional review boards: two new issues. J Am Board Fam Med 2009;22:453–60.
  9. Yawn BP, Bertram S, Wollan P. Introduction of Asthma APGAR tools improve asthma management in primary care practices. J Asthma Allergy 2008;1:1–10.
  10. Pace WD, Fagnan LJ, West DR. The Agency for Healthcare Research and Quality (AHRQ) practice-based research network (PBRN) relationship: delivering on an opportunity, challenges, and future directions. J Am Board Fam Med 2011;24:489–92.
  11. Williams RL, Rhyne RL. No longer simply a practice-based research network (PBRN): health improvement networks. J Am Board Fam Med 2011;24:485–8.
  12. Gilbert GH, Richman JS, Gordan VV, et al. Lessons learned during the conduct of clinical studies in the dental PBRN. J Dent Educ 2011;75:453–65.
  13. Graham DG, Spano MS, Stewart TV, Staton EW, Meers A, Pace WD. Strategies for planning and launching PBRN research studies: a project of the Academy of Family Physicians National Research Network (AAFP NRN). J Am Board Fam Med 2007;20:220–8.
  14. Fagnan LJ, Handley MA, Rollins N, Mold J. Voices from left of the dial: reflections of practice-based researchers. J Am Board Fam Med 2010;23:442–51.
  15. Nagykaldi Z, Mold JW, Aspy CB. Practice facilitators: a review of the literature. Fam Med 2005;37:581–8.
  16. Nagykaldi Z, Mold JW. The role of health information technology in the translation of research into practice: an Oklahoma Physicians Resource/Research Network (OKPRN) study. J Am Board Fam Med 2007;20:188–95.
  17. Hogg W, Baskerville N, Nykiforuk C, Mallen D. Improved preventive care in family practices with outreach facilitation: understanding success and failure. J Health Serv Res Policy 2002;7:195–201.
  18. Frijling BD, Lobo CM, Hulscher ME, et al. Multifaceted support to improve clinical decision making in diabetes care: a randomized controlled trial in general practice. Diabet Med 2002;19:836–42.
  19. Dietrich AJ, O'Connor GT, Keller A, Carney PA, Levy D, Whaley FS. Cancer: improving early detection and prevention. A community practice randomised trial. BMJ 1992;304:687–91.
  20. Yawn BP, Enright PL, Lemanske RF, et al. Spirometry can be done in family physicians' offices and alters clinical decisions in management of asthma and COPD. Chest 2007;132:1162–8.
  21. Yawn B, Dietrich A, Wollan P, et al. TRIPPD: a practice-based network effectiveness study of postpartum depression screening and management. Ann Fam Med 2012;10:320–9.
  22. Yawn BP, Pace W, Wollan PC, et al. Concordance of Edinburgh Postnatal Depression Scale (EPDS) and Patient Health Questionnaire (PHQ-9) to assess increased risk of depression among postpartum women. J Am Board Fam Med 2009;22:483–91.
  23. Yawn B, Madison S, Bertram S, et al. Automated patient and medication payment method for clinical trials. J Clin Trials 2012;4:1–9.
  24. Yawn B, Dietrich A, Wollan P, et al. TRIPPD: a practice-based network effectiveness study of postpartum depression screening and management. Ann Fam Med 2012;10(4).
  25. Yawn B, Bertram S, Kurland M, et al. Protocol for the Asthma Tools study: a pragmatic practice-based research network trial. Pragmat Observ Res 2013;4:1–12.
  26. Hughes RG. Tools and strategies for quality improvement and patient safety. In: Patient safety and quality: an evidence-based handbook for nurses. Rockville, MD: Agency for Healthcare Research and Quality; 2008. Available from: http://www.ahrq.gov/professionals/clinicians-providers/resources/nursing/resources/nurseshdbk/nurseshdbk.pdf. Accessed November 11, 2013.
  27. Bryant D. Maintaining your quality improvement achievements. May 7, 2009. Available from: http://ezinearticles.com/?Maintaining-Your-Quality-Improvement-Achievements&id=2312493. Accessed November 11, 2013.
  28. Rabinow P, Vilela M. Section 1. Achieving and maintaining quality performance. In: Nagy K, Berkowitz B, Loewenstein M, editors. Maintaining quality performance. Lawrence: Work Group for Community Health and Development, University of Kansas; 2013. Available from: http://ctb.ku.edu/en/tablecontents/sub_section_main_1387.aspx. Accessed November 11, 2013.
  29. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academies Press; 2001:164–80.
  30. Wallin L, Boström AM, Wikblad K, et al. Sustainability in changing clinical practice promotes evidence-based nursing care. J Adv Nurs 2003;41:509–18.
  31. Shojania K, McDonald K, Wachter R, et al. Closing the quality gap: a critical analysis of quality improvement strategies. Volume 1: series overview and methodology. Technical review no. 9. AHRQ publication no. 04-0051-1. Rockville, MD: Agency for Healthcare Research and Quality; 2004.
  32. Horbar JD, Plsek PE, Leahy K; NIC/Q 2000. NIC/Q 2000: establishing habits for improvement in neonatal intensive care units. Pediatrics 2003;111(4 Pt 2):e397–410.
  33. Kim GR, Chen AR, Arceci RJ, et al. Error reduction in pediatric chemotherapy: computerized order entry and failure modes and effects analysis. Arch Pediatr Adolesc Med 2006;160:495–8.
  34. Cook R. Primary care. Facilitators: looking forward. Health Visit 1994;67:434–5.
  35. Nagykaldi Z, Mold JW, Robinson A, Neibauer, Ford A. Practice facilitators and practice-based research networks. J Am Board Fam Med 2006;19:506–10.
  36. Collaboratives. Cambridge, MA: Institute for Healthcare Improvement. Available from: http://www.ihi.org/offerings/MembershipsNetworks/collaboratives/Pages/default.aspx. Accessed November 11, 2013.
  37. Wikipedia. Collaborative learning. Available from: http://en.wikipedia.org/w/index.php?title=Collaborative_learning&printable=yes. Accessed November 11, 2013.
  38. Quik WH, Wright N. Information sharing to transformation: antecedents of collaborative networked learning. World Acad Sci Engin Technol 2012;68:1477–86.
  39. Roland M, Torgerson DJ. Understanding controlled trials: what are pragmatic trials? BMJ 1998;316:285.
  40. Williamson RM. Available from: http://www.swspitcrew.com/articles/What%20Gets%20Measured%201106.pdf. Accessed December 2, 2013.
  41. Six Sigma. Community. Available from: http://www.isixsigma.com/community. Accessed July 10, 2013.
  42. Ivers N, Jamtvedt G, Flottorp S, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev 2012;(6):CD000259.
  43. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health 1999;89:1322–7.
  44. Gibson K, Szilagyi P, Swanger CM, et al. Physician perspectives on incentives to participate in practice-based research: a Greater Rochester practice-based research network (GR-PBRN) study. J Am Board Fam Med 2010;23:452–4.
  45. ABFM maintenance of certification for family medicine. Lexington, KY: American Board of Family Medicine. Available from: https://www.theabfm.org/moc/index.aspx. Accessed July 10, 2013.
  46. Maintenance of certification (MOC) four-part structure. Chapel Hill, NC: American Board of Pediatrics. Available from: https://www.abp.org/abpwebsite/moc/aboutmoc/maintenanceofcertification(moc)four-partstructure.htm. Accessed July 10, 2013.
  47. Accreditation Council for Graduate Medical Education. Scholarly activity guidelines. Review Committee for Family Medicine. Available from: https://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramResources/120_Family_Medicine_Scholarship_Guidelines.pdf. Accessed July 10, 2013.

Keywords

  • Methods
  • Practice-based Research Network
  • Program Effectiveness
  • Randomized Controlled Clinical Trials
  • Training
