Original Research

Results of a Mixed-Methods Evaluation of Partnerships for Health: A Quality Improvement Initiative for Diabetes Care

Stewart Harris, Jann Paquette-Warren, Sharon Roberts, Meghan Fournie, Amardeep Thind, Bridget L. Ryan, Cathy Thorpe, Amanda L. Terry, Judith Belle Brown, Moira Stewart and Susan Webster-Bogaert
The Journal of the American Board of Family Medicine November 2013, 26 (6) 711-719; DOI: https://doi.org/10.3122/jabfm.2013.06.120211
Stewart Harris, MD, MPH, FCFP, FACPM; Jann Paquette-Warren, MSc; Sharon Roberts, PhD; Meghan Fournie, BHSc; Amardeep Thind, PhD; Bridget L. Ryan, PhD; Cathy Thorpe, MA; Amanda L. Terry, PhD; Judith Belle Brown, PhD; Moira Stewart, PhD; Susan Webster-Bogaert, MA

From the Centre for Studies in Family Medicine, Schulich School of Medicine & Dentistry, The University of Western Ontario, London, Ontario, Canada (SH, JP-W, MF, AT, BLR, CT, ALT, JBB, MS, SW-B); and Renison University College, The University of Waterloo, Waterloo, Ontario, Canada (SR).

Abstract

Purpose: Quality improvement (QI) initiatives have been implemented to facilitate transition to a chronic disease management approach in primary health care. However, the effect of QI initiatives on diabetes clinical processes and outcomes remains unclear. This article reports the effect of Partnerships for Health, a QI program implemented in Southwestern Ontario, Canada, on diabetes clinical process and outcome measures and describes program participants' views of elements that influenced their ability to reach desired improvements.

Methods: As part of an external, concurrent, comprehensive, mixed-methods evaluation of Partnerships for Health, a before/after audit of 30 patient charts per program physician (n = 35) and semistructured interviews with program participants (physicians and allied health providers) were conducted.

Results: The proportion of patients (n = 998) with a documented test/examination for the following clinical processes significantly improved (P ≤ .005): glycosylated hemoglobin (A1c), cholesterol, albumin-to-creatinine ratio, serum creatinine, glomerular filtration rate, electrocardiogram, foot/eye/neuropathy examination, body mass index, waist circumference, and depression screening. Data showed intensification of treatment and significant improvement in the number of patients at target for low-density lipoprotein (LDL) and blood pressure (BP) (P ≤ .001). Mean LDL and BP values decreased significantly (P ≤ .01), and an analysis of patients above glycemic targets (A1c >7% at baseline) showed a significant decrease in mean A1c values (P ≤ .01). Interview participants (n = 55) described using a team approach, improved collaborative and proactive care through better tracking of patient data, and increased patient involvement as elements that positively influenced clinical processes and outcomes.

Conclusions: QI initiatives like Partnerships for Health can result in improved diabetes clinical process and outcome measures in primary health care.

  • Diabetes Mellitus
  • Health Care Team
  • Primary Health Care
  • Quality Improvement

The prevalence of type 2 diabetes and related health care costs are increasing worldwide and threatening the ability of countries to markedly improve the health of their populations.1,2 To help address this problem, primary health care reform strategies have been developed to transition from the traditional acute care approach to a chronic disease management approach.3–5 It is hypothesized that a chronic disease management approach will enhance clinical processes, improve clinical outcomes, lead to a healthier population, and decrease the burden of chronic illnesses such as diabetes.6 In Canada, 3 strategies have received considerable attention: (1) the development of new funding models to support a team-based approach7; (2) the adoption of electronic medical records (EMRs) to improve documentation, surveillance, and provider collaboration8,9; and (3) the promotion of clinical practice guidelines (CPGs) to encourage early screening and optimal treatment.10,11 Unfortunately, implementation of these strategies has faced many challenges, highlighting the need for more effective and innovative programs to support primary health care providers in improving their approach to diabetes care.7–9,12

Despite the shortage of evidence to support the widespread use of quality improvement (QI) programs in health care, governments and other agencies around the world have launched QI initiatives to try to accelerate health care transformation.13,14 Thus far, evaluations of QI initiatives targeting diabetes have relied primarily on anecdotal and self-reported data, yielding varied results.6,13,15,16 The challenges in evaluating complex programs like QI initiatives have been well documented, and the need for more rigorous study designs has been identified.17–21 Because randomized controlled trials are often not possible or are inadequate for evaluating programs implemented in naturalistic environments, nonexperimental designs must be strengthened by using external evaluation teams, collecting data concurrently with program implementation, and clearly delineating the scope of the program.22–29 An effective way to incorporate these key elements into a single research study is to use a comprehensive mixed-methods evaluation design.23,30

The purpose of this study was to use an external, concurrent, mixed-methods, multimeasure evaluation design to: (1) determine the effect of a QI program (Partnerships for Health) on clinical process and outcome measures for diabetes; (2) assess how the level of program involvement affected the results; and (3) obtain the views of program participants regarding the elements that influenced improvement in diabetes clinical processes and outcomes.

Background

In Southwestern Ontario, Canada, a government-funded program (Partnerships for Health) that applied the concepts of the Chronic Disease Prevention and Management Framework31 and QI methodologies32,33 was implemented between 2008 and 2011 to improve diabetes care in the region.31 Similar to the Chronic Care Model34 and the Expanded Chronic Care Model,35 the concepts of the Chronic Disease Prevention and Management Framework outline the need to enhance evidence-based, planned, and integrated collaborative chronic care in primary health care settings by promoting a population-based approach and emphasizing interactions between patients and practice teams.31 In terms of QI methodologies, the program embraced the Institute for Healthcare Improvement's Breakthrough Series approach32 of bringing multiple teams together 2 or 3 times (ie, learning sessions) over 6 to 15 months to learn from experts, and each other, as they plan and test practice changes.32,36 Furthermore, this approach encourages team members who attend learning sessions, usually 2 or 3 people, to work with additional team members in their organization to test and implement changes between the learning sessions (ie, action periods). Lastly, the program embedded the Model for Improvement as a strategy to test and evaluate changes on a few patients before implementing them at a practice or organizational level.33

The Partnerships for Health program targeted primary health care teams (family physicians, as well as practice- and community-based allied providers and administrative staff) and featured educational activities (series of offsite learning sessions); supportive activities (teleconferences, onsite practice coaching, web-based tools, and onsite information technology support); and reporting activities (QI efforts and clinical data). The program emphasized and facilitated (1) the establishment of a team-based approach, (2) enhanced use of information technology systems to better adhere to Canadian Diabetes CPGs10 and to participate in population-based tracking/surveillance, and (3) the promotion of patient self-management. Participation was voluntary. Program implementers have provided an overview of the program at http://www.questforquality.ca/content.asp?id=112.

Methods

Study Design

As part of an external, concurrent, comprehensive, mixed-methods, multimeasure evaluation, an audit of 30 patient charts per program physician (n = 35) was conducted before and 12 months after the program (accounting for clustering at the physician level) to measure the impact of the overall program and of the level of program involvement on diabetes clinical processes and outcomes. Semistructured individual interviews with program participants, including physicians and allied health providers, were conducted 12 months after the program to obtain their views of the elements that influenced improvement in diabetes clinical processes and outcomes. The study was approved by The University of Western Ontario Research Ethics Board.

Chart Audit

Measures

The proportion of patients with a documented albumin-to-creatinine ratio (ACR) test was the primary outcome measure because of the short evaluation timeline and because QI experts suggest that changes in clinical processes occur more promptly during QI. ACR was selected because previous studies have shown that it is tested in fewer than 50% of patients.37,38 Secondary clinical process measures were selected in accordance with recommendations from nationally published diabetes CPGs10 and were captured as binary variables. A test/examination was considered complete when it was documented at least once in a patient's chart during the period before or after the program. Treatment intensification was defined as follows: (1) glycemic: adding an oral agent or insulin and/or increasing the dose of an oral agent or the total dose of insulin; (2) hypertension: adding an antihypertensive agent and/or increasing its dose; and (3) cholesterol: adding a statin, increasing its dose, and/or switching to a more potent agent. Clinical outcome measures recommended by CPGs10 were recorded as continuous values. If more than one value was documented, the last recorded value in each period (before and after the program) was used. If no value was documented after the program, the baseline value was not carried forward and that patient's data were excluded from analyses. A full list of the measures is provided in Tables 1, 2, and 3.
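To make these coding rules concrete, the fragment below sketches one way the binary and continuous measures could be derived from visit-level chart data in Stata, the package used for the multilevel analyses. It is only an illustration: the variable names (patient_id, period, visit_date, acr, a1c) are assumed rather than taken from the study database, and each measure would be derived in its own pass over the data.

    * Illustrative sketch only; variable names are assumed, not from the study.
    * period: 0 = 12 months before program start, 1 = 12 months after.

    * Binary process measure: ACR documented at least once in the period.
    bysort patient_id period: egen byte acr_done = max(!missing(acr))

    * Continuous outcome (eg, A1c): last documented value in each period,
    * with the baseline value not carried forward.
    drop if missing(a1c)
    bysort patient_id period (visit_date): keep if _n == _N
    keep patient_id period a1c
    reshape wide a1c, i(patient_id) j(period)
    drop if missing(a1c1)   // no value documented after the program: excluded from analyses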

Table 1. Changes in Clinical Process Measures from Before to After the Program (n = 998)
Table 2. Percentage of Patients with Treatment Intensification After the Program
Table 3. Change in Clinical Measures from Before to After the Program for the Entire Patient Sample and Patients Above Clinical Practice Guideline (CPG) Targets*

Data Collection

Physicians who consented to the evaluation were asked to generate a patient list using the International Classification of Diseases, Ninth Revision, diagnostic code 250 and to randomly assign study numbers to patients using a random list generated in Microsoft Excel (Microsoft Corp., Redmond, WA) by the evaluation team. An external auditor reviewed charts at each practice until 30 eligible patients' charts had been audited. The audit before the program covered the 12 months before the physician's start date in the program, and the audit after the program covered the 12 months following that start date.
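The random ordering itself is simple to reproduce. The evaluation team generated its random list in Microsoft Excel; the Stata fragment below is only an illustrative equivalent, with assumed variable names, showing how study numbers can be assigned in random order to the diagnostic code 250 patient list.

    * Illustrative equivalent of the Excel-generated random list (assumed variable names).
    set seed 20120817          // any fixed seed makes the ordering reproducible
    gen double u = runiform()  // one uniform random number per listed patient
    sort u
    gen study_id = _n          // study numbers assigned in random order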

Sample and Eligibility

For patient charts to be eligible for audit, (1) the patient's physician had to be part of a practice site that participated in the program; (2) the patient's physician had to provide written, informed consent for an external auditor to review randomly selected patient charts; (3) the patient had to have a diagnosis of type 2 diabetes; (4) the patient had to be ≥18 years of age; and (5) there had to be at least one documented visit during the periods before and after the program. Charts of patients with type 1 diabetes, gestational diabetes, or prediabetes were excluded.

The evaluation team was not involved in the recruitment of program participants; therefore, sample size estimates were based on (1) the number of program physicians; (2) a 25% change in the proportion of patients with a documented ACR test (primary outcome; α = 0.05, β = 0.10); (3) an intraclass correlation coefficient of 0.21; and (4) 21 physicians each contributing 45 charts. Because the physician consent rate was higher than anticipated, the number of charts per physician was reduced to 30.
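For readers unfamiliar with cluster-adjusted sample size calculations, the intraclass correlation typically enters through the design effect, which inflates the number of charts needed relative to a simple random sample. The figures below merely illustrate that standard adjustment using the parameters listed above; they are not a reconstruction of the authors' exact calculation.

    \mathrm{DEFF} = 1 + (m - 1)\rho = 1 + (45 - 1)(0.21) \approx 10.2
    n_{\mathrm{effective}} \approx \frac{21 \times 45}{\mathrm{DEFF}} \approx \frac{945}{10.2} \approx 92

Here m is the number of charts per physician and ρ the intraclass correlation coefficient; roughly 945 clustered charts therefore carry about as much statistical information as 92 independent charts per period.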

Data Analysis

Descriptive statistics were computed using SPSS software version 17 (IBM, Chicago, IL). Multilevel regression analyses were conducted using Stata software (StataCorp, College Station, TX). Multilevel mixed-effects logistic regression (xtmelogit) was used for the dichotomous measures, and multilevel mixed-effects linear regression (xtmixed) was used for the continuous measures. Both types of analyses controlled for within-subject data and clustering at the physician level. Additional analyses were completed on a subsample of patients identified as above CPG targets: hemoglobin A1c (A1c) >7%, blood pressure (BP) >130/80 mmHg, or low-density lipoprotein (LDL) >2.0 mmol/L. A stratified analysis of after-program scores was done to compare measures for patients of physicians with different levels of program involvement. Group A physicians were involved in educational activities (at least one learning session), supportive activities, and reporting activities, whereas group B physicians were involved in or affected by local practice change only (they practiced at the same sites as group A physicians but did not attend any educational activities). P < .05 was considered statistically significant.
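As a rough illustration of how the models named above are specified, the Stata fragment below fits one dichotomous and one continuous measure with random intercepts for physician and patient, which is one plausible way to account for clustering at the physician level and the repeated before/after measurements. The variable names are assumed, and the article does not state the exact covariates included in the authors' models.

    * Illustrative specifications only; variable names are assumed.
    * period = 0 (before program) or 1 (after program).

    * Dichotomous process measure (eg, ACR documented), reported as an odds ratio
    xtmelogit acr_done period || physician_id: || patient_id:, or

    * Continuous outcome measure (eg, A1c)
    xtmixed a1c period || physician_id: || patient_id: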

Interviews

Data Collection

To obtain the views of participants regarding the elements that influenced improved diabetes processes and outcomes, individual interviews were conducted approximately 12 months after the program by 1 of 2 trained interviewers. A semistructured interview guide was used. All individual interviews were tape-recorded and transcribed verbatim. All transcripts were reviewed for accuracy before starting the analysis process.

Sample

A purposive sampling approach was used, and maximum variation was sought according to health care providers' professional role, team/practice, and location (urban/rural). To be eligible for an interview, program participants had to have attended at least one learning session (group A physicians, allied providers, or administrative staff).

Data Analysis

Data analyses were conducted concurrently with data collection. J. Paquette-Warren, M. Tyler, and R. Caruso independently reviewed the transcripts and field notes to identify key concepts and themes. Coding templates were created as themes emerged from the data. The coding templates were used in a second round of analysis to verify that no key concepts or themes were missed. During the final 3 rounds of the iterative process with independent and team analyses, quotations were pulled from the data and cleaned. Crystallization and immersion39 were used to identify overarching themes (NVivo 8; QSR International, Doncaster, Victoria, Australia). Trustworthiness and credibility were maximized by verbatim transcription, field notes, and independent and team analyses.40

Results

Chart Audits

Thirty-five physicians consented to the chart audit (79% of program physicians), and 998 randomly selected patient charts were audited. Physician and patient demographics are provided in Table 4. Physician characteristics were comparable to national averages with the exception of a larger proportion of physicians practicing in new funding models and using EMRs.7,41

Table 4. Chart Audit Physician and Patient Demographics

Clinical Processes

Clinical process measures showed statistically significant improvements in testing and documentation for all measures (ACR [primary outcome]; A1c [annual and quarterly]; cholesterol; serum creatinine; glomerular filtration rate; electrocardiogram; foot, eye, and neuropathy exams; body mass index; waist circumference; and depression screening) except blood pressure (Table 1). Odds ratios were higher for group A physicians for all significantly different clinical processes except glomerular filtration rate and waist circumference; however, none of these differences reached statistical significance. Intensification of glycemic, hypertension, and/or cholesterol treatment occurred in the entire sample, and more intensification was evident in the subsample of patients who exceeded CPG targets (Table 2).

Clinical Outcomes

There were significant improvements in clinical outcome measures, including the percentage of patients at target for LDL and BP, mean LDL cholesterol, mean systolic BP, and mean diastolic BP (Table 3). There was no significant change in the proportion of patients at target A1c, and although there was a small significant increase in mean A1c for the entire sample, the increase was not clinically relevant. Overall, mean A1c, systolic BP, diastolic BP, and LDL cholesterol significantly improved in the subsample of patients above the CPG A1c target (Table 3).

Qualitative Interviews

Fifty-five interviews were conducted with physicians (n = 7), allied providers (n = 38), and administrative staff (n = 10) to capture their views of the elements that influenced diabetes clinical processes and outcomes.

Elements Influencing Clinical Processes

Participants described how the program gave them the knowledge, skills, and confidence to change how they deliver care: “When the practice leaders are confident and experienced with the [Chronic Disease Prevention and Management] Framework, it gives you the confidence to change your practice.” The elements identified as contributing to improved clinical processes were: (1) using a team approach and better care coordination; (2) establishing a better tracking mechanism for more proactive care and adherence to CPGs; and (3) being more patient-centered: “In terms of improving my practice as a whole, I really feel that it's taken a whole team … to start tracking more accurately.” “We have now become more patient-centric … depending on what the patients are wishing to do and how we can help them.” Participants explained how increasing patient involvement and supporting self-management seemed to improve patient knowledge, adherence to treatment, and skills in self-management: “I think that if you talked to patients they would feel more in charge of their own health and what they're doing. I think that's huge.” They expressed a strong opinion that these changes had positively affected clinical processes and outcomes and would lead to further improvements over time: “Patients are being seen on a regular basis…. They get the screening tests done, medications are being added or adjusted. I think it also points out [to patients] that it's really important for them to manage their diabetes. They're getting probably more referrals to other health care providers to help manage their diabetes. So ultimately it should improve their care.”

Elements Influencing Clinical Outcomes

The elements that influenced achieving better clinical outcomes were identified as: (1) having accurate data to inform clinical decisions; (2) health status and room for improvement; and (3) the progressive nature of diabetes. “Some target numbers will not change just due to the progressive nature of the disease…. If we did absolutely everything right for a certain number of people we would still lose control over time.” Other elements included: (1) provider and patient comfort with treatment intensification; (2) comorbidities, side effects, and treatment interactions; and (3) personal goals and environmental factors. “Some patients have a beautiful A1c, but … they are low 3 or 4 times a week…. We change medication and they get rid of their lows, but their A1c is now 7.3 instead of 6.8…. We want to see it under 7, but at what cost to the patient?”

Beyond clinical outcomes were themes about improvement in patients' general health, mental health, and quality of life: “I think patient satisfaction is one thing. Patient involvement and self-management are components that are improving the overall health of the patient. Even if we don't see it with their diabetes numbers, patients are going to see advantages in their quality of life.”

Discussion

Our study used a mixed-methods approach (external chart audit and individual interviews) to assess the effect of a QI program on diabetes clinical processes and outcomes and to capture program participants' views of the elements that influenced improvements in those processes and outcomes. At 12 months after the program, chart audit results showed significantly improved diabetes clinical processes and outcomes for the monitoring and management of glycemic control and related diabetes complication risk factors (cholesterol and hypertension). This reflected program participants' views that the program gave them the knowledge, skills, and confidence to change practice, and that a team approach with coordinated care, better patient tracking and proactive care to meet CPG recommendations, and a more patient-centered approach influenced clinical processes and outcomes. Interview data revealed a lack of accurate patient data and a lack of comfort with treatment intensification as perceived barriers to improving patient outcomes, but chart audit results showed evidence of intensification of treatment for hyperglycemia, hypertension, and dyslipidemia. It may be that the perceived increase in patient involvement and active self-management helped to outweigh or overcome some of the identified barriers.

Chart audit results for the entire sample showed a small (7.2% to 7.3%) but statistically significant rise in A1c. This corresponds with program participants' concerns that the progressive nature of the disease influences outcomes. However, chart audit data revealed a well-controlled patient population at baseline, with a mean overall A1c of 7.2%, providing little opportunity for physicians to initiate clinically relevant improvements.10 Likewise, the interview results highlighted how health status and room for improvement influenced decisions to intensify treatment. Additional analysis supported the concept of clinical inertia and the qualitative themes by showing a significant reduction in A1c (8.2% to 8.0%; P < .01) for patients above CPG targets at baseline. Other barriers to improving clinical outcomes were identified as comorbidities, treatment interactions, and side effects; personal goals; and environmental factors.

This article has important implications for practice-based QI initiatives like Partnerships for Health that aim to improve the quality of care by shifting from an acute care approach to a focused chronic disease management approach. Evaluation of QI initiatives is critical to inform policy makers and to influence how funding and support are provided to primary health care providers. Previous literature has yielded unclear results regarding the effectiveness of QI initiatives at improving diabetes care, showing positive change in clinical process measures and intensification of medication but no change in clinical outcomes.36,42–44 Recent chart audit studies showed inconsistent improvement in A1c, BP, and LDL, depending on the length of the period after the initiative.16,45–47 This study had a relatively short period after the program (1 year), yet significant improvements were found. Future research could employ a longer period to examine the impact over time and to determine the spread and sustainability of the results reported herein. Such research could provide evidence to determine whether program participants' views related to the continued improvement of clinical outcomes resulting from enhanced clinical processes (eg, team approach, collaborative and proactive care) and other program benefits (eg, increased patient knowledge and involvement, better general health status, mental health, and quality of life) have merit.

Limitations

Participation in the program and in the evaluation was voluntary, and demographics showed that the proportions of consenting physicians with EMRs and practicing in a new funding model were higher than the national averages.7,41 This introduces a potential participant bias (early adopters of health care reform strategies), affecting the results and limiting generalizability. The evaluation design did not include a control group, negating the opportunity to account for influential factors beyond the program. Groups with different levels of involvement in the program were compared, but, again, program involvement was voluntary, not randomly allocated, and although odds ratios showed increasing trends with a higher level of involvement, no statistically significant differences were found. Including all program participants in the evaluation irrespective of their level of involvement may have affected estimations of impact. Given the positive findings and the lack of difference between groups, it seems that the program's approach of targeting select members of a practice site to participate in educational activities and encouraging them to work with colleagues at the practice site to redesign care is adequate to yield positive change in clinical processes and outcomes. Future studies are needed to determine the ideal level of involvement of multiple team members at a practice site and should include a comparison with control practices. Lastly, the length of the period after the program may have affected the estimated impact of the program; a longer follow-up period could have minimized the effect of this limitation.

Conclusions

QI initiatives like Partnerships for Health can lead to improved diabetes clinical processes and outcomes, and these improvements can be detected as early as 1 year after the program. Additional studies are needed to assess the sustainability and spread of the improvements found and to identify the program features that support the elements perceived by program participants as critical to achieving positive results. This information will contribute to replicating and enhancing the results reported here in future QI initiatives in primary health care.

Acknowledgments

The authors acknowledge the cooperation, support, and dedication of the Partnerships for Health team, steering committee, and participants. The authors also acknowledge the editorial assistance of Jordan Tompkins and the research assistance of Marie Tyler and Rosie Caruso at the Centre for Studies in Family Medicine, The University of Western Ontario, London, Ontario, Canada.

Notes

  • This article was externally peer reviewed.

  • Funding: Support for the Partnerships for Health Project was provided by the Government of Ontario. The initiative was sponsored by the South West Local Health Integration Network, with the South West Community Care Access Centre serving as the transfer payment agency, providing oversight of the project agreement and funding arrangement with the Ontario Government Ministry of Finance. SH holds the Canadian Diabetes Association Chair in Diabetes Management and the Dr. Ian McWhinney Chair of Family Medicine Studies. AT holds the Canada Research Chair in Health Services Research. MS is funded by the Dr. Brian W. Gilbert Canada Research Chair in Primary Health Care Research.

  • Conflict of interest: none declared.

  • Disclaimer: The views and opinions expressed herein do not necessarily represent the official policies of the Government of Ontario.

  • Received for publication August 17, 2012.
  • Revision received May 10, 2013.
  • Accepted for publication May 28, 2013.

References

1. International Diabetes Federation. IDF Diabetes Atlas, 4th ed. 2011. Available from: http://www.idf.org/sites/default/files/Economic%20impact%20of%20Diabetes_0.pdf. Accessed July 28, 2011.
2. Danaei G, Finucane MM, Lu Y, et al. National, regional, and global trends in fasting plasma glucose and diabetes prevalence since 1980: systematic analysis of health examination surveys and epidemiological studies with 370 country-years and 2.7 million participants. Lancet 2011;378:31–40.
3. Nasmith L, Ballem P, Baxter R, et al. Transforming care for Canadians with chronic health conditions: put people first, expect the best, manage for results. 2010. Available from: http://www.cahs-acss.ca/wp-content/uploads/2011/09/cdm-final-English.pdf. Accessed July 29, 2011.
4. Friedberg MW, Hussey PS, Schneider EC. Primary care: a critical review of the evidence on quality and costs of health care. Health Aff (Millwood) 2010;29:766–72.
5. World Health Organization. Preventing chronic diseases: a vital investment. 2005. Available from: http://www.who.int/chp/chronic_disease_report/full_report.pdf. Accessed May 2, 2011.
6. Coleman K, Mattke S, Perrault PJ, Wagner EH. Untangling practice redesign from disease management: how do we best care for the chronically ill? Annu Rev Public Health 2009;30:385–408.
7. Hutchison B, Levesque JF, Strumpf E, Coyle N. Primary health care in Canada: systems in motion. Milbank Q 2011;89:256–88.
8. Ludwick DA, Doucette J. Adopting electronic medical records in primary care: lessons learned from health information systems implementation experience in seven countries. Int J Med Inf 2009;78:22–31.
9. Terry AL, Thorpe CF, Giles G, Brown JB, Harris SB, Reid GJ. Implementing electronic health records: key factors in primary care. Can Fam Physician 2008;54:730–6.
10. Canadian Diabetes Association Clinical Practice Guidelines Expert Committee. Canadian Diabetes Association 2008 clinical practice guidelines for the prevention and management of diabetes in Canada. Can J Diabetes 2008;32(Suppl 1):S1–201.
11. Li R, Zhang P, Lawrence EB, Chowdhury FM, Zhang X. Cost-effectiveness of interventions to prevent and control diabetes mellitus: a systematic review. Diabetes Care 2010;33:1872–94.
12. Harris SB, Kapor J, Lank CN, Willan AR, Houston T. Clinical inertia in patients with T2DM requiring insulin in family practice. Can Fam Physician 2010;56:e418–24.
13. Schouten LM, Hulscher ME, van Everdingen JJ, Huijsman R, Grol RP. Evidence for the impact of quality improvement collaboratives: systematic review. Br Med J 2008;336:1491–4.
14. Jones K, Piterman L. The effectiveness of the breakthrough series methodology. Aust J Prim Health 2008;14:59–65.
15. Shojania KG, Ranji SR, McDonald KM, et al. Effects of quality improvement strategies for type 2 diabetes on glycemic control: a meta-regression analysis. JAMA 2006;296:427–40.
16. Chin MH, Drum ML, Guillen M, et al. Improving and sustaining diabetes care in community health centers with the health disparities collaboratives. Med Care 2007;45:1135–43.
17. Craig P, Petticrew M. Developing and evaluating complex interventions: reflections on the 2008 MRC guidance. Int J Nurs Stud 2013;50:585–7.
18. Dubois N, Lloyd S, Houle J, Mercier C, Brousselle A, Rey L. Discussion: practice-based evaluation as a response to address intervention complexity. Can J Program Eval 2011;26:105–13.
19. Zimmerman BJ, Dubois N, Houle J, et al. How does complexity impact evaluation? Can J Program Eval 2011;26:v–xx.
20. Campbell M, Fitzpatrick R, Haines A, et al. Framework for design and evaluation of complex interventions to improve health. Br Med J 2000;321:694–6.
21. Mittman BS. Creating the evidence base for quality improvement collaboratives. Ann Intern Med 2004;140:897–901.
22. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. Int J Nurs Stud 2013;50:587–92.
23. Song MK, Sandelowski M, Happ MB. Current practices and emerging trends in conducting mixed methods intervention studies in health sciences. In: Tashakkori A, Teddlie C, editors. Mixed methods in social and behavioral research. 2nd ed. Thousand Oaks, CA: Sage Publications; 2010:725–47.
24. Tashakkori A, Teddlie C, editors. Mixed methods in social and behavioral research. 2nd ed. Thousand Oaks, CA: Sage Publications; 2010.
25. Bamberger M, Rao V, Woolcock M. Using mixed methods in monitoring and evaluation. In: Tashakkori A, Teddlie C, editors. Mixed methods in social and behavioral research. 2nd ed. Thousand Oaks, CA: Sage Publications; 2010:613–41.
26. Chen H. Practical program evaluation: assessing and improving planning, implementation, and effectiveness. Thousand Oaks, CA: Sage Publications; 2005.
27. Petrosino A. Answering the why question in evaluation: the causal-model approach. Can J Program Eval 2000;15:1–24.
28. Rush B, Ogborne A. Program logic models: expanding their role and structure for program planning and evaluation. Can J Program Eval 1991;6:93–106.
29. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 2nd ed. Los Angeles: Sage Publications; 2011.
30. Crabtree BF, Chase SM, Wise CG, et al. Evaluation of patient centered medical home practice transformation initiatives. Med Care 2011;49:10–6.
31. Ministry of Health and Long-Term Care. Preventing and managing chronic disease: Ontario's framework. May 2007. Available from: http://www.health.gov.on.ca/en/pro/programs/cdpm/pdf/framework_full.pdf. Accessed July 29, 2011.
32. The Breakthrough Series: IHI's collaborative model for achieving breakthrough improvement. IHI Innovation Series white paper. Boston: Institute for Healthcare Improvement; 2003. Available from: www.IHI.org. Accessed January 27, 2011.
33. Langley GJ. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed. San Francisco, CA: Jossey-Bass; 2009.
34. Wagner EH, Glasgow RE, Davis C, et al. Quality improvement in chronic illness care: a collaborative approach. Jt Comm J Qual Improv 2001;27:63–80.
35. Barr VJ, Robinson S, Marin-Link B, et al. The expanded chronic care model: an integration of concepts and strategies from population health promotion and the chronic care model. Hosp Q 2003;7:73–82.
36. Bricker PL, Baron RJ, Scheirer JJ, et al. Collaboration in Pennsylvania: rapidly spreading improved chronic care for patients to practices. J Contin Educ Health Prof 2010;30:114–25.
37. Harris SB, Worrall G, Macaulay A, et al. Diabetes management in Canada: baseline results of the group practice diabetes management study. Can J Diabetes 2006;30:131–7.
38. Harris SB, Leiter LA, Webster-Bogaert S, Van DM, O'Neill C. Teleconferenced educational detailing: diabetes education for primary care physicians. J Contin Educ Health Prof 2005;25:87–97.
39. Borkan J. Immersion/crystallization. In: Crabtree BF, Miller WL, editors. Doing qualitative research. 2nd ed. Thousand Oaks, CA: Sage Publications; 1999:179–94.
40. Crabtree BF, Miller WL. Doing qualitative research. 2nd ed. Thousand Oaks, CA: Sage Publications; 1999.
41. National Physician Survey, College of Family Physicians of Canada, Canadian Medical Association, Royal College of Physicians and Surgeons of Canada. National Physician Survey: 2010 survey results. Available from: http://nationalphysiciansurvey.ca/surveys/2010-survey/2010-results/. Accessed January 9, 2012.
42. Daniel DM, Norman J, Davis C, et al. A state-level application of the chronic illness breakthrough series: results from two collaboratives on diabetes in Washington state. Jt Comm J Qual Saf 2004;30:69–79.
43. Benedetti R, Flock B, Pedersen S, Ahern M. Improved clinical outcomes for fee-for-service physician practices participating in a diabetes care collaborative. Jt Comm J Qual Saf 2004;30:187–94.
44. Wang A, Wolf M, Carlyle R, Wilkerson J, Porterfield D, Reaves J. The North Carolina experience with the diabetes health disparities collaboratives. Jt Comm J Qual Saf 2004;30:396–404.
45. Goderis G, Borgermans L, Grol R, et al. Start improving the quality of care for people with type 2 diabetes through a general practice support program: cluster randomized trial. Diabetes Res Clin Pract 2010;88:56–64.
46. Landon BE, Hicks LS, O'Malley AJ, et al. Improving the management of chronic disease at community health centers. N Engl J Med 2007;356:921–34.
47. Chin MH, Cook S, Drum ML, et al. Improving diabetes care in midwest community health centers with the health disparities collaborative. Diabetes Care 2004;27:2–8.