
Analysis: Measuring quality through performance

Making performance indicators work: experiences of US Veterans Health Administration

BMJ 2007; 335 doi: https://doi.org/10.1136/bmj.39358.498889.94 (Published 08 November 2007) Cite this as: BMJ 2007;335:971
Eve A Kerr, associate director,1 Barbara Fleming, chief officer, quality and performance2

  1 Center for Practice Management Research, Veterans Affairs Ann Arbor Healthcare System, PO Box 130170, Ann Arbor, MI 48113-0170, USA
  2 Veterans Health Administration, Washington, DC 20420, USA

  Correspondence to: E A Kerr ekerr@umich.edu

    Eve Kerr and Barbara Fleming explain how measuring performance helped transform a failing healthcare system

    Many healthcare organisations are having to confront the challenge of how to provide high quality care within a fixed (or sometimes shrinking) budget.1 The Veterans Health Administration, which provides care for over 5 million veterans within the largest integrated healthcare system in the United States, faced this problem in the early 1990s, when it was struggling to overcome a reputation for providing inferior and inefficient health care. In 1995 it began a programme to simultaneously improve the organisation and quality of its care, with performance monitoring having a key role.2 3 Within 10 years, it was lauded as providing the best care in the US.4

    The turnaround shows the value of monitoring performance and providing appropriate incentives to improve care. We explain how the organisation brought about the changes and look at some of the remaining challenges.

    Foundation for change

    The administration made several organisational changes as a foundation for the quality improvements.3 5 Firstly, it reorganised care into regional networks (veterans integrated service networks), which were provided with fixed resources and held accountable for managing all care within their facilities. Secondly, it shifted care to ambulatory settings, opening new outpatient clinics and closing many inpatient beds. Thirdly, the capacity of the administration's automated information system was improved to allow providers to access and enter all patient information within a unified electronic medical record, thus enhancing coordination of care.6

    A cornerstone of the efforts to transform care was the systematic use of data driven measures to monitor performance across several domains, including technical quality of care, access, functional status, and patient satisfaction.3 Many of the measures paralleled those developed by other US quality assessment organisations, but the administration also included measures to assess care of particular relevance to veterans.

    Initially, assessment focused primarily on process measures concerning outpatient management of chronic conditions (control of diabetes, use of inhalers for obstructive lung disease, diet and exercise counselling for hypertension and obesity, drug management and cholesterol testing after myocardial infarction) and preventive care (immunisations; screening for breast, colon, cervical, and prostate cancer; and counselling on alcohol and tobacco use). Currently, the administration assesses over 50 measures covering acute and chronic conditions as well as palliative and preventive care (box). An external contractor collects data quarterly by auditing the electronic medical records for a sample of veterans from all the administration's facilities. It also surveys a sample of patients at each facility about their healthcare experiences, satisfaction, and health status.

    Veterans Health Administration areas of performance measurement, 1997-2006*

    Chronic and acute care
    • Diabetes

    • Acute myocardial infarction

    • Obstructive lung disease

    • Obesity

    • Hypertension

    • Pain assessment

    • Major depression

    • Smoking cessation

    • Community acquired pneumonia

    • Acute coronary syndrome

    • Substance use disorders

    • Heart failure

    Preventive care
    • Influenza vaccination

    • Pneumococcal vaccination

    • Prostate cancer education/screening

    • Mammography

    • Cervical cancer screening

    • Colorectal cancer screening

    • Hyperlipidaemia screening

    • Alcohol screening

    • Tobacco screening

    *Some areas were not covered in all years, and measures within areas also varied by year

    In addition to monitoring quality, the administration instituted mechanisms to make it more likely that performance monitoring would drive quality improvement. Each regional director was held accountable for meeting specified quality standards through a performance contract that included incentives equivalent to roughly 10% of the director's salary. Each director, in turn, held managers and clinicians accountable for the performance standards, and the performance results of each regional network and facility were made widely available within the administration. Consequently, regional networks began to compete with each other on performance, and facilities within each network did the same.

    Although implementation of quality improvement initiatives was ultimately in the hands of individual networks and facilities, there were also centrally led efforts. The administration drew on researchers from the Department of Veterans Affairs' health services research and development service and from nine disease specific quality enhancement research initiatives to systematically identify quality gaps and to develop and assess interventions to close those gaps.7 8

    Response to measurement

    What is the evidence that quality improved in the areas monitored? Figure 1 shows change in three representative measures monitored from 1997 to 2007. The rate of β blocker administration after myocardial infarction rose from 83% in 1997 to 93% in 2006. Similarly, annual testing for glycaemic control rose from 85% in 1997 to 96% in 2007. Jha and colleagues showed that quality improved significantly from 1997 to 2000 on other measures. For example, influenza vaccination rates rose from 61% in 1997 to 78% in 2000, pneumococcal vaccination rose from 60% to 81%, and aspirin administration after myocardial infarction rose from 92% to 98%.9 They also showed that the absolute level of quality of care for veterans was higher than for patients covered by Medicare.9 Similarly, we showed that quality of diabetes care (which had been included in performance monitoring since 1997) was higher in the administration in 2000-2001 than in geographically matched commercial managed care plans for almost every aspect studied, including timely eye screening, testing glucose and lipid concentrations, and glucose and lipid control.10

    Fig 1 Changes in performance of Veterans Health Administration facilities on three quality measures, 1997-2007

    The improvements in care occurred mainly in conditions that were being monitored, as shown by results from a study comparing the quality of care for patients in the administration with that of a sample of people from 12 major US communities between 1997 and 2000.11 This study used a global quality assessment tool that comprised over 300 measures across 26 diseases, including 26 measures targeted by the administration's performance monitoring. Although overall quality of care was higher for veterans than in the community, the advantage was greatest for the measures that the administration was using to monitor quality (such as retinal screening for people with diabetes) and spilled over beyond the targeted measures to the conditions covered by performance monitoring (such as diabetes). However, for conditions not part of the performance monitoring system, veterans had no advantage (fig 2).11

    Fig 2 Comparison of quality of care for veterans and national sample on performance measures monitored by Veterans Health Administration, measures related to monitored conditions, and measures unrelated to monitored conditions

    Although this was a retrospective, observational study, these results, taken together with the evidence of improvement over time, suggest that performance improves on measures that are monitored, that improvement may extend beyond individual measures to the conditions they target, but that areas not being monitored are less likely to improve. Additionally, national surveys have shown that patient satisfaction in both inpatient and outpatient settings is higher for veterans than in other surveyed settings.12

    Management challenges

    One of the most important decisions in facilitating change was to invest heavily in auditing electronic medical records. This enabled the administration to collect detailed clinical data that are unavailable electronically and helped to ensure that measures are clinically meaningful and evidence based. Instead of limiting measures to those that can be constructed with administrative utilisation data,13 the administration has used clinical data to construct measures that incorporate exceptions (such as contraindications) and measure processes strongly linked to outcomes.14
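    A minimal sketch may help to illustrate how an exception-aware measure differs from one built on administrative data alone: patients with a documented contraindication are removed from the denominator before the pass rate is calculated, so clinicians are not penalised for appropriately withholding treatment. This is a hypothetical illustration only, not the administration's actual software; the field names, the measure, and the data structure are all our own assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class AuditedRecord:
    """One patient's chart-audit result for a single measure (illustrative fields)."""
    eligible: bool          # meets the measure's clinical denominator criteria
    contraindicated: bool   # documented exception, e.g. allergy to the indicated drug
    process_done: bool      # the indicated process of care was delivered

def measure_rate(records: list[AuditedRecord]) -> float:
    """Pass rate (%) with contraindicated patients excluded from the denominator."""
    denominator = [r for r in records if r.eligible and not r.contraindicated]
    if not denominator:
        return float("nan")  # no eligible patients in this audit sample
    numerator = sum(r.process_done for r in denominator)
    return 100 * numerator / len(denominator)

# Example: a beta blocker measure where one eligible patient has a contraindication
sample = [
    AuditedRecord(eligible=True,  contraindicated=False, process_done=True),
    AuditedRecord(eligible=True,  contraindicated=True,  process_done=False),  # excluded
    AuditedRecord(eligible=True,  contraindicated=False, process_done=False),
    AuditedRecord(eligible=False, contraindicated=False, process_done=False),  # not eligible
]
print(f"{measure_rate(sample):.0f}%")  # 50%
```

    Without the contraindication exception, the second patient would count as a failure and the rate would fall to 33%, illustrating why such measures require clinical detail that administrative utilisation data cannot supply.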

    Although quality has improved in targeted clinical areas, the administration faces new challenges in improving care for all conditions. It is expanding measures for acute and hospital care, as well as for conditions faced by young veterans returning from conflict. Such expansion risks measurement overload—when measurement ceases to improve performance. To minimise this risk, the administration removes measures that get consistently high performance from the regional directors' contracts, although it continues to monitor them so that they can be placed back in the contract if performance drops.

    The administration is also looking at other ways to stimulate quality improvement through performance monitoring. It is currently implementing pay for performance initiatives to reward providers for higher performance, but further research is needed to identify whether rewarding individuals or teams is more effective, the types of quality measures that best motivate true quality improvement,14 15 and the levels of incentives necessary to further stimulate change in provider behaviour.

    The administration's experience has shown the valuable role that well constructed and clinically detailed measures of performance can have in improving quality of care, even without large monetary incentives for individual doctors. Nevertheless, monitoring can produce unintended consequences such as patient deselection,16 17 18 overtreatment of patients not likely to benefit from an intervention,14 19 20 21 22 and neglect of areas not covered in performance monitoring.11 Like other large healthcare organisations seeking to improve the quality of their care, the administration now needs to find ways to measure and ensure quality across the continuum of care and to guard against unintended consequences of measurement.

    Summary points

    • Care provided by the US Veterans Health Administration has greatly improved over the past 10 years

    • Key to the transformation was use of clinically based measures to monitor performance in targeted areas

    • Competition between regions and financial incentives to regional directors helped drive change

    • Challenges remain to improve care in unmeasured areas

    Footnotes

    • This is the second article in a series looking at use of performance indicators in the UK and elsewhere.

      This series is edited by Azeem Majeed, professor of primary care, Imperial College London (a.majeed@imperial.ac.uk), and Helen Lester, professor of primary care, University of Manchester (helen.lester@manchester.ac.uk).

    • Contributors and sources: EAK is associate professor of internal medicine, University of Michigan Medical School. She has studied and reported widely on quality of care in the US and on methods to improve quality assessment and performance in the Veterans Health Administration. This article arose from the first author's research experience and review of the literature, and from the second author's experience of leading quality improvement efforts in the Veterans Health Administration.

    • Competing interests: EAK has received research funding from the US Department of Veterans Affairs. The opinions presented here do not necessarily represent those of the Department of Veterans Affairs or the University of Michigan.

    • Funding: EAK's time for preparing this article was supported, in part, by the VA Quality Enhancement Research Initiative for Diabetes Mellitus (DIB #98-001) and by the Michigan Diabetes Research and Training Center Grant P60DK-20572 from the NIDDK of the National Institutes of Health.

    • Provenance and peer review: Commissioned; externally peer reviewed.

    References
