Accuracy of electronically reported "meaningful use" clinical quality measures: a cross-sectional study

Ann Intern Med. 2013 Jan 15;158(2):77-83. doi: 10.7326/0003-4819-158-2-201301150-00001.

Abstract

Background: The federal Electronic Health Record Incentive Program requires electronic reporting of clinical quality measures from electronic health records, beginning in 2014. Whether such electronically reported measures are accurate is unclear.

Objective: To measure the accuracy of electronic reporting compared with manual review.

Design: Cross-sectional study.

Setting: A federally qualified health center with a commercially available electronic health record.

Patients: All adult patients eligible in 2008 for 12 quality measures (using 8 unique denominators) were identified electronically. One hundred fifty patients were randomly sampled per denominator, yielding 1154 unique patients.

Measurements: Receipt of recommended care, assessed by both electronic reporting and manual review. Sensitivity, specificity, positive and negative predictive values, positive and negative likelihood ratios, and absolute rates of recommended care were measured.
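
These statistics follow the standard definitions for a 2 × 2 comparison in which manual review serves as the reference standard; with TP, FP, FN, and TN denoting, for a given measure, the patients classified as true positives, false positives, false negatives, and true negatives by electronic reporting:

\mathrm{Sensitivity} = \frac{TP}{TP + FN}, \qquad \mathrm{Specificity} = \frac{TN}{TN + FP}

\mathrm{PPV} = \frac{TP}{TP + FP}, \qquad \mathrm{NPV} = \frac{TN}{TN + FN}

LR^{+} = \frac{\mathrm{Sensitivity}}{1 - \mathrm{Specificity}}, \qquad LR^{-} = \frac{1 - \mathrm{Sensitivity}}{\mathrm{Specificity}}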

Results: Sensitivity of electronic reporting ranged from 46% to 98% per measure. Specificity ranged from 62% to 97%, positive predictive value from 57% to 97%, and negative predictive value from 32% to 99%. Positive likelihood ratios ranged from 2.34 to 24.25 and negative likelihood ratios from 0.02 to 0.61. Differences between electronic reporting and manual review were statistically significant for 3 measures: Electronic reporting underestimated the absolute rate of recommended care for 2 measures (appropriate asthma medication [38% vs. 77%; P < 0.001] and pneumococcal vaccination [27% vs. 48%; P < 0.001]) and overestimated care for 1 measure (cholesterol control in patients with diabetes [57% vs. 37%; P = 0.001]).
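
As an illustrative arithmetic check of how the likelihood ratios relate to sensitivity and specificity (hypothetical values, not measure-specific estimates from this study): a measure with sensitivity 0.97 and specificity 0.96 would yield LR^{+} = 0.97 / (1 - 0.96) \approx 24.3 and LR^{-} = (1 - 0.97) / 0.96 \approx 0.03, values at the favorable end of the ranges reported above.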

Limitation: This study addresses the accuracy of the measure numerator only.

Conclusion: Wide measure-by-measure variation in accuracy threatens the validity of electronic reporting. If variation is not addressed, financial incentives intended to reward high quality may not be given to the highest-quality providers.

Primary funding source: Agency for Healthcare Research and Quality.

Publication types

  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Adult
  • Aged
  • Cross-Sectional Studies
  • Electronic Health Records / standards*
  • Female
  • Humans
  • Likelihood Functions
  • Male
  • Meaningful Use*
  • Middle Aged
  • Predictive Value of Tests
  • Sensitivity and Specificity