Costs Associated with Data Collection and Reporting for Diabetes Quality Improvement in Primary Care Practices: A Report from SNOCAP-USA
========================================================================================================================================

* David R. West
* Tiffany A. Radcliff
* Tiffany Brown
* Murray J. Cote
* Peter C. Smith
* W. Perry Dickinson

## Abstract

*Purpose:* Information about the costs and experiences of collecting and reporting quality measure data is vital for practices deciding whether to adopt new quality improvement initiatives or monitor existing initiatives.

*Methods:* Six primary care practices from Colorado's Improving Performance in Practice program participated. We conducted structured key informant interviews with Improving Performance in Practice coaches and with practice managers, clinicians, and staff, and we directly observed the practices.

*Results:* Practices had 3 to 7 clinicians and 75 to 300 patients with diabetes; half had electronic health records, and half were members of an independent practice association. The estimated cost of implementing data collection and reporting for the diabetes quality improvement program was approximately $15,552 per practice (about $6.23 per diabetic patient per month). The first-year maintenance cost for this effort was approximately $9,553 per practice ($3.83 per diabetic patient per month).

*Conclusions:* The cost of implementing and maintaining a diabetes quality improvement effort that incorporates formal data collection, data management, and reporting is significant and quantifiable. Policymakers must become aware of the financial and cultural impact on primary care practices when considering value-based purchasing initiatives.

* Data Reporting
* Health Policy
* Practice-Based Research Network
* Primary Health Care
* Quality of Health Care
* Quality Improvement

The total cost of diabetes care is 2.3 times the cost of care for nondiabetic patients.1 These data are significant and well-accepted enough to form a credible business case to multiple stakeholders regarding the improved management of diabetes.2 This evidence serves as a foundation for the emergence of payment policies and mechanisms designed to encourage guideline-concordant diabetes care in primary care settings.3 Such value-based mechanisms may provide local and federal incentives for providers, including public recognition, financial incentives for reporting on specific quality benchmarks, and direct financial incentives for performance.4–9 Initiatives and incentives related to the "meaningful use" of health information technology also will increase the pressure on practices to collect and report quality measurement data.10 Through these policies, the ability to benchmark care and show consistency or improvement already is being linked to payment, and the need for benchmarking likely will expand as payers become more oriented toward value-based purchasing.1 Quality measurement is also a core feature of the patient-centered medical home, which is expected to become a dominant model for primary care in the United States. Despite the ongoing implementation of these initiatives and their associated measures, little is known about the cost to primary care practices of collecting and reporting quality measures.
Information regarding the costs and best methods of collecting and reporting quality measure data is vital for practices trying to make informed decisions about whether to adopt or implement new quality improvement initiatives or to monitor initiatives that are already in place. The time and resources needed for individual primary care practices to provide all recommended care for all diabetic patients are often daunting, especially in the face of similar recommendations across other chronic diseases and preventive care. The further challenge of measuring and reporting data on these clinical recommendations is not well studied. The economic burden on primary care practices must be better understood to place the needs of patients, payers, and policymakers within the context of a realistic financial model that allows sustained quality improvement efforts. This Agency for Healthcare Research and Quality–sponsored project (a Task Order funded through the Primary Care–Practice-Based Research Network Master Contract) aimed to measure costs related to quality data collection for diabetes care. We conducted interviews, observations, and cost calculations for 6 primary care practices in Colorado that were participating in a diabetes quality improvement program.

## Methods

### Participating Practices

When commissioning this study, the Agency for Healthcare Research and Quality required the engagement of 6 primary care practices as subjects of an in-depth analysis of costs related to diabetes data collection and reporting. We recruited small- to intermediate-sized, independent medical practices in Colorado that were engaged in the Improving Performance in Practice (IPIP) program, a national program sponsored by primary care accreditation boards and professional organizations that aimed to assist practices in integrating practice improvement into their regular activities.11 IPIP had assigned each practice a quality improvement coach to improve care in a particular area. The practices were selected purposefully on the basis of a rating of relatively high engagement in IPIP by the practice quality improvement coaches. Furthermore, we chose diverse practices (see Table 1) that had implemented systems for collecting and reporting diabetes quality measures, including practices with and without electronic health records or membership in an independent practice association (IPA), with varying numbers of clinicians and patients with diabetes, and with varying levels of use of the quality measure data for quality improvement and clinical change. All 6 practices identified were family medicine practices, and all practices that were approached agreed to participate.

Table 1. Practice Characteristics

### Data Collection

We used an economic framework in which we sought to measure both the direct and indirect costs of implementing and maintaining reporting systems to track diabetes care and related quality improvement efforts. The protocol was approved by all applicable institutional review boards. We chose a mixed-methods approach because the quantification of cost components without contextual understanding of the practice is difficult. Although we did not conduct a thorough cost–benefit analysis, we did collect qualitative information regarding perceived benefits. We identified costs associated with both the implementation and maintenance of operations specific to data collection and reporting.
We operationally defined these costs as those directly related to the gathering of data elements expressly for the purpose of reporting predefined metrics associated with diabetes care to an outside entity. The data collection approach included (1) an initial round of structured key informant interviews with IPIP coaches, administrative staff, and practice leadership (the lead decision makers identified to us by each practice) to guide and test data collection methods and instruments; (2) direct observation of practices to monitor costs of ongoing processes; and (3) structured follow-up interviews with practice staff and IPIP personnel.

Implementation costs were estimated by cross-checking the responses of key informants against available financial records. To quantify these costs, we incorporated the recollections of practice personnel and the detailed IPIP documentation of time spent by IPIP coaches and administrative personnel assisting each practice during a 1-year implementation period. The subsequent detailed cost interviews with practice staff focused on identifying any materials or capital purchases that were needed to implement the process.

For the direct observation of maintenance activities in the practice, the research team identified and mapped each process identified through the initial key informant interviews.12 Process flow mapping has long been established as a useful method for making the implicit steps of complex activities both visible and clear. This technique has been used extensively for analyzing recurring decisions and processes involving multiple people and complex situations, and it is recognized as a critical component of event flow and sequencing analysis.13 This step included interviews with staff involved in each subprocess related to quality data collection and reporting, as well as a time-and-motion study to observe (when feasible) the various steps employed, the time required, and the role of all involved clinicians and staff. Information regarding typical, slowest, and fastest expected times for each process was requested from key personnel (one way to combine such three-point estimates is sketched below). We also queried respondents regarding the perceived benefit of the quality improvement activities, as well as what other activities had been supplanted by the quality data collection. Staff roles typically involved in this step of data collection were the practice manager, front desk personnel, medical assistants, and clinicians. Information collected from the process maps and time-and-motion study was used to allocate the time associated with the various processes.

By necessity, our process mapping included the entire spectrum of activities perceived by the practice to be associated with its diabetes quality improvement system. We deemed this approach critical to developing a full understanding of where in the processes of care data collection and reporting activities occurred, as well as to establishing possible linkages between these processes. Subsequent interviews focused on the collection of cost-related data and allowed us to collect detailed information from individuals identified as having the most comprehensive understanding of the time, materials, and processes related to quality data collection and reporting. The interview protocol included a detailed recall of ongoing costs, roles of various staff, and assistance from IPIP.
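One common way to combine three-point time estimates of this kind into a single expected duration per process step is the PERT weighted mean. The study did not prescribe a specific formula, so the following minimal sketch is purely illustrative; the formula is our assumption, and the step names and times are hypothetical.

```python
# Illustrative sketch: turning three-point time estimates (fastest, typical,
# slowest) into an expected duration per process step. The PERT weighted mean
# used here is an assumption for illustration, not the study's stated method.

def pert_minutes(fastest: float, typical: float, slowest: float) -> float:
    """Expected duration using the PERT three-point weighted mean."""
    return (fastest + 4 * typical + slowest) / 6

# Hypothetical steps from a time-and-motion observation (minutes).
steps = {
    "pull chart / open registry record": (1.0, 2.0, 5.0),
    "enter visit data into registry":    (2.0, 4.0, 10.0),
    "generate monthly quality report":   (10.0, 20.0, 45.0),
}

for name, (fast, typ, slow) in steps.items():
    print(f"{name}: expected {pert_minutes(fast, typ, slow):.1f} min")
```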
Practices provided a list of personnel by role and their associated wage rates, and we used this information to determine the cost of staff time associated with each discrete process for collecting quality data. Additional questions were tailored to obtain specific time and material costs for the practice plus any maintenance costs not captured through direct observation. Rather than developing predetermined categories of cost, we asked practice personnel about the specific costs they incurred, following up with specific probes.

Maintenance costs included activities that continued beyond the time frame when the practice was first able to report diabetes quality measure data. Annualized costs were estimated based on projections of the time and resources being consumed by the processes at the time of our interaction with practice personnel. Maintenance costs included supplies for tracking diabetic patients, personnel costs associated with updating the registry, activities related to sending and receiving quality data, and costs of addressing problems related to the data reporting system, among others. The major data inputs required in our model were personnel cost data by job category (expressed as hourly wage rates) and a brief description of the associated activities for each process, the personnel involved, and the time typically spent on each activity. Using this model, we summarized ongoing activities by frequency of occurrence, personnel involved, and time required for the activity. The model also allowed us to incorporate nonpersonnel costs (eg, supplies, equipment, fees) so that these costs also could be included in and allocated within our overall cost estimates.

### Analysis

We developed specific analytic tools to compile the personnel cost for each major process related to data collection and reporting. These data were assembled with nonpersonnel cost data collected during the focused interview to develop an aggregate cost for each practice related to the data collection and reporting efforts. Personnel costs were based on wage rates reported by the practices. Because of the wide variation of fringe benefits available to staff, we applied a 22% fringe benefit rate to all personnel costs, based on practice survey data reported by the Medical Group Management Association in 2008, in an effort to provide total cost calculations that more closely approximate what practices may experience throughout the nation.14 To validate the hourly wage data collected directly from practices, we identified the federal occupational code for each role within the studied practices, to the extent that a comparable job category could be identified. We then determined the national average hourly wage for each role to assure that local costs were representative of nationally prevailing costs. Because national and local data were very similar (as illustrated in Table 2), we used the local wage data to compute the actual personnel costs within each practice.

Once the practice-specific costs were aggregated for implementation and maintenance, we were able to develop our aggregate cost estimates across the 6 study practices. These costs were then reduced to a per-capita dollar amount by dividing by the number of active diabetes patients reported to be in each practice. Aggregated costs also were reduced to a dollar amount per full-time equivalent (FTE) clinician by dividing the sum of personnel and nonpersonnel costs by the reported FTE clinicians from the participating practices.
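The cost-aggregation logic just described, per-activity staff time multiplied by wage rates, loaded with the 22% fringe rate, plus nonpersonnel costs, can be summarized in a short sketch. Only the 22% fringe rate comes from the study; all activity rows, wages, and the nonpersonnel figure below are hypothetical placeholders.

```python
# Sketch of the cost model described above: for each recurring activity,
# (hours per occurrence) x (occurrences per year) x (hourly wage),
# loaded with a fringe rate, plus nonpersonnel costs. All activity rows
# are hypothetical; only the 22% fringe rate is taken from the article.

FRINGE_RATE = 0.22  # from Medical Group Management Association 2008 survey data

# (role, hourly wage $, hours per occurrence, occurrences per year)
activities = [
    ("medical assistant", 15.00, 0.25, 2500),  # registry entry per visit
    ("practice manager",  28.00, 4.00, 12),    # monthly report preparation
    ("clinician",         85.00, 1.00, 12),    # monthly data review
]
nonpersonnel = 600.00  # eg, registry fees, supplies (hypothetical)

personnel = sum(wage * hrs * n for _, wage, hrs, n in activities)
loaded = personnel * (1 + FRINGE_RATE)
print(f"Annual personnel cost (with fringe): ${loaded:,.2f}")
print(f"Total annual cost:                   ${loaded + nonpersonnel:,.2f}")
```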
## Results

Key characteristics of the participating practices can be found in Table 1. The practices had 3 to 7 clinicians and 75 to 300 patients with diabetes. Three had electronic health records, and 3 were using paper charts. Three practices were members of an independent practice association that provided support and incentives for quality improvement work in the practices.

Table 2 summarizes the personnel cost data collected from each practice, by role. Personnel costs constituted the single largest category of cost associated with data collection and reporting.

Table 2. Hourly Wage Costs by Practice Role

Table 3 summarizes aggregate costs for each practice according to the organization involved. The period associated with incurring these costs included a 1-year implementation phase and approximately 1 year of ongoing reporting after implementation. We note that the proportion of total costs was higher for IPIP relative to the practices.

Table 3. Aggregate Costs by Organization

Table 4 shows the cost estimates allocated between the implementation and maintenance periods. This allocation was based on careful review of the data across practices and discussion with key informants.

Table 4. Implementation and Maintenance Costs by Practice

### Deriving Unit Cost Estimates

Although these overall costs across all 6 study practices are helpful, we determined that calculating the cost per practice, the cost per patient with diabetes (both per year and per month), and the cost per clinician FTE also would be meaningful for policymakers needing to make sense of these data. The results of our calculations are summarized below.

#### Implementation Costs

The cost of implementation for the practices and IPIP taken together (representing the total cost of implementation) was estimated to be $15,552 per practice. These practices served an average of 208 diabetic patients, yielding an estimated average implementation cost of approximately $74.77 per patient per year. Translated to a "per patient, per month" amount, the level of reimbursement necessary in a capitated payment environment to cover these costs would be approximately $6.23 per patient per month. The annualized cost per clinician FTE was approximately $1,180 (about $98 per month).

#### Maintenance Costs

The total annual maintenance costs for the practices and IPIP taken together (representing the total cost of operation) were estimated to be approximately $9,553 per practice, $663.40 per clinician, or $45.93 per patient. Translated to a "per patient, per month" amount, the average was about $3.83.
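The unit cost conversions above can be reproduced directly from the per-practice totals. The short sketch below (Python, for illustration only) recomputes the per-patient figures; its only inputs are the numbers stated in the text.

```python
# Worked check of the unit-cost figures reported above, using only numbers
# stated in the text: $15,552 implementation and $9,553 annual maintenance
# per practice, and an average panel of 208 patients with diabetes.

DIABETIC_PATIENTS = 208  # average panel size across the 6 practices

def unit_costs(annual_cost: float, patients: int = DIABETIC_PATIENTS):
    per_patient_year = annual_cost / patients
    per_patient_month = per_patient_year / 12
    return per_patient_year, per_patient_month

for label, cost in [("implementation", 15_552), ("maintenance", 9_553)]:
    yr, mo = unit_costs(cost)
    print(f"{label}: ${yr:.2f}/patient/year, ${mo:.2f}/patient/month")

# -> implementation: $74.77/patient/year, $6.23/patient/month
# -> maintenance:    $45.93/patient/year, $3.83/patient/month
```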
#### Issues from Analysis

Variations in costs among practices were much easier to understand by incorporating qualitative data. Several key issues emerged from that analysis:

1. Costs may be higher in practices that use more expensive personnel for tasks that could be done by lower-cost staff members. An example is practice 6, in which one of the part-time physicians also functioned as the practice manager; this individual's heavy involvement in the process considerably increased the overall diabetes quality improvement costs for that practice.
2. Practices that initially incorporate quality data collection but do not implement solid systems for cleaning and maintaining those data may not be able to accomplish the desired practice-level changes in quality improvement and diabetes care. This was the case with practice 5, which initially implemented a registry system with IPIP's assistance but did not have a consistent system across the practice for maintaining the information in the registry.
3. Taken together, the 3 practices that were members of an IPA (practices 2, 3, and 6) expended more effort on data collection and reporting activities than the other 3 practices. The IPA provided some financial support and incentives for these activities, which may have affected both the motivation and the ability of the practices to do more.
4. Only one of the practices (practice 1) reported an ability to pull its quality measures directly from its electronic health record (EHR); the other practices indicated that the additional time, costs, and processes associated with double data entry into both a registry and the EHR were major barriers to their quality improvement efforts.
5. Practices with paper charts reported spending significant time and energy maintaining the registry and checking the reliability of its information against data from the chart.
6. Costs from the involvement of the IPIP quality improvement coaches were substantial, but the practices indicated that they would not have been able to implement the quality improvement processes without the assistance of a coach.

### Benefits of Quality Improvement Efforts

Practices were not able to provide quantifiable data regarding the return on investment for their quality improvement activities during the study period. Through qualitative interviews, all reported a perceived improvement in the quality of care provided as a result of the data collection and reporting efforts. One practice manager said the reports "paint a picture of the quality of care we're providing" and created a new awareness and focus on quality that had been lacking, helping to create a "culture of quality" at the practice. Clinicians also valued "the power of having better data for managing patients." Practices mentioned the possibility of financial benefit from their quality improvement efforts through bonuses, pay-for-performance, higher coding, group visits, and bringing patients in for services, but they had difficulty quantifying these benefits. Most practices also reported that they had improved the organization and efficiency of work flow. One large benefit of the team-based quality improvement efforts was improved clinician and staff satisfaction, and most practices reported that staff members were more engaged and invested in the practice.

## Discussion

This mixed-methods assessment of the costs and issues surrounding quality measure collection and use in primary care practices yielded several interesting findings. Our cost results are quite similar to those reported by Halladay et al,15 although we employed somewhat different methods, including qualitative interviews that provided additional perspective on the practice contexts and the results of the quality improvement processes. In addition, the data we were able to collect allowed us to calculate an estimated cost per diabetic patient per month, placing our results in a context more easily understood by payers and policymakers.
Our findings illustrate that data collection and reporting are inextricably intertwined with the overall diabetes care quality improvement process. We found the cost of implementing and maintaining a diabetes quality improvement effort that includes formal data collection and reporting mechanisms to be significant and quantifiable. These practices reported a dependency on external resources to implement a diabetes quality improvement initiative successfully, although that could be an artifact of their involvement in the IPIP program. Practices also indicated that it would be difficult for them to continue their quality improvement efforts without the continued availability of these resources or financial incentives.

Policymakers must become aware of the financial and cultural impact on primary care practices when considering value-based purchasing initiatives. It is incumbent on those desiring meaningful and durable change to assure that the requirements they place on primary care practices are matched with the requisite resources to adequately cover the cost of quality improvement and the collateral data collection and reporting.

The quantification of costs related to diabetes quality improvement was challenging and subject to limitations. For some practices, key individuals present at the time of implementation were no longer employed there. In addition, clinicians and staff who were present during implementation were often unsure of the exact time and effort spent on implementation efforts. IPIP program costs were easier to estimate because of the availability of records regarding time spent. Consistent across practices was the ability to quantify the hourly wage rate associated with each individual with a role in data collection and reporting for diabetes quality improvement; less consistent was the ability to recall, list, and quantify other associated costs. Because we had multiple respondents, a range of responses often was available for the calculations. In such cases, we generally selected the median response or the response provided by the respondent with access to the most accurate information. Objective responses were generally similar both within and across practices.

An additional limitation to the interpretation of these results may be the representativeness of the practices included in this project. The practice sample represented a good range of practice types and sizes and included both practices with EHRs and those with paper charting systems. All practices had implemented quality improvement and care redesign and were among the initial 50 practices in Colorado to participate in IPIP. It is possible that the practices in our sample represent an "early adopter" group, with more internal motivation to implement innovations than the bulk of practices. Our experiences with these practices suggest that they were representative of Colorado practices in other regards, and we have no evidence that their potential status as early adopters would greatly affect their costs.

The evolution of EHRs has the potential to assist practices greatly and to decrease the costs associated with quality measure data collection and reporting. The development of the meaningful use criteria for EHRs has put pressure on EHR companies to make this process easier, but at the time of this study the inability to extract quality data easily from the EHR was a major obstacle.
This remains an issue for many practices at this time, and we can only hope that future developments in health information technology will remove this unfortunate barrier.

## Conclusion

Creating sustainable change is critical for successfully implementing systems of care that are value based. To be fully adopted and embraced, systems must be applicable to patient care across payment sources to assure economies of both scope and scale. In addition, providers must have adequate patient volumes across the various key payers to which a meaningful financial incentive for care processes or outcomes may be linked.16 The reality, however, is that multiple well-meaning but disparate strategies of payers and regulators have imposed overlapping reporting and documentation requirements on busy primary care practices without a consistent overall strategy. Uniform expectations of primary care practices, coupled with meaningful financing mechanisms that recognize the cost of quality improvement, are critical precursors of a sustained model of care. If financial incentives (such as bonuses or incremental payment increases) are to be linked with outcomes, new systems and resources also would be needed to implement and manage the collateral data collection and reporting required to earn those incentives.

## Acknowledgments

The authors thank the clinicians and staff from the Colorado practices that participated in this research project for their willingness to allow their efforts to be observed and for frankly and openly answering our questions. In addition, we thank Dr. Marjie Harbrecht and her staff from HealthTeamWorks (formerly the Colorado Clinical Guidelines Collaborative) for their assistance with the recruitment of these practices and for providing much of the data required for our analyses.

## Notes

* This article was externally peer reviewed.
* *Funding:* Funding was provided by the Agency for Healthcare Research and Quality (AHRQ) under task order contract no. HHSA290200710008I (AHRQ task order officer, David Lanier, MD; task order leader, W. Perry Dickinson, MD).
* *Conflict of interest:* none declared.
* Received for publication February 14, 2011.
* Revision received November 29, 2011.
* Accepted for publication December 7, 2011.

## References

1. National Diabetes Information Clearinghouse (NDIC). National diabetes statistics, 2011. Available at: [http://diabetes.niddk.nih.gov/dm/pubs/statistics/](http://diabetes.niddk.nih.gov/dm/pubs/statistics/). Accessed 19 March 2012.
2. Beaulieu N, Cutler DM, Ho K, et al. The business case for diabetes disease management for managed care organizations. Forum Health Econ Policy 2006;9(1):1–36.
3. Agency for Healthcare Research and Quality (AHRQ). Theory and reality of value-based purchasing. Available at: [http://www.ahrq.gov/qual/meyerrpt.htm#head3](http://www.ahrq.gov/qual/meyerrpt.htm#head3). Accessed 9 October 2008.
4. Rosenthal MB, Fernandopulle R, Song HR, Landon B. Paying for quality: providers' incentives for quality improvement. Health Aff (Millwood) 2004;23:127–41.
5. Eslan A, Preheim C. Better and faster: how safety net providers are redesigning care. White paper prepared for the California Healthcare Foundation, John Snow International; January 2011.
6. Institute for Healthcare Improvement. The IHI triple aim. Available at: [http://www.ihi.org/IHI/Programs/StrategicInitiatives/TripleAim.htm](http://www.ihi.org/IHI/Programs/StrategicInitiatives/TripleAim.htm). Accessed 1 October 2008.
7. Kuhmerker K, Hartman T. Pay for performance in state Medicaid programs: a survey of state Medicaid programs. The Commonwealth Fund. Available at: [http://www.commonwealthfund.org/publications/publications\_show.htm?doc\_id=472891](http://www.commonwealthfund.org/publications/publications_show.htm?doc_id=472891). Accessed 3 April 2012.
8. Rosenthal MB, de Brantes F, Sinaiko A, Frankel M. Bridges to excellence: recognizing high quality care. Am J Manag Care 2008;14(10):670–7.
9. Wagner EH. Chronic disease management: what will it take to improve care for chronic illness? Eff Clin Pract 1998;1:2–4.
10. Blumenthal D, Tavenner M. The "meaningful use" regulation for electronic health records. N Engl J Med 2010;363:501–4.
11. Improving Performance in Practice [homepage on the Internet]. Available at: [http://www.ipip.us/](http://www.ipip.us/). Accessed 18 October 2011.
12. West DR, Westfall JM, Araya-Guerra R, et al. Using reported primary care errors to develop and implement patient safety interventions: a report from the ASIPS Collaborative. In: Advances in patient safety: from research to implementation. Vol. 3: implementation issues. AHRQ publication no. 05-0021-3. Rockville, MD: Agency for Healthcare Research and Quality; 2005.
13. Miles MB, Huberman AM. Qualitative data analysis: a sourcebook of new methods. 2nd ed. Thousand Oaks, CA: Sage Publications; 1994.
14. Medical Group Management Association. Cost survey for multispecialty practices: 2008 report based on 2007 data. Englewood, CO: Medical Group Management Association; 2008.
15. Halladay JR, Stearns SC, Wroth T, et al. Cost to primary care practices of responding to payer requests for quality and performance data. Ann Fam Med 2009;7:495–503.
16. Integrated Healthcare Association [homepage on the Internet]. Available at: [http://www.iha.org/index.html](http://www.iha.org/index.html). Accessed 1 October 2008.