Abstract
Purpose: Provide credible estimates of the start-up and ongoing effort and incremental practice expenses for the Advancing Care Together (ACT) behavioral health and primary care integration interventions.
Methods: Expenditure data were collected from 10 practice intervention sites using an instrument with a standardized general format that could accommodate the unique elements of each intervention.
Results: Average start-up effort expenses were $44,076, and monthly ongoing effort expenses averaged $40.39 per patient. Incremental expenses averaged $20,788 for start-up and $4.58 per patient for monthly ongoing activities. Variation in expenditures across practices reflects differences in intervention specifics and organizational settings. Differences between effort and incremental expenditures reflect the extensive use of existing resources in implementing the interventions.
Conclusions: ACT program incremental expenses suggest that widespread adoption would likely have a relatively modest effect on overall health systems expenditures. Practice effort expenses are not trivial and may pose barriers to adoption. Payers and purchasers interested in attaining widespread adoption of integrated care must consider external support to practices that accounts for both incremental and effort expense levels. Existing knowledge-transfer mechanisms should be employed to minimize developmental start-up expenses, and payment reform focused on value-based, Triple Aim–oriented reimbursement and purchasing mechanisms is likely needed.
The integration of behavioral and primary care services has become a relatively common element of recent efforts to transform basic health care provision.1,2 The Advancing Care Together (ACT) program is 1 example of this effort, supporting integrated care interventions across a number of practice sites in the state of Colorado.3 New interventions, such as behavioral health and primary care integration, require investments to initiate and ongoing expenses to implement. Published information on the level and type of expenses incurred in undertaking the transformation to integrate primary and behavioral health care is extremely limited, despite a widespread understanding that financial support is a critical barrier to widespread and sustainable adoption.4,5 Practices that have made this transformation or are considering it have little basis to understand or anticipate the expenses they might incur. Similarly, payers and policy makers who may be interested in supporting such interventions have limited information to assess the likely extent or nature of reimbursement change that may be necessary to provide sufficient incentives and support to make the transition to behavioral health and primary care integration. The ACT program presented an opportunity to provide a descriptive case study of start-up and ongoing expenses across a variety of specific behavioral health and primary care integration interventions.
Practical expenditure data on health care interventions in general are extremely sparse in the academic literature.6–8 Expenditures, where reported, are often provided as part of cost-effectiveness or related evaluations.9,10 As such, they typically do not isolate practice expenditures, include start-up expenses, or report expenditures in categories relevant to typical practice activities. One study that is unique in this respect reported practice start-up and ongoing incremental expenses for primary care practices incorporating health behavior change services targeting at-risk drinking, healthy diet, physical activity, and smoking as part of the Prescription for Health (P4H) program.6 The P4H expenditure study developed a credible, standardized tool for capturing intervention-related expenditures at the practice level.11 This tool was applied with modest modifications to the practices in the ACT program to attain credible estimates of the incremental and effort expenses incurred to start up and deliver these behavioral health and primary care integration interventions. Table 1 lists the types of expenses that were collected and reported for the study with definitions and examples.
Incremental expenses reflect expenses tied to new resources acquired and used to start up and deliver new interventions. This expense perspective provides practices with estimates of the amount of capital needed for start-up and of the additional, net expenses of ongoing implementation. It also reflects the types of expenses that would typically feed into fee-for-service-type reimbursement calculations. Effort expenditures for start-up and delivery incorporate intervention expenses related to both new and existing resources. This expense perspective captures a more complete picture of the effort practices undertake to implement the intervention. Differences between effort and incremental expenses convey important information that payers and policy makers must take into account in considering reimbursement policies that can provide meaningful incentives and support for these types of interventions. Thus, the purpose of this study is to report credible estimates of the start-up and ongoing effort and incremental practice expenses for the ACT behavioral health and primary care integration interventions in a manner that is informative to practice implementers as well as payers and policy makers.
Methods
The research team identified the expenditure collection process developed for the P4H program as a relevant and applicable model for the purposes of the ACT program evaluation. The research team acquired the free Excel-based data collection tool and user's guidebook.12 Some general modifications were made to the data collection tool and guidebook to accommodate additional information sought by the research team. These included creation of separate start-up expenditure tools designed to capture developmental and general start-up expenses and to allow practices to identify staff members who were newly hired for the intervention, if any. The overall tool and guidebook modified for application to the ACT program remained largely identical to the original.
Although all the ACT interventions involved integration of primary and behavioral health care employing both behavioral health and primary care practitioners, each intervention was unique in its specifics. To accommodate the unique aspects of the ACT interventions, and as a normal part of applying the general tool to specific intervention and practice sites, each site was required to develop a flowchart of specific activities that defined its intervention and identified the staff types involved in each step of the intervention. This intervention- and site-specific information was used to create site-specific categories of staff and intervention-specific activities within the tool to capture staff activity within the interventions.
The study was designed to take an intervention perspective. Although each intervention was attributed to a sponsoring practice, some involved joint efforts of separate behavioral health and primary care practices that shared resources to accomplish the intervention. The study focused on expenses related to the start-up or delivery of the interventions, regardless of formal organizational boundaries. All other expenses incurred by the sponsoring practice or other practices involved in the intervention were excluded, as were any expenses related to ACT program evaluation activities.
As a descriptive case study encompassing practices implementing different specific interventions in different practice settings, the study was designed to provide information on the types and distribution of expense levels that might be found among real-world practices seeking to accomplish behavioral health and primary care integration. Thus, there was no a priori expectation that expense levels should converge on a mean. To describe and explore the extent and nature of expenditure variation across these different integration efforts and settings, findings are presented at the individual intervention level, as the means of interventions above and below the median (“high” and “low” expense groups), as well as the overall sample mean.
Setting and Sample
ACT was a comparative case study of diverse practices in Colorado implementing their own ideas about how they might integrate care in their local setting for patients with emotional and behavioral problems. Interventions were determined by the practices themselves, and practices participated in a learning collaborative, practice change facilitation, and site visits with the program staff and evaluation team. The study captured expenditure data on 10 of 11 ACT intervention sites.
Specific interventions and practice characteristics varied across the sample, as described in other articles in this supplement. Table 2 provides a brief description of the practices and interventions that can be used to reference descriptive tables provided elsewhere in this supplement, while additionally highlighting practice and intervention characteristics that may influence start-up and ongoing expenditures. The specific characteristics highlighted in Table 2 include whether the intervention employed systematic screening of patients for behavioral health conditions or relied on clinician discretion, the number of direct staff full-time equivalents (FTEs) involved, whether new staff were hired for the intervention, and whether the intervention involved substantial capital asset purchases, defined here as either intervention-specific information technology or physical space investments.
Data Collection
Staff members at each intervention site were identified as responsible for site data collection. These staff members were provided user's guidebooks and spreadsheet-based data collection instruments developed from the publicly available P4H user guidebook and instruments as noted above. Consistent with the P4H expenditure data collection process, practices were provided with 4 instruments: start-up, baseline, and 2 for ongoing expenses. A key element of the data collection process involved practices tailoring the data collection instruments to their specific intervention and practice environment within the standardized format. Examples of the expenditure tools are available from the authors.
For the ongoing expense instruments, research staff worked with practice teams to create a flowchart specific to their intervention. The completed flowchart provided the foundation for tailoring the ongoing expense instruments by identifying all the key steps practice staff performed as patients entered, traversed, and left the intervention, as well as the specific staff types involved in each step. The completed flowchart was used to tailor sections of the instrument that collected counts of new or ongoing patients who experienced at least 1 step or aspect of the intervention in a selected month, as well as tables, organized by intervention step and staff type, used to collect the time staff spent in each intervention step and ultimately to calculate direct staff expense.
For the start-up instrument, which included separate worksheets for developmental and general start-up expenses, practice staff were requested to reflect on the relevant staff types and intervention-specific direct non-staff items encompassed in their start-up efforts, and to appropriately adjust the category labeling within each section of the start-up instrument. The start-up instrument was also modified to capture the FTE and time devoted to start-up activities of any new staff hired specifically for the intervention.
Practice staff used data from practice financial records, tracking, or other information systems; observation of the time needed to complete specific tasks; and direct staff recall to complete the tailored data collection instruments. Data for start-up expenses encompassed the entire start-up period as identified by each practice. Baseline and ongoing expense tools collected data for a single month each. The "baseline" month was defined as the last month before implementation, whereas the ongoing months represented 2 selected months of practice activity after initial implementation, which were intended to reflect a steady state of operations of the intervention. Data collection occurred during 2013 and 2014, spanning data for periods as early as the last quarter of 2011 due to 1 practice's longer (24-month) reported start-up period.
Initial data submissions from the practices were reviewed by the research team for apparent accuracy and consistency. Apparent errors, inconsistencies, or missing data were reviewed, discussed, and corrected as necessary in an iterative process with practice staff. Although baseline data were not used directly in the reported study results, they were used to help assess the general consistency of reported expenses from start-up to ongoing periods, as well as the extent of any nonintervention-based trends in expenditure levels over time. The data review process, although extremely valuable in acquiring complete and accurate data, added 3 to 6 months to the time frame for completion of data collection for each practice.
Data Analysis
Start-up and ongoing expenses were calculated and reported separately. Start-up expenses were reported in total and as the percentage of the total that was developmental. Ongoing expenses were reported as the average of the 2 sample ongoing months of data collected. For each expense type, effort and incremental expenditures were calculated and reported as defined in Table 1. Incremental expenditures consisted of new expenses related to the intervention and a portion of non-staff overhead related to any new intervention hires. New expenses were explicitly identified within the data, such as expenses for new hires' effort on the intervention and all non-staff direct expenditures. Incremental overhead expense was estimated by calculating the ratio of new staff salaries to total staff salaries and assigning that portion of non-staff overhead expenses as incremental.
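To illustrate the overhead calculation described above (the symbols are ours, introduced only for exposition), the incremental portion of non-staff overhead can be written as

\[ O_{\text{incremental}} = \frac{S_{\text{new}}}{S_{\text{total}}} \times O_{\text{non-staff}}, \]

where $S_{\text{new}}$ is the salary expense of staff newly hired for the intervention, $S_{\text{total}}$ is total staff salary expense, and $O_{\text{non-staff}}$ is total non-staff overhead expense. A practice whose new hires accounted for 10% of total salaries would thus have 10% of its non-staff overhead counted as incremental.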
For each of these general expense categories, we calculated and reported total, staff, and non-staff expenses, with the latter 2 categories further divided into direct and indirect (administrative staff or overhead) expenses. Start-up expenses are reported on a cash basis (ie, without asset depreciation) as they reflect a complete, 1-time set of expenditures. Average monthly start-up expenses are reported in addition to the totals to provide a more standardized comparison given the varying start-up times across practices. Ongoing expenses are reported on an accrual basis (ie, with depreciation of assets) as they reflect a continuing flow of practice expense.
All ongoing expenses are reported on a per-patient basis to allow a more standardized comparison across practices. A patient was defined as any individual who experienced at least 1 step or aspect of the intervention during a reporting month. This definition of a patient differs from the REACH (the extent to which the integration program was delivered to the identified target population) analyses for ACT practices reported elsewhere, which focused on newly screened individuals.13
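As a sketch of this normalization (the symbols and averaging order shown here are illustrative; the tailored instruments define the exact computation), the reported monthly per-patient figure for a practice takes the form

\[ E_{\text{per patient}} = \frac{1}{2}\left(\frac{E_{m_1}}{N_{m_1}} + \frac{E_{m_2}}{N_{m_2}}\right), \]

where $E_{m}$ is the total ongoing intervention expense in sampled month $m$ and $N_{m}$ is the number of patients who experienced at least 1 step or aspect of the intervention in that month.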
Given the relatively short period covered by the expenditure data, no adjustments were made for time or inflation. Because most of the data are from 2013, all expenses are assumed to reflect 2013 dollars. Where depreciation was applied, a 5-year useful life was assigned for computer hardware and other general depreciable assets, whereas a 15-year useful life was assigned for capital improvements to buildings or other existing space. The Colorado Health Foundation provided grant funding to support data collection and analysis for this study. The Oregon Health & Science University Institutional Review Board approved this study protocol.
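For example, assuming straight-line depreciation (used here purely for illustration; the asset amounts are hypothetical), the monthly ongoing expense attributed to a depreciable asset would be

\[ \text{monthly depreciation} = \frac{\text{asset purchase cost}}{\text{useful life in years} \times 12}, \]

so a hypothetical $6,000 intervention-specific computer system depreciated over 5 years would contribute $100 per month to ongoing non-staff expenses, whereas the same amount spent on capital improvements to space depreciated over 15 years would contribute approximately $33 per month.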
Results
Start-Up Expenses
Practice start-up effort expenses and the percentage of total start-up expenses reported as developmental are presented in Table 3 for each of the 10 ACT practice sites, along with the overall practice-site average and the averages of the 5 highest and lowest practice sites by total start-up expense. Average total start-up effort expenses were $44,076 per practice and ranged from a low of $914 to a high of $185,949. The 5 highest start-up expense practices had average total start-up expenses of $80,848, whereas the 5 lowest practices averaged $7,304.
The broad range of total start-up effort expenses seemed in part to reflect differences in direct non-staff expenses and/or the length of start-up. Among the 5 highest practices, non-staff direct expenses represented 45.3% of total start-up expenses, compared with 18.3% for the 5 lowest. Similarly, whereas the overall average start-up period was 7.9 months, the 5 highest start-up expense practices had average start-up periods of 10.8 months compared with 4.9 months for the 5 lowest. The 3 practices with large asset investments (No. 19, 18, and 7) were all among the 5 highest expense practices and represented 3 of the 4 longest start-up periods. Two of these (No. 18 and 7) were the largest intervention sites, with greater than 50 direct staff FTEs involved in the intervention, which may also have contributed to their extended start-up periods and increased start-up expenditures.
On average, developmental start-up expenses represented a large proportion of total start-up expenses, at 42.4%, 42.6%, and 39.7% for the overall, 5 highest, and 5 lowest groups, respectively. Although consistent across the practice averages reported in the table, this relationship varied greatly across individual practices. Five of the practices (No. 19, 16, 9, 7, and 13) reported developmental start-up expenses that were more than 50% of their total start-up expenses. For the remaining 5 practices, developmental expenses were no more than a third of the total, and typically much less. Practices with higher proportional developmental expenses (>50%) tended to have longer start-up periods, with 4 of these 5 practices exceeding the average start-up duration.
Practice start-up incremental expenses are presented in Table 4. Total start-up incremental expenses had an overall average of $20,788 per practice, averages of $39,956 and $1,621 for the 5 highest and lowest, and ranged from a low of $122 to a high of $129,251. The reduction from the level of effort expenditures reflects the significant portion of staff expenses and non-staff overhead that came from existing resources. The overall and 5 highest average incremental start-up expenses were slightly less than half of average start-up effort expenses (47.2% and 49.4%, respectively), whereas the average for the 5 lowest practices' incremental expenses was less than a quarter of effort expense (22.7%).
Five practices hired new staff related to the intervention (No. 18, 9, 7, 4, and 10) and thus had some incremental start-up expenses for direct staff. Two practices (No. 7 and 12) hired temporary support staff as part of developmental and general start-up activities and thus had some incremental start-up expenses for administrative staff. Non-staff direct expenses were unchanged, given that they are by definition incremental expenses. Thus, the 3 practices with large asset investments (No. 19, 18, and 7) had the highest incremental expenses. Non-staff direct expenses dominated the incremental start-up expenses generally, resulting in the same 5 highest and lowest expense practices for effort and incremental start-up expenses. None of the non-staff overhead expenses were identified as “new” resources.
Ongoing Expenses
Practice ongoing effort expenses are presented in Table 5. Monthly ongoing effort expenses had an overall average of $40.39 per patient, averages of $62.89 and $17.88 for the 5 highest and lowest, respectively, and a range from $14.89 to $123.34. The 10 practices averaged 299 patients per month involved in the interventions, with the 5 highest ongoing expense practices averaging 145 patients per month and the lowest 5 averaging 454 patients.
The 5 highest and lowest expense practices corresponded almost exactly with those identified in Table 2 as employing clinician discretion vs systematic screening, respectively, suggesting scale effects related to the type of screening employed. In general, practices with systematic screening tended to have a lower proportion of patients who moved beyond the screening phase, and thus less "density" of intervention activity per patient. Discussion with practice staff and other ACT researchers suggested that this was particularly true in this set of interventions, with "clinician discretion" interventions tending to have more substantial intervention activity beyond screening that further contributed to the "density" of intervention activity per patient. Practice characteristics from Table 2, beyond screening type, did not seem to have any consistent influence on ongoing expense levels.
Practice ongoing incremental expenses are presented in Table 6. Total ongoing incremental expenses had an overall average of $4.58 per patient, averages of $8.19 and $0.03 for the 5 highest and lowest, and ranged from a low of $0 to a high of $16.41. The 5 lowest practices all had ongoing incremental expenses of less than $2.00 per patient, with 3 of these at $0. Overall incremental monthly expenses were 11.3% of overall effort expenses, and represented 13.0% and 0.2% for the highest and lowest expense groups, respectively, reflecting the large portion of effort that was based on existing resources.
The highest and lowest 5 practices based on ongoing incremental expenses were not the same as those for ongoing effort expenses. Hiring of new staff for the intervention was the main determinant of ongoing incremental expenses; thus, there was no clear relationship between levels of ongoing effort and incremental expenses. The high and low incremental expense groups had a relationship to patient volume that was the reverse of that seen for ongoing effort expenses. This relationship, however, did not seem to be clearly aligned with screening type or other practice characteristics from Table 2.
Discussion
Practices with higher start-up expenses tended to have higher direct non-staff expenses (for example, interventions with large asset investments) and/or longer start-up periods. Incremental start-up expenses were less than half of start-up effort expenses on average, and less than 1 quarter for the lowest 5 practices, indicating that existing resource use plays a large part in start-up effort. Variation in incremental start-up expenses was dominated by non-staff direct expenses (ie, assets purchased specifically for the intervention). Notably, the same 5 practices fell in the upper and lower expenditure practice groups for both effort and incremental start-up expenses (and with nearly the same ranking within groups), suggesting that non-staff direct expenses are the primary driver of start-up expense differences in general.
Although the low-expense practice averages for both effort and incremental expenses seem modest in relation to overall practice expense levels, the high-expense practice averages could be significant impediments for many practices. Health care interventions with high asset-related start-up expenses, such as those involving health information technologies, may require external subsidies or support to assure broad implementation.14,15
Some opportunities seem to exist to reduce start-up expenses in cases where interventions are expanded internally or replicated externally. Given that developmental expenses represented 42.4% and 44.5% of start-up effort and incremental expenses overall, respectively, the "marginal" start-up expenses of expanding an intervention within the organization could be less than 60% of initial expenses, particularly in the case of higher-expense start-ups. The extent of this expansion "discount" would reflect how much of the developmental expenses would actually need to be replicated. External replication may offer less opportunity for such "expansion discounts," but the same principle is reflected in the role of standardized intervention models, learning collaboratives, and other knowledge-transfer mechanisms in reducing the "cost" and aiding the diffusion of innovation.16–18
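A rough worked example using the overall averages illustrates this point, under the simplifying assumption that none of the developmental work would need to be repeated: the marginal start-up effort expense of an internal expansion would be approximately

\[ (1 - 0.424) \times \$44{,}076 \approx \$25{,}400, \]

or about 58% of the initial average effort expense; to the extent that some developmental activities must be repeated, this proportion would rise toward 100%.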
Ongoing monthly practice effort expenses suggest that implementing integrated primary and behavioral health care had nontrivial effects on practice resources and workflow. Variation in expenditure levels across the ACT practices reflects differences in integration strategies, such as systematic screening vs clinician discretion, but likely also includes effects related to execution, organizational setting, and other practice- and setting-specific characteristics.
Ongoing monthly practice incremental expenses were found to be only a fraction of effort expenses. Overall average incremental expenses were only 11.3% of overall average effort expenses. For the 5 highest expense practices, incremental expenses were 13% of effort expenses, whereas for the 5 lowest expense practices they were less than 1% of effort expenses. Notably, 3 practices implemented their interventions without any measured incremental expenditures, essentially using only existing resources. This raises the question of how practices bear additional effort using existing resources. Practice staff may have been simply working harder or longer, or existing effort may have been supplanted by the new intervention activities. The former raises concerns about staff burnout, whereas the latter raises concerns about potential degradation of practice performance outside the intervention.
On the one hand, the level of per-patient ongoing incremental expenses can be seen in a relatively positive light. Across a variety of interventions within the ACT program designed to integrate behavioral health and primary care, the incremental per-patient expenses were generally low and in some cases virtually nonexistent. Achieving this type of care integration seems likely to impose modest expense increases on the greater health care system as a whole. On the other hand, the vast difference between effort and incremental expenses raises significant issues regarding incentives and support for practices undertaking innovative practice change. Typical health care reimbursement approaches, such as standard fee-for-service, base payment largely on relative expense levels. Thus, if ACT practices were to receive additional compensation for their innovations, it would likely be related to expected incremental expenses. This creates a potential "effort barrier," given that there is little external incentive or support for practices to innovate when most of the expense is borne by the practice.
This analysis may help explain why the diffusion of innovation in health care is often slow to nonexistent. Without extrinsic rewards that reasonably compensate health care providers for adopting innovative practices, the diffusion of innovation relies solely on providers' intrinsic desire to innovate, and those intrinsic benefits must be weighed against the effort costs of doing so. Innovative reimbursement or purchasing mechanisms that recognize value, and not simply expense, are likely necessary to provide the external push and support needed for widespread adoption of integrated behavioral health and primary care interventions.19–23
Limitations
There are a variety of limitations to this study. A primary limitation is that the study data are all based on practice self-report. Some of the data, such as staff salaries and benefits, come from formal reporting systems that may involve external verification processes (eg, audits of financial records). A considerable portion of the data, however, is based on practice estimates, such as the amount of time spent in specific intervention activities or the proportions of administrative effort and overhead associated with the interventions. Inaccuracies in these estimates could impart significant bias in the reported expense levels. Applying more formal or externally validated processes to assign expenses to the intervention would likely improve accuracy. However, the intent of the study was to apply an existing tool that combines self-report with face-validity assessment by the research team to provide generally credible estimates of practice intervention expenses. Iterations with practice staff, support from research and practice facilitation staff, and detailed information from the ACT evaluation team mitigated oversights and unreasonable estimates.
A second limitation concerns the generalizability of the practice data presented, all of which come from Colorado practices. Practice interventions represented within the ACT program share a common general intent to integrate behavioral health and primary care. Beyond that, these practices were intentionally chosen for the diversity of their approaches to integration and varied significantly in their specifics and organizational features. Thus, the range of the expense data presented was expected, and the data can be used only to draw broad conclusions.
The process for calculating incremental expenses in this study differed from the process used in the P4H study from which the study tools were drawn. In this study, incremental expenses were measured as expenses that could be explicitly identified as new resources acquired for the intervention. The original P4H study calculated incremental expenses as the difference between total expenses during the ongoing phase and a baseline, pre-intervention level. These 2 methods have different strengths and weaknesses. The P4H approach can capture net incremental expenses (ie, intervention effects that may raise or lower expenses) but does not assure that observed expenditure changes can be clearly attributed to the intervention. The process used in this study assures that incremental expenses are directly attributable to the intervention, but it may miss other indirect effects on expense levels.
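In simplified terms (the notation here is ours), the 2 approaches can be contrasted as

\[ \Delta_{\text{P4H}} = E_{\text{ongoing}} - E_{\text{baseline}} \qquad \text{vs} \qquad \Delta_{\text{this study}} = \sum_{r \in \text{new resources}} E_{r} + O_{\text{incremental}}, \]

where $E_{\text{ongoing}}$ and $E_{\text{baseline}}$ are total practice expenses in an ongoing and a pre-intervention month, $E_{r}$ is the expense of a resource explicitly acquired for the intervention, and $O_{\text{incremental}}$ is the portion of non-staff overhead assigned to new hires as described in the Data Analysis section. The first difference captures net changes from any source, whereas the second counts only expenses directly traceable to the intervention.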
Conclusion
Overall, the incremental start-up and ongoing expenses reported across the variety of ACT program behavioral health and primary care integration interventions suggest that widespread adoption would likely have a relatively modest effect on overall health systems expenditures. From a practice perspective, and particularly when measuring effort or “full” practice expenditures, start-up and ongoing expenses are not trivial and may well pose barriers to adoption. Payers and purchasers interested in attaining widespread adoption of integrated care must consider means to provide external support to practices that account for both incremental and effort expense levels. Development of knowledge-transfer mechanisms such as standardized intervention models and tools or learning collaboratives could help to reduce start-up effort expenditures. Reimbursement and purchasing mechanisms that reflect the full value of these interventions from a Triple Aim perspective, and not simply incremental expenses, are likely necessary to assure the widespread diffusion of these practice innovations.
Acknowledgments
The authors would like to acknowledge the efforts and patience of ACT practice staff in providing data for this study including Pam Wise Romero, Matt Engel, Shannon Tyson Poletti, Michele Steward, Glenn Kotz, Candice Talkington, Julie DeSaire, Carol Schlageck, Cheryl Young, Lori Bryan, Laura Engleman, Katrin Seifert, Caitlin Barba, and others.
Notes
This article was externally peer reviewed.
Funding: The Colorado Health Foundation, Advancing Care Together: Creating Systems of Care for the Whole Person: Program Evaluation (principal investigator: Larry A. Green, MD).
Conflict of interest: none declared.
- Received for publication February 8, 2015.
- Revision received April 15, 2015.
- Accepted for publication April 17, 2015.