Impact of Alternative Payment Methodology on Primary Care Visits and Scheduling =============================================================================== * John Heintzman * Erika Cottrell * Heather Angier * Jean O'Malley * Steffani Bailey * Lorie Jacob * Jennifer DeVoe * Maria Ukhanova * Erin Thayer * Miguel Marino ## Abstract *Background:* In 2013, Oregon initiated an Alternative Payment Methodology (APM) Experiment for select health centers, introducing capitated payments for patients with Medicaid. *Objective:* To use electronic health record data to evaluate the impact of APM on visit and scheduling metrics in the first wave of experiment clinics. *Research Design:* Retrospective clinic cohort. Difference-in-differences analysis using generalized linear mixed modeling across 2 time thresholds: the initiation of APM and the start of the Affordable Care Act Medicaid expansion. *Subjects:* Eight primary clinics enrolled in APM on March 1, 2013 and 10 comparison clinics not enrolled in APM during the study period (July 1, 2012 to February 28, 2015). *Measures:* Independent variable: intervention status of the clinics (APM or comparison). Dependent variables: total patient encounters, total alternative encounters, new patient visits, provider appointment availability, and number of appointment overbooks and no-shows/late cancellations. *Results:* Comparison clinics had smaller patient panels and more advanced practice providers than APM clinics, but both had similar proportions of Hispanic, Medicaid, and uninsured patients. APM clinics had a 20% greater increase in same-day openings than non-APM clinics across the APM implementation (Relative Rate, 1.20; 95% CI, 1.02 to 1.42). Otherwise, there were minimal differences between APM clinics and control clinics in wait times, visit rates, patient no-shows, and overbooks. 
*Conclusions:* APM clinics experienced a greater increase in same-day visits over the course of this experiment, but did not significantly differ from comparators in other visit metrics. Further research into other impacts of this experiment is necessary and ongoing. * Appointments and Schedules * Health Insurance * Health Policy * Health Services * Health Care Systems * Medicaid * Medically Uninsured * No-Show Patients * Oregon * Patient Protection and Affordable Care Act * Primary Health Care In the last decade, the United States has pursued several large-scale health reform initiatives1,2 with the goal of achieving the Triple Aim of improving patient experience, reducing costs, and addressing population health outcomes.3 Many of these initiatives have focused on reforming health care payment,2,4,5 moving away from fee-for-service models that reimburse providers based on the volume of care delivered to “global” or capitated payment models where providers receive a set amount of money to care for a population of patients2,6⇓⇓–9; such initiatives have produced significant increases in primary care quality outside the United States.10,11 Specifically, the Affordable Care Act (ACA) contained provisions for state and region-specific demonstration projects to investigate different payment models.2 However, robust, detailed, and scalable findings demonstrating the impact of such projects on a wide variety of outcomes have been lacking.5 Such evaluations are crucial to understand how to create, sustain, and refine payments to best support primary care and health system innovation overall. 
In 2013, the Oregon Health Authority implemented an Alternative Payment Methodology (APM) demonstration project whereby multiple community health centers self-selected to receive per-member-per-month capitated payments for empaneled patients with Medicaid coverage in lieu of standard fee-for-service Medicaid payments for office visits.12 This APM program was limited to Federally Qualified Health Centers (FQHCs) and Rural Health Clinics (RHCs), which have a different payment structure than other primary care clinics. The stated intent of Oregon's APM was to adjust payment for participating FQHC/RHCs to allow and encourage high-quality, efficient patient-centered health care, incentivizing value of services over volume of visits.13 APM payments excluded visits for mental health, dental, and obstetrics services. The initiative did not mandate specific workflow changes; clinics implemented any changes with general guidance from the Oregon Health Authority but on their own time frame. This inevitably resulted in variation in the specific changes made by APM clinics; studying this type of broad policy change necessitates including such variability to ensure external validity. While the theoretical basis for this type of payment reform shifting from volume-based care to value-based care has been well established,1,6,9,14 there remains uncertainty about its ability to impact certain aspects of the health care system. There is scant evidence in the health care literature regarding the impact of payment changes on day-to-day clinical operations in primary care, and in particular, whether alternative payment models incentivize nontraditional modes of provider-patient interactions that are both high quality and cost effective. 
The majority of existing research on the impact of payment reform on the capacity of primary care clinics to deliver care consistent with the Triple Aim used administrative data or surveys of providers and clinic staff, and was not able to directly measure clinic processes and workflows.8,9,15⇓–17 Therefore, the purpose of this study was to evaluate this ongoing natural experiment by using a novel data source, namely electronic health record (EHR) data, which contains important indicators of operational capacity, to evaluate the impact of the APM initiative on longitudinal primary care visit and schedule metrics. We hypothesized that APM primary care clinics, as compared with non-APM clinics, would experience 1) higher new patient visit rates but stable overall visit numbers (more visits with new patients plus reduced demand for visits with established patients equals stable overall visit numbers); 2) a higher rate of visits with advanced practice providers (eg, physician assistants, nurse practitioners), as clinics can use these less expensive providers to meet fluctuating same-day needs; 3) shorter wait times; 4) more same-day access; and 5) fewer no-shows/late cancellations compared with clinics that did not receive APM payments. No-shows were chosen as a metric under the hypothesis that, with greater flexibility in general, appointments would be more efficiently utilized in the clinic schedules. ## Methods ### Setting and Data This study was done in partnership with the OCHIN practice-based research network of over 420 FQHC/RHCs across the US, serving >2 million patients.18,19 OCHIN is a community-based organization that provides a hosted, linked Epic© EHR to this national network of FQHC/RHCs, including the clinics in this study. Data on clinic encounters and wait times for the study clinics were abstracted from OCHIN's EHR data warehouse. 
This study considered 10 primary clinics (from 3 separate health care organizations—OCHIN is an EHR vendor/partner, not a managing organization) that enrolled in the first phase of the APM implementation on March 1, 2013. Clinics may have been part of a health care organization that managed additional clinics that did not participate in the APM. All APM clinics were recognized primary care medical homes; all but 2 comparison clinics were not. Two APM clinics were excluded: 1 was a school-based clinic with a different appointment structure than other clinics, and 1 did not implement an EHR before the beginning of the study period, precluding us from obtaining complete outcome measures. Thus, our final APM clinic group included 8 primary clinics. Clinics that implemented APM after March 1, 2013 but before the end of the study period were excluded because of the difficulty of differentiating changes due to APM implementation from changes due to the ACA expansion in these clinics. To account for secular trends, we sought a comparison group of clinics that were Oregon FQHCs/RHCs within the OCHIN network, were not receiving APM payments during the study period (July 1, 2012 to February 28, 2015), and had implemented their EHR before July 1, 2012. This resulted in 12 potential comparison primary care clinics from 7 organizations. Two comparison clinics with fewer than 2000 patients in the pre-APM period (July 1, 2012 to February 28, 2013) were excluded, as they were less than half the size of the smallest APM clinic. This resulted in 10 non-APM comparison clinics. The full details of the APM initiative have been published elsewhere.12,20 ### Participants All analyses were conducted at the clinic level. Clinic encounter metrics were computed based on all patients receiving care at study clinics from July 1, 2012 to February 28, 2015. ### Independent Variable The primary independent variable was the APM status of the clinics (APM or non-APM). 
Because the ACA Medicaid expansion occurred on January 1, 2014 in Oregon (10 months after APM was implemented), we measured rates during 3 separate time periods to account for the influx of new Medicaid enrollees: 1) before the APM implementation (July 1, 2012 to February 28, 2013: “pre APM”), 2) after the APM implementation but before the ACA (March 1, 2013 to December 31, 2013: “post APM”), and 3) after the ACA (January 1, 2014 to February 28, 2015: “post ACA”). Patient panel variables for each clinic (% of Hispanic patients, % of patients with Medicaid coverage, and % of patients between 2 and 5 years old or women between 18 and 44 years old [age brackets with high primary care visit needs])21 were included in all models as covariates to control for potential confounding. ### Dependent Variables All dependent variables were measured each month; the clinic-month served as the unit of analysis. Dependent variables were classified into 2 types: 1) direct measures of visit volume (total patient encounters, total alternative encounters, new patient visits), and 2) metrics that reflect demand and availability of appointments (provider appointment availability, number of appointment overbooks [a double-booked appointment slot], and no-shows/late cancellations). We defined total alternative encounters as those that included hospice visits, home health visits, nurse visits, telemedicine, telephone and email consultations, lab and imaging-only visits, and visits with a pharmacist. Provider appointment availability was measured by time-to-third-next-available appointment (a common metric used in practice administration)22,23 and by the proportion of schedule searches that indicated same-day appointment availability. 
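The time-to-third-next-available appointment can be computed directly from schedule data. As a minimal hypothetical sketch (the function name and slot dates are illustrative, not the study's actual EHR extraction logic), assuming a provider's open slots are available as dates:

```python
from datetime import date

def third_next_available(open_slots, today):
    """Days from `today` to the third-next open appointment slot.

    The third-next (rather than the first) open slot is conventionally used
    because the earliest openings often reflect last-minute cancellations
    rather than true capacity. Returns None if fewer than 3 future slots exist.
    """
    future = sorted(d for d in open_slots if d >= today)
    if len(future) < 3:
        return None
    return (future[2] - today).days

# Example: open slots scattered over the coming weeks
slots = [date(2013, 3, 4), date(2013, 3, 5), date(2013, 3, 11), date(2013, 3, 18)]
print(third_next_available(slots, date(2013, 3, 1)))  # -> 10
```

In the study this metric is computed per provider and aggregated to the clinic-month, but the per-provider calculation reduces to this kind of sorted lookup.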
### Statistical Analysis We analyzed metrics for all patients (and not just patients with Medicaid) because 1) some outcomes, such as days to third-next-available appointment, are provider-level outcomes that are not based on patient insurance status; 2) clinics have a high proportion of patients with Medicaid; and 3) clinics are unlikely to change workflows for only 1 of their payors. We performed difference-in-differences analyses comparing changes in outcome rates across 3 time periods between APM and non-APM clinics. Outcomes were modeled using generalized linear mixed modeling (GLMM) with robust sandwich estimators, accounting for temporal correlation of observations within clinics over time using random effects.24 Encounter rates and appointment metrics were modeled using a Poisson GLMM with a log link. Encounter rates used an offset of log(clinic patients) to account for the varying number of patients between clinics. Appointment metrics used an offset of log(clinic appointment slots) to account for varying appointment slots between clinics. GLMM fixed effects included time (study month) as a categorical variable, APM status, the interaction between time and APM status, and the patient panel covariates listed above. Time period comparisons (Pre-APM, Pre-ACA; Post-APM, Pre-ACA; Post-APM Post-ACA) were generated from contrasts of the months within each time period that summarized the average monthly rate for each time period. To account for variation in provider volume/time in clinic, the measures of same-day access were weighted by provider visit volume. To assess the validity of the difference-in-differences assumption of preintervention parallel trends (ie, comparability of APM and comparison clinics' outcome trends before APM implementation), for each outcome we contrasted the average change in monthly outcome rates over the preperiod between groups. 
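As a rough illustration of the estimand (a simplified sketch on simulated data, not the study's SAS PROC GLIMMIX models; it uses a single pre/post split and no covariates or random effects), the difference-in-differences relative rate in a 2×2 design reduces to a ratio of rate ratios, which is what the interaction term of a saturated Poisson model with a log link and a log-exposure offset estimates:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Simulated clinic-month panel: 8 "APM" and 10 comparison clinics over 32 months,
# with a built-in 8% (log-scale 0.08) post-APM bump in the APM clinics' visit rate.
rows = []
for clinic in range(18):
    apm = int(clinic < 8)
    panel = int(rng.integers(2_000, 12_000))   # empaneled patients (exposure)
    for month in range(32):
        post = int(month >= 8)                 # APM begins at study month 8
        rate = 0.30 * np.exp(0.05 * post + 0.10 * apm + 0.08 * apm * post)
        rows.append((clinic, apm, post, panel, rng.poisson(rate * panel)))
df = pd.DataFrame(rows, columns=["clinic", "apm", "post", "panel", "visits"])

# Cell-level rates: total visits divided by total patient-months of exposure
# (the division plays the role of the log(clinic patients) offset).
cells = df.groupby(["apm", "post"]).agg(visits=("visits", "sum"),
                                        exposure=("panel", "sum"))
rate = cells["visits"] / cells["exposure"]

# DiD relative rate: the APM clinics' post/pre rate ratio divided by the
# comparison clinics' post/pre rate ratio (~ exp(0.08) by construction).
did_rr = (rate[(1, 1)] / rate[(1, 0)]) / (rate[(0, 1)] / rate[(0, 0)])
print(f"DiD relative rate: {did_rr:.3f}")
```

The paper's models additionally condition on study month, panel covariates, and clinic random effects, and compute the reported contrasts across the 3 time periods rather than a single pre/post split.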
As these were clinic-level analyses, for each clinic we combined data across all physicians and advanced practice providers (physician assistants and nurse practitioners) working in that clinic. It is possible that trends would differ between physicians and advanced practice (AP) providers; thus, we conducted sensitivity analyses in which we reran our models stratified by provider type (MD/DO vs AP providers). All modeling was performed using the PROC GLIMMIX procedure in SAS Enterprise Guide 7.1. All statistical tests were 2-sided and significance was defined as *P* < .05. This study was approved by the Oregon Health & Science University Institutional Review Board. ## Results Characteristics of all study clinics (8 APM clinics, 10 non-APM clinics) are described in Table 1. On the whole, the 10 non-APM comparison clinics tended to be smaller and have more advanced practice providers. However, within study arms, there was significant variation in most categories. Both APM and non-APM comparison clinics had similar proportions of Hispanic patients, Spanish-speaking patients, patients with Medicaid coverage, and uninsured patients. Most clinics were located in urban areas (90%). View this table: [Table 1.](http://www.jabfm.org/content/32/4/539/T1) Table 1. Comparison of Clinic Characteristics Between APM Clinics and Non-APM Clinics, Prior to APM Implementation (July 1, 2012 to February 28, 2013) As noted in Table 2, there were no significant differences in overall numbers of conventional office visits or alternative encounters between APM clinics and non-APM comparison clinics across the study period. 
Analyses of these rates by provider type indicated that APM clinics had lower rates of new office visits for advanced practice providers (relative rates [RRs], 0.70; 95% confidence interval [CI], 0.50 to 0.96), a lower relative difference of provider office visits with advanced practice providers (RR, −0.089; 95% CI, −0.17 to −0.01), and overall lower advanced practice provider visit rates (RR, 0.86; 95% CI, 0.80 to 0.93) compared with MD/DO providers (data not in Table). There were no significant differences between MD/DO and advanced practice providers in other outcomes in this cluster (data not shown). View this table: [Table 2.](http://www.jabfm.org/content/32/4/539/T2) Table 2. Total Visit Rates, New Patient Visit Rates, and Total Alternative Encounters Over Study Time Periods (Pre-APM, Pre-ACA; Post-APM, Pre-ACA; Post-APM Post-ACA) The next category of outcomes in our analysis focused on clinic scheduling metrics: wait times, same-day access, no-shows, and overbooks (Table 3). There were a few important differences to note in these analyses. First, patient access—as measured by third-next available appointment—was slightly better in the APM clinics than in the non-APM clinics at baseline, and this difference increased post-APM (RR, −2 days; 95% CI, −3.39 to −0.61). However, this metric was not significant in the pre-APM versus post-APM difference-in-differences comparison (RR, −0.99; 95% CI, −1.99 to 0). Second, APM clinics had a 20% greater increase in same-day openings than non-APM comparisons across the study period (RR, 1.20; 95% CI, 1.02 to 1.42) (Figure 1). No differences were noted in rates between MD/DO providers and midlevel providers in this outcome cluster (data not shown). View this table: [Table 3.](http://www.jabfm.org/content/32/4/539/T3) Table 3. 
Scheduling Metrics: Appointment Availability, Overbooks, and No-Shows/Late Cancellations Over Study Time Periods (Pre-APM, Pre-ACA; Post-APM, Pre-ACA; Post-APM Post-ACA) ![Figure 1.](https://www.jabfm.org/content/jabfp/32/4/539/F1.medium.gif) [Figure 1.](http://www.jabfm.org/content/32/4/539/F1) Figure 1. Same-day availability changes and relative rates in APM and non-APM comparison clinics before and after the APM and ACA implementations. APM, Alternative Payment Methodology; ACA, Affordable Care Act; RR, relative rate. ## Discussion We assessed the impact of the first phase of Oregon's 2013 APM implementation on several clinic metrics and found 2 significant differences between intervention and comparison clinics: APM clinics had a slightly increased rate of same-day appointment availability and fewer visits with advanced practice providers. There are a few possible explanations for the increase in same-day appointments. It may be that the needs of patients with chronic conditions were addressed via telephone or electronic encounters, through nurse or behavioral visits, or other means facilitated by the APM, thereby freeing up some same-day visits for more acute needs. It is also possible that clinics used time scheduled for other tasks (eg, phone visits) to accommodate same-day needs when they arose. The reason APM clinics had fewer advanced practice provider visits after the ACA compared with non-APM clinics is uncertain; possible explanations are that APM clinics had fewer advanced practice providers per clinic at baseline or that different appointment lengths for advanced practice providers influenced the overall visit rate by provider. Although our hypothesis was not borne out for many of our outcomes, we found that APM and non-APM comparison clinics did not differ in visit rates, wait times, patient “no-shows” for appointments, or provider overbooks. 
These findings confirm an important lack of negative impact of APM on the clinics and their ability to maintain access to care for patients under capitated payment. As calls to end fee-for-service payments in primary care are widely prevalent,25,26 and as Bazemore and colleagues5 point out that there are numerous gaps in our knowledge about the effects and effectiveness of APMs on a variety of outcomes, our study encourages continued use and evaluation of these payment models by giving a “street level” view of a key part of practice function and provider satisfaction: the daily schedule. While our view of the daily schedule in these community health centers is limited, it does confirm a lack of negative impacts, suggests some modest benefits, and breaks ground for continued evaluation. The state highlights that the APM approach allows “community health centers the opportunity to tailor their care and services to the unique issues and circumstances of their patients, as well as to adjust where, how and what kind of care and services they provide. With this flexibility, health centers can focus on partnering with patients to create a plan for supporting better health.”27 Before joining the program, each participating FQHC/RHC signed an agreement stipulating an understanding from both parties (ie, the FQHC/RHC and the state) that “the program is intended to incent a significant transition in patient centered care, and that it will likely result in a reduction in traditional, billable patient visits. 
At the same time, we expect that nonbillable touches with the patient will increase.”28 Qualitative data collection and observations during the early stages of APM implementation suggested that clinics were beginning to make changes to clinic schedules and work flows, such as increasing patient visit length, adjusting overall clinic visit schedules to integrate protected time for care team huddles and alternative forms of patient outreach, and more frequently using nonphysician members of the team to meet patient needs.20 So, while an alteration of the daily schedule, measurable by these variables, could be an indirect outcome of the APM implementation, it was not necessarily the goal of the APM clinics. In addition, given our focus on the early stages of APM implementation, we may not have captured the indirect effects of long-term intended changes in care delivery, as patient, provider, and clinic routines may take time to change. Lastly, while these Medicaid-centric payment changes were intended to affect a clinic's whole workflow, patients with Medicaid were still sometimes a minority of a clinic's panel, and this payor mix may have hampered more global practice change. Another explicit goal articulated by the state was to ensure that this payment change resulted in relatively stable utilization while allowing clinics to improve the quality of the care delivered.13 Accountability metrics included improving or maintaining clinically reportable measures of quality, maintaining or reducing per capita costs, and documenting either a billable visit or other nontraditional engagement touch with a member of the care team (via telephone, portal, or face-to-face) for 70% to 75% of the patient population over a 12-month period.28 Some of the metrics are the subject of additional project analyses. 
Again, our finding that there were few differences between the APM and non-APM comparison clinics confirms an important lack of negative impacts on the clinics and their ability to maintain access to care for patients with capitated payment. Patient and provider satisfaction, cost, the delivery of preventive services, disparities in care delivery, shifts in the proportion of visit types, and types of care not captured through the EHR are all possible items that could change as a result of a switch to capitated payments. Future studies should explore these and other impacts of APM implementation. ### Limitations Our analysis had several limitations. First, this was a state-based initiative for which clinics volunteered, and therefore was not randomized. Our findings are thus subject to selection bias, which the difference-in-differences framework cannot account for. However, it is still important to evaluate these real-world experiments, despite this unavoidable limitation. We were not able to control for all possible workflow differences in clinics at baseline (for instance, different same-day access strategies), and these differences may have contributed to many of our null findings. Our analyses were limited to a specific set of available operational and access metrics and did not include others that may have been significantly impacted by APM (eg, clinical quality metrics). We also had to exclude 2 small comparison clinics because their low denominators could have led to unstable rates of change over time, which can unduly influence overall estimation and lead to unstable models; these exclusions may have altered findings. In addition, our post-APM time period was complicated by the start of the ACA Medicaid expansion. 
This may have affected clinic function and patient panels in numerous and significant ways: FQHCs/RHCs received large increases in Medicaid patient volume,29 which may have affected post-APM trends in the metrics measured in our study clinics. We did not analyze changes in specific disease subgroups; future studies can look at patient-level factors and outcomes more closely. While our study design allowed us to observe trends after ACA implementation, the change may have affected our understanding of the impact of the APM initiative. Despite these limitations, this innovative study allowed us to objectively measure metrics in a large-scale natural experiment and demonstrate new methodologic approaches to help understand health care payment reform in the United States. ## Conclusions In 2013, Oregon instituted an APM demonstration project whereby some community health centers in Oregon were selected to receive per-member-per-month capitated payments in lieu of standard fee-for-service Medicaid payments. APM clinics experienced an improvement in access to same-day visits over the course of this experiment, but did not significantly differ from comparison clinics in wait times, visit rates, patient no-shows, and overbooks. This study demonstrates that there was not a negative impact of APM on visit and schedule metrics for study primary care clinics. Further research into other possible service delivery impacts of this experiment is necessary and ongoing. ## Acknowledgments We would like to acknowledge the patients, staff, and clinicians of the OCHIN Practice-Based Research Network, without whom this would not be possible, and Roopradha Datta, who assisted with final manuscript preparation. We would also like to acknowledge the Oregon Primary Care Association, which contributed expertise to the alternative payment methodology project. ## Appendix View this table: [Appendix Table 1.](http://www.jabfm.org/content/32/4/539/T4) Appendix Table 1. 
Total Visit Rates, New-Patient Visit Rates, and Total Alternative Encounters Over Study Time Periods (Pre-APM, Pre-ACA; Post-APM, Pre-ACA; Post-APM Post-ACA) View this table: [Appendix Table 2.](http://www.jabfm.org/content/32/4/539/T5) Appendix Table 2. Scheduling Metrics: Appointment Availability, Overbooks, and No-Shows/Late Cancellations Over Study Time Periods (Pre-APM, Pre-ACA; Post-APM, Pre-ACA; Post-APM Post-ACA) ## Notes * This article was externally peer reviewed. * *Funding:* Agency for Healthcare Research and Quality Grant R01HS022651. * *Conflict of interest:* none declared. * To see this article online, please go to: [http://jabfm.org/content/32/4/539.full](http://jabfm.org/content/32/4/539.full). * Received for publication December 10, 2018. * Revision received February 27, 2019. * Accepted for publication March 6, 2019. ## References

1. Saultz JW, Jones SM, McDaniel SH, et al. A new foundation for the delivery and financing of American health care. Fam Med 2015;47:612–619.
2. Kaiser Family Foundation. Medicaid Moving Forward. 2014. Available from: [http://kff.org/medicaid/fact-sheet/the-medicaid-program-at-a-glance-update/](http://kff.org/medicaid/fact-sheet/the-medicaid-program-at-a-glance-update/).
3. Berwick DM, Nolan TW, Whittington J. The Triple Aim: Care, health, and cost. Health Aff (Millwood) 2008;27:759–769.
4. Clough JD, Richman BD, Glickman SW. Outlook for alternative payment models in fee-for-service Medicare. JAMA 2015;314:341–342.
5. Bazemore A, Phillips RL Jr, Glazier R, et al. Advancing primary care through alternative payment models: Lessons from the United States & Canada. J Am Board Fam Med 2018;31:322–327.
6. Bailit M, Phillips K, Long A. Paying for the Medical Home: Payment Models to Support Patient-Centered Medical Home Transformation in the Safety Net. Seattle, WA: Bailit Health Purchasing and Qualis Health; 2010.
7. Barr MS. The patient-centered medical home: Aligning payment to accelerate construction. Med Care Res Rev 2010;67:492–499.
8. Effects of health care payment models on physician practice in the US. Santa Monica, CA: RAND Corporation; 2015.
9. Kruse J. Health care economics, the PCMH, and education for the future: The importance of prospective care coordination payments. Fam Med 2012;44:739–742.
10. Kiran T, Kopp A, Moineddin R, et al. Longitudinal evaluation of physician payment reform and team-based care for chronic disease management and prevention. CMAJ 2015;187:E494–E502.
11. Kantarevic J, Kralj B, Weinkauf D. Enhanced fee-for-service model and physician productivity: Evidence from Family Health Groups in Ontario. J Health Econ 2011;30:99–111.
12. Angier H, O'Malley JP, Marino M, et al. Evaluating community health centers' adoption of a new global capitation payment (eCHANGE) study protocol. Contemp Clin Trials 2017;52:35–38.
13. National Association of Community Health Centers. Spotlight on Health Center Payment Reform: Oregon Alternative Payment and Advanced Care Model. Available from: [http://www.nachc.org/wp-content/uploads/2016/12/Oregon-FQHC-APM-December-2017.pdf](http://www.nachc.org/wp-content/uploads/2016/12/Oregon-FQHC-APM-December-2017.pdf).
14. Puffer JC, Borkan J, DeVoe JE, et al. Envisioning a new health care system for America. Fam Med 2015;47:598–603.
15. Bishop TF, Press MJ, Mendelsohn JL, Casalino LP. Electronic communication improves access, but barriers to its widespread adoption remain. Health Aff 2013;32:1361–1367.
16. Magill MK, Ehrenberger D, Scammon DL, et al. The cost of sustaining a patient-centered medical home: Experience from 2 states. Ann Fam Med 2015;13:429–435.
17. Nocon RS, Sharma R, Birnberg JM, et al. Association between patient-centered medical home rating and operating cost at federally funded health centers. JAMA 2012;308:60–66.
18. DeVoe JE, Sears A. The OCHIN community information network: Bringing together community health centers, information technology, and data to support a patient-centered medical village. J Am Board Fam Med 2013;26:271–278.
19. DeVoe JE, Gold R, Spofford M, et al. Developing a network of community health centers with a common electronic health record: Description of the Safety Net West Practice-based Research Network (SNW-PBRN). J Am Board Fam Med 2011;24:597–604.
20. Cottrell EK, Hall JD, Kautz G, et al. Reporting from the front lines: Implementing Oregon's alternative payment methodology in federally qualified health centers. J Ambul Care Manage 2017;40:339–346.
21. Butler DC, Petterson S, Phillips RL, et al. Measures of social deprivation that predict health care access and need within a rational area of primary care service delivery. Health Serv Res 2013;48(2 Pt 1):539–559.
22. Murray M, Bodenheimer T, Rittenhouse D, Grumbach K. Improving timely access to primary care: Case studies of the advanced access model. JAMA 2003;289:1042–1046.
23. Murray M, Berwick DM. Advanced access: Reducing waiting and delays in primary care. JAMA 2003;289:1035–1040.
24. McCulloch C, Searle SR. Generalized, Linear, and Mixed Models. Hoboken, NJ: John Wiley and Sons; 2004.
25. Magill MK. Time to do the right thing: End fee-for-service for primary care. Ann Fam Med 2016;14:400–401.
26. Berenson RA, Rich EC. US approaches to physician payment: The deconstruction of primary care. J Gen Intern Med 2010;25:613–618.
27. Oregon Primary Care Association. Alternative payment & advanced care model: Overview. 2014. Available from: [https://www.orpca.org/initiatives/alternative-care-model](https://www.orpca.org/initiatives/alternative-care-model).
28. Oregon Health Authority. Participation Agreement for Oregon's Alternative Payment and Care Methodology (APCM). 2016.
29. Angier H, Hoopes M, Gold R, et al. An early look at rates of uninsured safety net clinic visits after the Affordable Care Act. Ann Fam Med 2015;13:10–16.
[Abstract/FREE Full Text](http://www.jabfm.org/lookup/ijlink/YTozOntzOjQ6InBhdGgiO3M6MTQ6Ii9sb29rdXAvaWpsaW5rIjtzOjU6InF1ZXJ5IjthOjQ6e3M6ODoibGlua1R5cGUiO3M6NDoiQUJTVCI7czoxMToiam91cm5hbENvZGUiO3M6ODoiYW5uYWxzZm0iO3M6NToicmVzaWQiO3M6NzoiMTMvMS8xMCI7czo0OiJhdG9tIjtzOjIwOiIvamFiZnAvMzIvNC81MzkuYXRvbSI7fXM6ODoiZnJhZ21lbnQiO3M6MDoiIjt9)