Abstract
Introduction: Identifying patients with a history of cancer in primary care remains a barrier to implementing optimal survivorship care.
Methods: As part of an intervention supporting primary care practices in delivering breast cancer survivorship care, our team developed a process that uses electronic health record (EHR) reporting capabilities to systematically identify patients with breast cancer history. This intervention was fielded in 13 primary care practices in an integrated health system. End-user feedback throughout the intervention informed refinements of the procedure and provided insights into the appropriateness and feasibility of implementing this tool.
Results: Practice members were able to successfully generate reports that included lists of patients with a history of breast cancer at both the practice and clinician level. Feedback from those implementing the intervention led to refinements that included expanding the set of users who could generate and access the reports to include nonphysicians and adding search criteria.
Discussion: Implementation of system-wide EHR reports and instruction of clinic champions on procedures to generate analytics of breast cancer survivors can facilitate informatics skill-building and enable primary care teams to engage in breast cancer survivorship care.
Conclusion: This method of creating practice- and clinician-specific cancer survivor patient registries can be applied in other contexts to support the delivery of evidence-based survivorship care.
- Breast Cancer
- Cancer Survivors
- Electronic Health Records
- Human Centered Design
- Implementation Science
- Medical Informatics
- Organizational Change
- Primary Health Care
- Quality Improvement
- Registry Data
Introduction
Improvements in detection and treatment of cancers have led to an increase in patients with a history of cancer (ie, cancer survivors) being managed in primary care, with the number of cancer survivors in the United States projected to reach 21.6 million by 2030.1 Primary care routinely cares for the majority of cancer survivors, with nearly all survivors (>90%) reporting at least an annual visit.2–4 Despite primary care’s centrality in the care of survivors, there has been a dearth of actionable information and strategies to inform care delivery improvements in primary care settings.5 Research focused on advancing practice-based strategies for primary care has reported systematic patient identification as a major barrier to implementing population health strategies for cancer survivors in primary care.6–8 This is a significant challenge, as the structure of many electronic health records (EHRs) and inconsistent documentation practices make it difficult to identify cancer survivors in primary care.5 Recently, the National Cancer Institute (NCI) Office of Cancer Survivorship (OCS) convened a day-long virtual event, “Enhancing Capacity for Primary Care Research in Cancer Survivorship: A Workshop for Action,” which identified the absence of procedures for systematically identifying cancer survivors as a distinct clinical category as a major hurdle to translating evidence-based cancer survivorship care into primary care settings.9 Further, studies comparing EHR data to cancer registries, considered the gold standard, have found significant discrepancies.
One large 2021 study found that nearly half of cancer patients in state registries lacked any corresponding cancer documentation in their primary care EHR.10 In a study of high-functioning primary care teams, the use of population health strategies like patient registries for common chronic conditions was frequently reported; however, these same strategies were not applied to cancer survivor populations.7 Metrics to evaluate key population health strategies for cancer survivors are lacking, a gap attributed to difficulties in the collection and synthesis of data from multiple sources (eg, EHRs, billing or claims data, cancer registries, and patient-reported outcomes).11 In a recent study focused on improving care for rural cancer survivors, both the inability to systematically identify cancer survivors in the primary care clinics’ EHRs and the inability to synthesize information from multiple sources were key implementation barriers.6 Given the complexity of the cancer survivor population and its anticipated growth, developing methods for systematic identification is a necessary step before care can be improved or outcomes measured.
For evidence translation at multiple levels (eg, patient, provider, clinic) and for care delivery models to be responsive to cancer survivors and operational in primary care contexts, a critical first step is developing capacity to systematically identify patients with a history of cancer in these contexts.9,12 Both the inability to query billing codes for cancer-related care provided outside primary care and variability in EHR documentation contribute to a significant underestimate of patients in primary care with cancer history and can limit the delivery of adequate care.10,13 Our team developed and implemented an intervention, AFTER-BC (Actionable Follow-up to Enhance suRvivorship in Breast Cancer), that specifies practice-based procedures to use Epic Reporting Workbench to generate a patient registry of breast cancer survivors. Here, we describe the process and procedures used to develop and refine this actionable practice-based strategy.
Methods
Beginning in June 2024, AFTER-BC, an NCI-funded hybrid type I effectiveness-implementation randomized controlled trial with waitlist control, was initially implemented in 13 primary care practices across a large integrated health system that includes a comprehensive, NCI-designated cancer center where patients may have received their initial treatment (see Table 1 for initial intervention practice characteristics).14 The intervention included identification and preparation of practice champions, learning collaboratives, and virtual facilitation. We presented the rationale for systematic identification of cancer survivors in the first learning collaborative. Using principles of codesign with feedback from practice champions, our research team and Epic Information Technology (IT) team collaboratively developed and refined a system-wide report that queries the EHR to develop a practice-based patient registry for all living adult female patients with a history of breast cancer.
Table 1. Practice and Champion Characteristics
Initially, the research team provided breast cancer criteria to the Epic IT team, which included all diagnoses and/or medical history based on International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) codes related to breast cancer history (ICD-10-CM codes D05.0-D05.9 for carcinoma in situ of breast and C50.011-C50.912 for malignant neoplasm of breast in females). These ICD-10-CM codes were part of diagnosis groupers for breast cancer defined by our health system. The development process is outlined in Table 2. This query produced a list of breast cancer survivors seen in each practice within the previous 3 years that could be filtered by clinician. We demonstrated how to produce this list and provided enduring materials to clinical champions at the first learning collaborative.
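The report itself is configured in Epic Reporting Workbench, but the selection logic described above can be sketched in plain Python over a hypothetical extract of patient records. The field names and the `build_registry` helper below are illustrative assumptions for this sketch, not Epic’s schema or API:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from collections import defaultdict


def matches_breast_cancer_history(icd10: str) -> bool:
    """True if an ICD-10-CM code falls in the ranges used by the report:
    D05.0-D05.9 (carcinoma in situ of breast) or
    C50.011-C50.912 (malignant neoplasm of breast)."""
    code = icd10.upper().strip()
    if code.startswith("D05"):
        return True
    # Fixed-width "C50.xxx" codes sort lexicographically within the range.
    return "C50.011" <= code <= "C50.912"


@dataclass
class PatientRecord:  # illustrative fields, not Epic's data model
    mrn: str
    sex: str
    birth_date: date
    deceased: bool
    pcp: str                    # assigned primary care clinician
    last_visit: date
    diagnosis_codes: list       # problem list + encounter/billing diagnoses


def build_registry(records, today=None):
    """Group living adult female patients with a qualifying diagnosis code,
    seen within the previous 3 years, by clinician."""
    today = today or date.today()
    registry = defaultdict(list)
    for r in records:
        age_years = (today - r.birth_date).days // 365
        if (r.sex == "F"
                and not r.deceased
                and age_years >= 18
                and today - r.last_visit <= timedelta(days=3 * 365)
                and any(matches_breast_cancer_history(c)
                        for c in r.diagnosis_codes)):
            registry[r.pcp].append(r.mrn)
    return dict(registry)
```

A production report would rely on the health system’s diagnosis groupers rather than hand-coded range checks, as described above; the sketch simply makes the inclusion criteria (sex, vital status, adult age, 3-year visit window, qualifying codes) explicit.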
Table 2. Process for Developing Epic Report to Generate Patient List
Next, facilitators asked each clinic’s team to run the report. On completion, facilitators asked champions to review the patients identified and reflect on the accuracy of these lists. Appropriateness (eg, provider perception of usefulness and perceived fit in the organization) and feasibility (eg, actual fit) were assessed through detailed qualitative field notes that facilitators recorded during biweekly virtual facilitation meetings (approximately 4 per practice) with champions from individual practices.15 Facilitators and the process evaluation team met weekly to discuss facilitation experiences and practice interactions, including champion feedback on and use of the report. In these debriefs, the facilitators provided updates on each clinic's progress and the group discussed common themes and challenges across the practices. The debrief meetings were recorded, transcribed, and summarized for analysis. Analysis of field notes and debrief summaries used an iterative immersion-crystallization approach.16 Appropriateness was gauged through champions’ discussion of the report results during facilitation meetings and their descriptions of the workflow changes required for the report to work in their practice. Feasibility was assessed based on a practice’s successful creation of a list of breast cancer patients that acted as a functional registry.
Results
Practice members were asked to run a report to generate a patient registry during facilitation meetings using the procedure demonstrated in the learning collaborative. This included using the Epic “My Reports” function and selecting the report created by the research and Epic teams. Practices described that the initial report identified widely varying numbers of patients depending on individual providers’ tenure in the practice, with a new provider identifying only 8 patients and an established provider (working with nearby oncologists for 25 years) identifying 233 patients in their panel. Champions believed that differences in physician documentation practices may also contribute to between-provider variation in the number of patients identified. Discussions with champions revealed inconsistencies in documentation that made manual identification difficult. For instance, 2 clinicians described relying on documented surgical history of mastectomy to indicate breast cancer history, while others depended solely on the problem list or did not document the history in either of these structured fields. Clinician champions suggested using proxy indicators, such as a prescription for tamoxifen, to address these discrepancies. The challenge of multiple providers interacting with a single patient record was noted as worsening this problem. A few champions acknowledged that they or other clinicians in their practice did not regularly use the problem list and recognized that doing so could improve patient tracking. The patient list was therefore a tool to overcome documentation variation and the difficulty of locating a patient’s cancer history buried in the record. Champions suggested that knowledge of a patient’s breast cancer history could be relevant to future care, such as needed screenings or identifying potential late and long-term effects.
The procedures for report generation were amended (when possible) to meet practice needs and encourage adoption. For example, the original report query was designed for physicians to be able to generate a list of breast cancer patients in their panel. We learned in virtual facilitation meetings that nonphysicians (eg, medical assistants who were precharting) were interested in learning to produce this report. We found that the selection of “Base Patients” affected who could successfully run the report. Initially, the Base Patient selection included only patients with assigned primary care providers at the user’s location and restricted report access to physicians. At the suggestion of one of the office managers who had experience running reports in Epic, we worked with the Epic IT team to expand the potential users of the report by changing the Base Patient selection to a broader category of patients seen in each department. This allowed nonphysician champions to successfully run the report in subsequent implementation. This change also improved the ‘fit’ of report generation as many other office managers regularly ran reports as part of quality improvement initiatives. Overall, the workflow changes were described as minimal.
Physician champions reported no issues generating the report in all but one practice (12 of 13; 92.3% feasibility). The practice that was unable to run the report faced an unexpected barrier: the physician champion was on medical leave for the duration of the intervention. Physicians and staff found the list of patients with breast cancer history to be valuable and were interested in seeing how many such patients were in their panels. In some instances, physicians saw patients on the list who they were not aware had breast cancer history.
Discussion
Generating a report using a query of existing EHR capabilities offers an initial step toward supporting primary care practices in panel management of their patients with a history of breast cancer. Further, there is potential for screening questionnaires such as those we are currently fielding via MyChart to enhance surveillance and intervention for symptoms that emerge over time. We also plan to compare rates of patient self-identification with generated practice registries to evaluate accuracy of the EHR reporting tool for identifying survivors.17
It is critical to identify patients with cancer history, who their providers are, and where they access the health care system in order to effectively engage in population health management. Tools like the Epic Reporting Workbench can build capacity for population health in primary care through systematic identification of specific groups of patients like cancer survivors. As Epic is the most used EHR nationally,18,19 this method can be replicated by other health systems. While a recent national survey of family physicians found that only one in four family physicians were very satisfied with their EHR,20 Epic had a significantly higher satisfaction rating than any other EHR platform.21 Describing and disseminating clear methods for using Epic as a population health management tool, as presented here, can help clinicians more easily manage panels of survivors by leveraging structured data in Epic.
Implementation of system-wide EHR reports, paired with instructing clinic champions on procedures to generate analytics reports of breast cancer survivors and to use clinical decision support, can support informatics skill-building and enable primary care teams to engage in breast cancer survivorship care. Giving primary care practices tools to identify survivors in their panels shifts cancer survivorship into the more familiar practice of chronic disease population health management. Practices can use these lists to prospectively identify patients who need annual physicals or cancer screenings. With knowledge of a patient’s survivor status, clinicians can tailor risk messaging, such as smoking cessation counseling, in ways that may be more motivating to the patient.
This brief report has several limitations. First, we did not assess the percentage of identified survivors who had breast cancer on their problem list or medical history compared with only in billing or encounter history. As the report will likely be most useful for patients whose breast cancer history is documented only in billing or encounter history (and therefore not immediately visible), future research should quantify this gap. Second, this analysis did not include formal interviews with clinicians on the report’s impact on their practice. Finally, we did not determine whether there were patients with breast cancer history who had no structured data indicating it; identifying them would require asking patients directly about a history of cancer and adding it to the problem list.
Conclusion
While requiring collaboration with IT staff for some programming, this method of creating an EHR-based report to identify patients can be expanded to other cancer sites. EHR-based tools for care delivery can support primary care clinicians in identifying patients who should receive evidence-based survivorship care. This pragmatic, clinical method to identify patients can be used by others with access to Epic EHR to create similar reports for use by primary care clinicians.
Notes
This article was externally peer reviewed.
Funding: This research was peer-reviewed and funded by the National Cancer Institute (R01CA257197).
Conflicts of interest: The authors report no conflict of interest.
- Received for publication March 8, 2025.
- Revision received June 30, 2025.
- Accepted for publication July 21, 2025.