Abstract
Purpose:
The purpose of this study was to develop and evaluate an electronic health record (EHR) clinical decision support system to identify patients meeting criteria for low vision rehabilitation (LVR) referral.
Methods:
In this quality improvement project, we applied a user-centered design approach to develop an interactive electronic alert for LVR referral within the Johns Hopkins Wilmer Eye Institute. We invited 15 ophthalmology physicians from 8 subspecialties to participate in the design and implementation and to provide user experience feedback. The project's three phases comprised development and evaluation, feedback analysis, and system refinement. We report on the final alert design, firing accuracy, and user experiences.
Results:
The alert was designed to be physician-centered and patient-specific. Alert firing relied on visual acuity and International Classification of Diseases (ICD)-10 diagnosis (hemianopia/quadrantanopia) criteria. Alert suppression criteria included age < 5 years, recent surgeries, a prior LVR visit, and related alert actions. The false positive rate (firing when the alert should have been suppressed or when firing criteria were not met) was 0.2%. The overall false negative rate (alert not firing when visual acuity or encounter diagnosis criteria were met) was 5.6%. Of the 13 physicians who completed the survey, 8 agreed that the alert was easy to use, and 12 would consider ongoing use.
Conclusions:
This EHR-based clinical decision support system demonstrates reliable firing accuracy in identifying patients with vision impairment and promising acceptance by ophthalmologist users, facilitating care and LVR referral.
Translational Relevance:
The use of real-time data offers an opportunity to translate ophthalmic guidelines and best practices into systematic action for clinical care and research purposes across subspecialties.
This quality improvement initiative was conducted at the Johns Hopkins Wilmer Eye Institute across 6 clinic locations between November 6, 2017, and April 5, 2019. A single EHR system (EpicCare Ambulatory; Epic Systems, Verona, WI) was in place during the study period. The Johns Hopkins University School of Medicine Institutional Review Board acknowledged this project.
Alert firing quality was assessed weekly during phase I and monthly during phases II and III. Eligible encounters were defined as those without suppression reasons. Alert firing accuracy was assessed in two categories: false positive rates (firing for encounters with suppression reasons or firing without criteria met) and false negative rates (not firing when VA or diagnosis criteria were met). Data on suppression reasons, VA, ICD-10 codes, and firing status for each encounter were extracted and analyzed. False positive and false negative rates were assessed overall and by project phase. Descriptive statistics were used to report physician survey findings. All analyses were performed using Stata 15 (StataCorp, College Station, TX).
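As an illustrative sketch only (not the study's analysis code), the firing-accuracy audit can be expressed as follows; the encounter fields, function name, and rate denominators are assumptions for illustration, and the study's exact definitions may differ.

```python
from dataclasses import dataclass

@dataclass
class Encounter:
    """Hypothetical structured-EHR encounter record used for the audit."""
    alert_fired: bool    # whether the LVR alert fired for this encounter
    criteria_met: bool   # VA or ICD-10 firing criteria satisfied
    suppressed: bool     # any suppression reason present (age < 5 y, recent surgery, prior LVR visit, etc.)

def firing_accuracy(encounters: list[Encounter]) -> dict:
    """Compute false positive and false negative rates for the alert audit.

    False positive: the alert fired for an encounter with a suppression
    reason, or fired when the firing criteria were not met.
    False negative: the alert did not fire for an eligible encounter
    (no suppression reason) that met the VA or diagnosis criteria.
    Denominators below are assumptions chosen for illustration.
    """
    eligible = [e for e in encounters if not e.suppressed]
    false_pos = sum(e.alert_fired and (e.suppressed or not e.criteria_met)
                    for e in encounters)
    false_neg = sum((not e.alert_fired) and e.criteria_met for e in eligible)
    should_fire = sum(e.criteria_met for e in eligible)
    return {
        "false_positive_rate": false_pos / len(encounters) if encounters else 0.0,
        "false_negative_rate": false_neg / should_fire if should_fire else 0.0,
    }
```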
Of the 15 physician-users, 14 completed the study and 1 left the institution before project completion. Thirteen physician-users completed the exit survey, including 5 female and 8 male physicians; mean time since fellowship completion was 16 years (range = 3–37 years). For the item on ease of alert use, 8 of 13 (62%) physicians agreed or strongly agreed, and 1 of 13 (7%) disagreed. For the item on the beneficial value of adding the dismissible alert, 10 of 13 (77%) agreed or strongly agreed, whereas 3 of 13 (23%) strongly disagreed. Twelve (92%) physicians responded that they would consider ongoing use of the alert, and 1 (7%) responded no. The one user who did not agree to ongoing use was a physician who had been routinely referring patients to LVR services before the alert was implemented.
Precise programming and accurate data processing are the cornerstones of reliable alert functionality. Although studies have evaluated diagnostic alert firing accuracy by comparison to well-defined clinical standards,33,34 they have not assessed whether an alert fired accurately according to the prescribed logic. During the phase I assessment, we identified two reasons the alert fired inconsistently with its intended design, both of which were subsequently addressed.
First, when an addendum to a signed clinical encounter changed a VA measurement that had been recorded in error, the alert evaluation had already occurred before the addendum was signed. For example, during quality assessment we observed that the alert did not fire at the time of a patient encounter in which a VA of 20/20 was recorded but later corrected to 20/200 in an addendum. This resulted in a missed alert firing even though the VA firing criteria appeared satisfied in the extracted data, and the encounter was categorized as a false negative. We subsequently modified the suppression criteria to include addended encounters.
A second observation was that the alert failed to activate for one user. The error was due to an idiosyncratic workflow by that physician: a concurrent ophthalmology referral order in the EHR encounter suppressed the alert. To address this situation, we overrode the suppression logic related to the concurrent ophthalmology referral. With these refinements, false negative rates declined from phase I through phase III, demonstrating favorable performance.
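A minimal sketch of the revised suppression check, reflecting both phase I refinements, is shown below; the field names and structure are hypothetical and do not represent the production Epic rule.

```python
def should_suppress(encounter: dict) -> bool:
    """Return True when the LVR alert should not fire for this encounter.

    Hypothetical field names; illustrates the two phase I refinements:
    addended encounters are suppressed, and a concurrent ophthalmology
    referral order no longer suppresses firing.
    """
    if encounter.get("patient_age_years", 99) < 5:
        return True          # age < 5 years
    if encounter.get("recent_surgery"):
        return True          # recent surgery
    if encounter.get("prior_lvr_visit"):
        return True          # patient already seen for LVR
    if encounter.get("is_addendum"):
        return True          # phase I fix: exclude addended encounters
    # Phase I fix: a concurrent ophthalmology referral order is intentionally
    # NOT treated as a suppression reason.
    return False
```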
This project has implications for real-world eye care delivery in identifying patients potentially in need of LVR care and exemplifies how a CDS system can assist in a variety of ophthalmic clinical and research settings. The 15 physician-users represented 8 ophthalmology subspecialties and provided ongoing feedback on the CDS system. Consultation with physician-users, grounded in the American Academy of Ophthalmology Preferred Practice Pattern guidelines, framed the VA and diagnosis considerations for the firing criteria. However, the current Academy guidelines are based on best-corrected VA, which is often unavailable during certain subspecialty encounters because refraction is rarely performed. We instead used the best-documented VA (BDVA) to define firing criteria, taking advantage of real-world data practices. A similar user-centered design approach of putting practice guidelines into action could be replicated in other CDS efforts and on other EHR platforms.
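As a rough sketch of how a documented Snellen VA could be checked against a guideline-derived cutoff, the following uses simplified parsing and a hypothetical 20/70 threshold; the study's actual VA firing threshold and implementation are not specified here.

```python
import math

def snellen_to_logmar(snellen: str) -> float:
    """Convert a Snellen fraction string such as '20/70' to logMAR."""
    numerator, denominator = (float(x) for x in snellen.split("/"))
    return math.log10(denominator / numerator)

def meets_va_criterion(bdva: str, threshold_snellen: str = "20/70") -> bool:
    """Return True when the best-documented VA is worse than the threshold.

    The 20/70 default is a hypothetical cutoff for illustration only.
    """
    return snellen_to_logmar(bdva) > snellen_to_logmar(threshold_snellen)

# Example: a BDVA of 20/200 satisfies the hypothetical criterion; 20/20 does not.
assert meets_va_criterion("20/200")
assert not meets_va_criterion("20/20")
```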
This study was limited, first, by the fact that the CDS system did not include all indications for identifying patients potentially in need of LVR. For example, one physician commented on the firing criteria: “If we could integrate visual field data that would be great.” This may require an algorithm that uses visual field metrics (e.g., mean deviation) to drive the alert, or a behavior change among users to document ICD-10 visual field-related diagnoses. Second, individual user needs might not have been fully accommodated despite surveying users and incorporating their requirements where feasible, as exemplified by differing preferences toward the dismissible alert at chart opening. Third, only data in the EHR's structured fields were examined for alert firing; VA or diagnoses documented in unstructured fields could not be included.
We plan to report, in a separate analysis, the primary findings on physician documentation behaviors, monthly reporting to users, and referral practices associated with this CDS system. Further studies are needed to assess the effectiveness of the alert in identifying appropriate LVR candidates, to refine and customize the firing criteria for individual users, and ultimately to determine the effectiveness of the CDS system in connecting patients to LVR services.
Supported by the Reader's Digest Partners for Sight Foundation. The funder played no role in the design and conduct of the study; the collection, management, analysis, and interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.
Disclosure: X. Guo, None; B.K. Swenor, None; K. Smith, None; M.V. Boland, Carl Zeiss Meditec (C); J.E. Goldstein, None