Open Access
Articles | March 2021
Developing an Ophthalmology Clinical Decision Support System to Identify Patients for Low Vision Rehabilitation
Author Affiliations & Notes
  • Xinxing Guo
    Johns Hopkins Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA
  • Bonnielin K. Swenor
    Johns Hopkins Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA
    Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
  • Kerry Smith
    Johns Hopkins Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA
  • Michael V. Boland
    Johns Hopkins Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA
    Department of Ophthalmology, Massachusetts Eye and Ear Infirmary, Boston, MA, USA
  • Judith E. Goldstein
    Johns Hopkins Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA
  • Correspondence: Judith E. Goldstein, Lions Vision Research and Rehabilitation Center, Wilmer Eye Institute, Johns Hopkins University, 600 N Wolfe Street, Baltimore, MD 21287, USA. e-mail: jgolds28@jhmi.edu 
Translational Vision Science & Technology, March 2021, Vol. 10, 24. https://doi.org/10.1167/tvst.10.3.24
Abstract

Purpose: The purpose of this study was to develop and evaluate an electronic health record (EHR) clinical decision support system to identify patients meeting criteria for low vision rehabilitation (LVR) referral.

Methods: In this quality improvement project, we applied a user-centered design approach to develop an interactive electronic alert for LVR referral within the Johns Hopkins Wilmer Eye Institute. We invited 15 ophthalmology physicians from 8 subspecialties to participate in the design and implementation and to provide user experience feedback. The three project phases incorporated development and evaluation, feedback analysis, and system refinement. We report on the final alert design, firing accuracy, and user experiences.

Results: The alert was designed to be physician-centered and patient-specific. Alert firing relied on visual acuity and International Classification of Diseases (ICD)-10 diagnosis (hemianopia/quadrantanopia) criteria. Alert suppression criteria included age < 5 years, recent surgeries, a prior LVR visit, and related alert actions. The false positive rate (firing when the alert should have been suppressed or when firing criteria were not met) was 0.2%. The overall false negative rate (the alert not firing when visual acuity or encounter diagnosis criteria were met) was 5.6%. Of the 13 physicians who completed the survey, 8 agreed that the alert was easy to use, and 12 would consider ongoing use.

Conclusions: This EHR-based clinical decision support system demonstrated reliable firing in identifying patients with vision impairment and promising acceptance by ophthalmologist users, facilitating care and LVR referral.

Translational Relevance: The use of real-time data offers an opportunity to translate ophthalmic guidelines and best practices into systematic action for clinical care and research purposes across subspecialties.

Introduction
There is growing interest in the development and use of electronic health record (EHR)-based clinical decision support (CDS) tools, such as alerts and order sets, to aid clinical decision making and optimize health care delivery. CDS tools surface important clinical knowledge and patient-specific information to health care providers or patients,1 and are used broadly in screening, disease diagnosis, coordination of care, and medication management, with the intent of aligning clinical practice with standards of care.2–6 A best practice advisory is a specific type of CDS that appears at the point of care and is intended to guide physician action.7–11 
One opportunity to implement a CDS tool and improve the quality of ophthalmology care is the identification of patients with vision impairment to facilitate referral to low vision rehabilitation (LVR) services. LVR services improve functional ability (e.g. reading and mobility) in people with vision impairment.12 Referral, however, is commonly overlooked,13 as patients with chronic vision loss may see multiple subspecialty providers, such as retina and glaucoma specialists, with visits focused on medical or surgical management. To mitigate this problem, the American Academy of Ophthalmology has published vision rehabilitation guidelines as one of its Preferred Practice Patterns.14 However, barriers persist in the inconsistent application of recommendations and utilization of services.15 Referral assessment has historically been conducted by individual medical record review.13,16 There is no Current Procedural Terminology code for LVR services to support standard documentation and referral tracking in EHR notes. Only with an automated CDS system can reliable and sustainable audit trails be implemented. 
Understanding the variability in physician practice patterns and patient needs is essential for designing a successful ophthalmology CDS system. To explore the needs of physician-users and to optimize compliance, a user-centered design approach was used to engage ophthalmology physicians throughout development.17–19 
We aimed to develop and evaluate an EHR-based CDS system, also referred to as “the alert” in this paper, to identify patients with vision impairment potentially in need of LVR service referral within a large ophthalmic institute. The alert was designed to: (1) identify patients with indication(s) for LVR service referral; (2) notify ophthalmology physicians treating the patients; and (3) document physician responses to the alert. We describe the initial design, ongoing monitoring, periodic feedback, and continuing refinement of the alert system.20 We also present findings on alert firing accuracy and user experiences. The potential impact of the alert on LVR referral rate will be discussed in a separate report.21 
Methods
This quality improvement initiative was conducted at the Johns Hopkins Wilmer Eye Institute across 6 clinic locations between November 6, 2017, and April 5, 2019. A single EHR system (EpicCare Ambulatory; Epic Systems, Verona, WI) was in place during the study period. The Johns Hopkins University School of Medicine Institutional Review Board acknowledged this project. 
User-Centered Approach for Alert Development and Evaluation
We invited 15 ophthalmology physicians as users from 8 subspecialties (2 retina, 3 glaucoma, 2 cornea, 2 comprehensive, 2 neuro-ophthalmology, 2 pediatric ophthalmology and adult strabismus, 1 ocular immunology, and 1 oculoplastic). The developers (authors of this report) and users were involved in the following aspects: (1) examining user requirements; (2) determining the firing criteria, designing, and making ongoing modifications; and (3) assessing user experiences and feedback.22 
Alert Design and Development
This project was conducted over 17 months (Table 1). In phase I, the developers reviewed the American Academy of Ophthalmology LVR referral guidelines, met with users, and outlined criteria, such as patient age, that would determine whether the alert should fire or be suppressed. A prototype was then developed and deployed for users. Phase II followed 7 months of usability testing and feedback, resulting in updated alert firing criteria and modified physician response options. Phase III incorporated further refinement of the alert appearance during the patient encounter. Throughout all study phases, alert firing quality assessments and user feedback data were collected. Monthly reports were provided to all participating physicians, with information on alert firing frequency and accuracy and the distribution of physician response options. 
Table 1.
Low Vision Rehabilitation Referral Electronic Alert Design and Updates by Project Phases
Phase I: Alert Prototype Design and Implementation
The prototype alert was developed with the following key features: (1) alert firing criteria; (2) alert suppression criteria; (3) physician response options; and (4) alert design and visual interface. The suppression criteria were set such that the alert would not fire in cases less likely to need LVR referral. For example, the alert was suppressed for patients who had eye surgery planned in the next 3 months or performed in the previous 3 months, as visual acuity (VA) may be restored postoperatively. The alert was also suppressed for patients younger than 5 years of age because of their developing visual system and the belief by users that the 20/40 VA threshold may not apply. Additionally, users suggested that very young children in need of rehabilitation largely obtain services through early intervention programs or other resources. The alert firing criterion was set as best documented VA (BDVA) worse than 20/40 in the better eye. BDVA was determined by examining all EHR structured fields where VA was documented in the encounter, including presenting, pinhole, and refracted VA. In phase I, there were seven user response options (Table 1, phase I, and Fig.). An EHR chart could not be closed or signed without responding to the alert, effectively creating a "hard stop" once fired. This study solely assessed the practical viability of an alert for physician-users; no further interventions were made to facilitate referral to LVR. 
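To make the phase I rule concrete, the following is a minimal Python sketch of the BDVA computation and firing check. The field names (presenting_va, pinhole_va, refracted_va) and the Snellen parsing are illustrative assumptions; the production rule ran inside the Epic CDS engine, not as standalone code.

```python
# Minimal sketch of the phase I firing logic (hypothetical field names;
# the production rule was implemented inside Epic, not in Python).
from typing import Optional

THRESHOLD_DENOMINATOR = 40  # firing threshold: BDVA worse than 20/40


def snellen_denominator(va: Optional[str]) -> Optional[int]:
    """Parse a '20/xx' Snellen string into its denominator, if parseable."""
    if not va:
        return None
    try:
        _, denominator = va.split("/")
        return int(denominator)
    except ValueError:
        return None


def best_documented_va(eye_fields: dict) -> Optional[int]:
    """BDVA for one eye = the smallest (best) denominator across the
    presenting, pinhole, and refracted VA structured fields."""
    denominators = [
        snellen_denominator(eye_fields.get(field))
        for field in ("presenting_va", "pinhole_va", "refracted_va")
    ]
    denominators = [d for d in denominators if d is not None]
    return min(denominators) if denominators else None


def phase1_should_fire(right_eye: dict, left_eye: dict) -> bool:
    """Fire when BDVA in the *better* eye is worse than 20/40."""
    documented = [
        d for d in (best_documented_va(right_eye), best_documented_va(left_eye))
        if d is not None
    ]
    if not documented:
        return False
    better_eye = min(documented)  # smaller denominator = better acuity
    return better_eye > THRESHOLD_DENOMINATOR


# Example: 20/60 pinhole OD and 20/100 presenting OS -> better eye is
# 20/60, which is worse than 20/40, so the alert fires.
print(phase1_should_fire({"pinhole_va": "20/60"}, {"presenting_va": "20/100"}))
```

The design choice mirrored in the sketch is that the best (smallest) documented Snellen denominator per eye defines BDVA, and the better eye drives the firing decision.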
Figure.
Low vision rehabilitation referral clinical decision support alert feature updates. Phase I – Mandatory alert at plan section with seven response options. Phase II – Mandatory alert at plan section with updated firing criteria and three response options. Phase III – Dismissible alert at chart opening in addition to the mandatory alert.
Several issues were identified from user interviews at the end of phase I. (1) The seven alert response options were ambiguous and not interpreted uniformly; for example, a retina physician interpreted prior "VR" as prior "vitreo-retinal" rather than "vision rehabilitation." (2) The firing criteria did not capture patients in need of LVR due to visual field loss in the absence of VA loss. (3) Physicians disagreed on the inclusion of pinhole VA in the firing considerations: some commented that pinhole improvement may overstate VA potential and that LVR should still be considered regardless of refractive findings; others argued that patients with VA potential of 20/40 or better are unlikely to need LVR services. (4) As a "hard stop," the alert could interrupt clinic flow and sometimes interfered with physician workflow or documentation. (5) The timing of the alert appearance during a patient encounter left inadequate face-to-face time to discuss LVR referral considerations with patients. 
Phase II: Evaluation and Optimization
The following changes were made in phase II with consideration of Epic EHR system interface and coding limitations: (1) The alert firing criteria were updated to include visual field loss associated with a diagnosis of hemianopia or quadrantanopia (International Classification of Diseases [ICD]-10 code H53.46X or H53.47). (2) The physician-user response options were consolidated from seven to three to minimize ambiguity (Table 1, phase II, and Fig.); a free-text comment field was attached to the option "Don't refer, other reasons" for specification. (3) A second, dismissible alert was added at chart opening in addition to the mandatory alert to improve the timing of the reminder; this alert shared the same firing criteria and allowed the notification window to be cancelled. The updated version was deployed by June 15, 2018, while the dismissible alert awaited institutional EHR committee approval. 
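A sketch of how the phase II predicate extends the VA rule with the diagnosis criterion, again with hypothetical data shapes; only the 20/40 threshold and the hemianopia/quadrantanopia ICD-10 codes come from the text.

```python
# Sketch of the phase II firing predicate (illustration only).
from typing import Optional

# Hemianopia/quadrantanopia code families named in the text; the "X"
# suffix is read here as any extension digit, an assumption.
FIELD_LOSS_ICD10_PREFIXES = ("H53.46", "H53.47")


def phase2_should_fire(
    better_eye_denominator: Optional[int], encounter_diagnoses: list
) -> bool:
    """Fire when BDVA in the better eye is worse than 20/40 OR the
    encounter carries a hemianopia/quadrantanopia ICD-10 code."""
    va_criterion = (
        better_eye_denominator is not None and better_eye_denominator > 40
    )
    diagnosis_criterion = any(
        code.startswith(FIELD_LOSS_ICD10_PREFIXES)
        for code in encounter_diagnoses
    )
    return va_criterion or diagnosis_criterion


# A patient with 20/20 acuity but a homonymous hemianopia code now fires,
# which the phase I VA-only rule would have missed.
print(phase2_should_fire(20, ["H53.461"]))  # True
```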
Phase III: Streamlining and Refining
Phase III of the project focused on testing the dismissible alert (launched on November 19, 2018, after receiving institutional EHR committee approval) and continued monitoring of alert firing accuracy. No documentation action was required for the dismissible alert. 
Physician-User Exit Survey
An exit survey was administered to physician-users 2 months after project completion. We evaluated ease of use, the value of adding the dismissible alert, and the desire for ongoing use with these questions: (1) The best practice advisory is easy to use. (2) Adding the dismissible alert at the opening of the encounter benefited my workflow by enabling patient discussions and referral for services. Answer options for (1) and (2) were: strongly disagree, disagree, neutral, agree, and strongly agree. (3) Would you consider ongoing use of the low vision best practice advisory? Answer options: yes or no. 
Statistical Analysis
Alert firing quality was assessed weekly for phase I and monthly for phases II and III. Eligible encounters were defined as those without suppression reasons. Alert firing accuracy was assessed in two categories: false positive rates (firing for encounters with suppression reasons, or firing without criteria met) and false negative rates (not firing when VA or diagnosis criteria were met). Data on suppression reasons, VA, ICD-10 codes, and firing status for each encounter were extracted and analyzed. False positive and false negative rates were assessed overall and by project phase. Descriptive statistics were used to report physician survey findings. All analyses were performed using Stata 15 (StataCorp, College Station, TX). 
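A minimal sketch of the accuracy audit as defined above, assuming each encounter record carries three boolean flags (hypothetical names) derived from the extracted suppression, VA, ICD-10, and firing data; the actual audit was performed in Stata.

```python
# Sketch of the periodic firing-accuracy audit (hypothetical fields).
def audit(encounters: list) -> dict:
    """Compute the two false positive rates and the false negative rate
    described in the Statistical Analysis section."""
    suppressed = [e for e in encounters if e["has_suppression_reason"]]
    eligible = [e for e in encounters if not e["has_suppression_reason"]]
    met = [e for e in eligible if e["meets_firing_criteria"]]
    not_met = [e for e in eligible if not e["meets_firing_criteria"]]

    def rate(numerator: list, denominator: list) -> float:
        return len(numerator) / len(denominator) if denominator else 0.0

    return {
        # False positives: fired despite a suppression reason ...
        "fp_rate_suppressed": rate(
            [e for e in suppressed if e["fired"]], suppressed
        ),
        # ... or fired without the VA/diagnosis criteria being met.
        "fp_rate_criteria_not_met": rate(
            [e for e in not_met if e["fired"]], not_met
        ),
        # False negatives: criteria met but the alert did not fire.
        "fn_rate": rate([e for e in met if not e["fired"]], met),
    }


# Toy example: one missed firing among two criteria-met encounters
# yields a false negative rate of 0.5.
print(audit([
    {"has_suppression_reason": False, "meets_firing_criteria": True, "fired": True},
    {"has_suppression_reason": False, "meets_firing_criteria": True, "fired": False},
]))
```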
Results
Final Alert Design Considerations
We developed an EHR-based CDS system using a user-centered design approach. The alert was physician-centered in that it was active only for participating physician-users, regardless of the location where they provided care; a previous alert action by a different physician would not affect firing eligibility for the current physician-user's encounter. The alert was also patient-specific, with the same set of suppression criteria applied to any encounter. For example, an alert would be suppressed for encounters with patients who had eye surgery planned within the next 3 months, regardless of which physician scheduled the surgery. 
Several major modifications were made to simplify alert usage, better facilitate clinic flow, and promote physician compliance. The final production version featured a dismissible alert at chart opening that informed users of the low vision referral advisory and a mandatory alert prior to closing the chart that required a response (see Fig.). 
Several iterations of the firing and suppression criteria contributed to the final alert design (Table 2). For eligible encounters, the alert fired when BDVA was worse than 20/40 or when the encounter diagnoses included hemianopia or quadrantanopia. Suppression criteria included age younger than 5 years, ophthalmic surgery in the previous or next 3 months, an LVR clinic visit in the past 12 months, and related prior alert actions that suppressed firing for the current encounter. Future encounters were suppressed for 365 days following the response "Don't order – under low vision care." No suppression window was applied to the responses "Order" or "Don't order – other reasons," as patients may need more than a one-time referral conversation before utilizing the service (see Table 1, phase III). 
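The final suppression logic can be summarized in a short sketch; the field names and the 90-day reading of "3 months" are assumptions, and the per-physician scope of prior alert actions reflects the physician-centered design described above.

```python
# Sketch of the final suppression check (hypothetical encounter fields;
# the real rules ran inside the Epic CDS engine).
from datetime import date, timedelta


def is_suppressed(enc: dict, today: date) -> bool:
    """Return True if any final suppression criterion applies."""
    # Age younger than 5 years.
    if enc["patient_age_years"] < 5:
        return True
    # Ophthalmic surgery within the previous or next 3 months (~90 days).
    if any(abs((d - today).days) <= 90 for d in enc.get("surgery_dates", [])):
        return True
    # LVR clinic visit within the past 12 months.
    last_lvr = enc.get("last_lvr_visit")
    if last_lvr is not None and (today - last_lvr) <= timedelta(days=365):
        return True
    # A prior "Don't order - under low vision care" response by the *same*
    # physician suppresses firing for 365 days; "Order" and "Don't order -
    # other reasons" carry no suppression window.
    prior = enc.get("prior_response")
    if (
        prior is not None
        and prior["option"] == "dont_order_under_low_vision_care"
        and prior["physician"] == enc["current_physician"]
        and (today - prior["date"]) <= timedelta(days=365)
    ):
        return True
    return False


# Example: a 7-year-old seen 2 months after surgery is suppressed on the
# surgery criterion, not on age.
print(is_suppressed(
    {"patient_age_years": 7, "surgery_dates": [date(2019, 2, 1)],
     "current_physician": "A"},
    today=date(2019, 4, 1),
))
```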
Table 2.
Final Alert Criteria for Consideration of Low Vision Rehabilitation Referral
Alert Firing Accuracy
Alert firing accuracy was assessed by overall false positive and false negative firing rates (Table 3). Among the 60,860 total encounters, 19,634 (32%) met at least one suppression criterion, and false positive alert firing was observed in 43 (0.2%) of them. Of the 41,226 eligible encounters, 37,389 had BDVA of 20/40 or better without a diagnosis of hemianopia or quadrantanopia, and false positive alert firing was found in 82 (0.2%). The overall false negative rate was 5.6% (213/3,837). False negative rates for missed firing in encounters meeting the VA criteria (n = 3,667) or the diagnosis criteria (n = 197) were 5.8% and 1.5%, respectively. False negative rates were 8.5% (147/1,738), 3.3% (41/1,234), and 2.9% (25/865) for phases I, II, and III, respectively. 
Table 3.
Reasons and Rates for Alert Misfiring
Physician-User Feedback
Of the 15 physician-users, 14 completed the study; 1 left the institution before project completion. Thirteen physician-users completed the exit survey, including 5 female and 8 male physicians; mean time since fellowship completion was 16 years (range = 3–37 years). For the item on ease of alert use, 8 of 13 (62%) physicians agreed or strongly agreed, and 1 of 13 (8%) disagreed. For the item on the value of adding the dismissible alert, 10 of 13 (77%) agreed or strongly agreed, whereas 3 of 13 (23%) strongly disagreed. Twelve (92%) physicians responded that they would consider ongoing use of the alert, and 1 (8%) responded no. The one user who did not agree to ongoing use was a physician who had routinely referred patients to LVR services before the alert was implemented. 
Discussion
We developed an EHR-based CDS system to identify patients meeting discrete VA and ICD-10 criteria who are potentially in need of LVR services referral. With a user-centered design approach incorporating ongoing modifications, the alerts were reliable and yielded a good user experience, with all but one physician agreeing to ongoing use. The alert met the standard for an effective CDS system set forth by the Agency for Healthcare Research and Quality23 and showed favorable performance outcomes. 
Alert Standardization and Customization
The current CDS alert attempts to parallel the American Academy of Ophthalmology VA criteria for LVR referral, as lack of standardization has been identified as a main barrier to clinicians acting on electronic alerts.24 Standardization was achieved by applying the same firing and suppression criteria for all participating physicians in the study. Our user-centered alert criteria suppressed nearly one-third of the alerts that would have fired on VA and diagnosis criteria alone, with the intent of minimizing alert fatigue. 
Although standardization is the foundation for promoting physician compliance, streamlining clinic flow, and improving system cost-effectiveness,25–27 customization to user requirements, including firing criteria refined by subspecialty, may benefit user engagement and sustainability.28–30 Lack of such consideration may result in alerts being overlooked or referral being underutilized.31 An optimally sensitive and specific CDS system may need to be customized for each subspecialty, or even at the individual user level.32 For example, both pediatric ophthalmologists commented that BDVA worse than 20/40 may be too good a level of vision to trigger LVR referral for pediatric patients, and thus the alert may benefit from customized firing criteria for pediatric ophthalmology encounters. Similarly, although 77% of physicians agreed that the dismissible alert benefited their workflow, the remaining 23% preferred otherwise; a more customized alert would allow these users individual preference settings. However, such tailored, sophisticated systems may be costlier and more time-consuming to develop and maintain. Given the available resources, we developed a balanced CDS system with most participating physicians agreeing to ongoing use. 
Alert Firing Accuracy
Precise programming and accurate data processing are the cornerstones of reliable alert functionality. Although studies have evaluated diagnostic alert firing accuracy by comparison to well-defined clinical standards,33,34 they have not assessed whether an alert fired accurately according to its prescribed logic. During the phase I assessment, we identified two reasons the alert firing was inconsistent with the designed logic, both of which were subsequently addressed. 
First, when an addendum to a signed clinical encounter changed a VA measure that had been recorded in error, the alert could not have been evaluated against the corrected value. For example, during quality assessment we observed that the alert would not fire at the time of the patient encounter if 20/20 VA was recorded but later corrected to 20/200 in an addendum. The encounter then appeared to satisfy the VA firing criteria despite no alert having fired, and it was categorized as a false negative firing outcome. We subsequently modified the suppression criteria to cover addended encounters. 
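A one-function sketch of the resulting addendum guard (hypothetical field names): addenda to already-signed encounters are treated as suppressed, so a VA corrected after signing no longer registers as a missed firing in the audit.

```python
# Sketch of the addendum guard added after phase I (hypothetical fields).
def suppressed_as_addendum(enc: dict) -> bool:
    """Skip firing evaluation for an addendum to an already-signed
    encounter, so post-signing VA corrections are not counted as
    false negatives."""
    return bool(enc.get("is_addendum")) and bool(enc.get("original_note_signed"))
```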
A second observation was that the alert failed to activate for one user. The error was traced to that physician's idiosyncratic workflow: a concurrent ophthalmology referral order in the EHR encounter suppressed the alert. To address this situation, we overrode the suppression logic related to the concurrent ophthalmology referral. With these efforts, false negative rates declined from phase I through phase III, demonstrating a favorable performance outcome. 
Clinic Workflow Considerations
The two major challenges regarding physician workflow with the initial alert design were interruption of clinic flow and the timing of the alert appearance. The mandatory alert was important for maximizing data collection for research purposes. However, the "hard stop" left some physician-users unable to close the chart window without addressing the alert, even when the intent was only to close the note temporarily. User-centered design necessitates effective integration into the user's workflow,32 so a programming modification that distinguishes between closing and signing a note while a mandatory alert is present is essential. The dismissible alert did offer some relief from the "hard stop" by providing earlier notification. For long-term quality improvement use, the "hard stop" restriction could be eliminated to minimize workflow interruptions.28,35 
The second challenge involved the timing of the alert appearance during the patient encounter. Several studies have pointed to the importance of user satisfaction with workflow and usability as measures of the effectiveness of CDS implementation.25,36,37 Initial physician-user feedback indicated that the alert's placement in the patient-physician encounter workflow left inadequate time to consider referral or consult with the patient. The alert location was chosen because all physician-users must access that part of the encounter chart at some point, and no better workflow alternatives were identified among other location options. To meet requests for an earlier reminder, we integrated a dismissible alert at chart opening (Fig.), and most physicians found this feature helpful. 
Strengths, Limitations, and Future Actions
This project has real-world eye care delivery implications for identifying patients potentially in need of LVR care and is an example of how a CDS system can assist in a variety of ophthalmic clinical and research settings. The 15 physician-users represented 8 ophthalmology subspecialties and provided ongoing feedback on the CDS system. Consultation with physician-users, together with the American Academy of Ophthalmology Preferred Practice Pattern guidelines, framed the VA and diagnosis considerations for the firing criteria. However, the current Academy guidelines are based on best-corrected VA, which is often unavailable during certain subspecialty encounters because refraction is rarely performed. We instead used BDVA to define the firing criteria, reflecting real-world documentation practices. A similar user-centered design approach to putting practice guidelines into action could be replicated in other CDS efforts and on other EHR platforms. 
This study was limited, first, by the fact that the CDS system did not include all indications for identifying patients potentially in need of LVR. For example, one physician commented on the firing criteria: "If we could integrate visual field data that would be great." This may require an algorithm that uses visual field metrics (i.e. mean deviation) to drive the alert, or a behavior change among users to document ICD-10 visual field-related diagnoses. Second, individual user needs might not have been fully accommodated despite surveying and incorporating user requirements where feasible, as exemplified by the differing user preferences toward the dismissible alert at chart opening. Third, only data in the EHR's structured fields were examined for alert firing; VA or diagnoses documented in unstructured fields could not be included. 
We plan to report the primary findings of the alert on physician documentation behaviors, monthly reporting to users, and referral practices using this CDS system in a separate analysis. Further studies are needed to assess the effectiveness of the alert in identifying appropriate LVR candidates, to refine and customize the firing criteria to the individual user, and ultimately to determine the effectiveness of the CDS system in connecting patients to LVR services. 
Conclusions
We have developed a CDS system for the systematic identification of patients meeting VA and diagnosis criteria who are potentially in need of LVR referral. Reliable firing of the alert, coupled with a majority of physician-users favoring ongoing usage, suggests this approach has the potential to improve standardization of care for patients with vision loss. This novel approach to patient identification can be applied in ophthalmology clinical and research settings to improve patient care and outcomes. 
Acknowledgments
Supported by the Reader's Digest Partners for Sight Foundation. The funder played no role in the design and conduct of the study; the collection, management, analysis, and interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication. 
Disclosure: X. Guo, None; B.K. Swenor, None; K. Smith, None; M.V. Boland, Carl Zeiss Meditec (C); J.E. Goldstein, None 
References
Osheroff JA, Pifer EA, Sittig DF, Jenders RA, Teich JM. Clinical Decision Support Implementers' Workbook. Chicago, IL: HIMSS; 2004.
Fleddermann A, Jones S, James S, Kennedy KF, Main ML, Austin BA. Implementation of best practice alert in an electronic medical record to limit lower-value inpatient echocardiograms. Am J Cardiol. 2018; 122: 1574–1577. [CrossRef]
Haase M, Kribben A, Zidek W, et al. Electronic alerts for acute kidney injury. Dtsch Arztebl Int. 2017; 114: 1–8. [PubMed]
Federman AD, Kil N, Kannry J, et al. An electronic health record-based intervention to promote hepatitis C virus testing among adults born between 1945 and 1965: a cluster-randomized trial. Med Care. 2017; 55: 590–597. [CrossRef] [PubMed]
Bejjanki H, Mramba LK, Beal SG, et al. The role of a best practice alert in the electronic medical record in reducing repetitive lab tests. Clinicoecon Outcomes Res. 2018; 10: 611–618. [CrossRef] [PubMed]
Lobach D, Sanders GD, Bright TJ, et al. Enabling health care decision making through clinical decision support and knowledge management. Evid Rep Technol Assess. 2012; 203: 1–784.
Bangash H, Pencille L, Gundelach JH, et al. An implementation science framework to develop a clinical decision support tool for familial hypercholesterolemia. J Pers Med. 2020; 10(3): 67.
Joshi M, Ashrafian H, Arora S, Khan S, Cooke G, Darzi A. Digital alerting and outcomes in patients with sepsis: systematic review and meta-analysis. J Med Internet Res. 2019; 21: e15166. [CrossRef] [PubMed]
Kaako A. Evaluating the efficacy of best practice alerts to improve Clostridium difficile early detection in hospital settings: a 6-month interim analysis of the 2-year prospective study. Avicenna J Med. 2018; 8: 87–91. [CrossRef] [PubMed]
Karlsson LO, Nilsson S, Bang M, Nilsson L, Charitakis E, Janzon M. A clinical decision support tool for improving adherence to guidelines on anticoagulant therapy in patients with atrial fibrillation at risk of stroke: a cluster-randomized trial in a Swedish primary care setting (the CDS-AF study). PLoS Med. 2018; 15: e1002528. [CrossRef] [PubMed]
Kirby AM, Kruger B, Jain R, O'Hair DP, Granger BB. Using clinical decision support to improve referral rates in severe symptomatic aortic stenosis: a quality improvement initiative. Comput Inform Nurs. 2018; 36: 525–529. [CrossRef] [PubMed]
Binns AM, Bunce C, Dickinson C, et al. How effective is low vision service provision? A systematic review. Surv Ophthalmol. 2012; 57: 34–65. [CrossRef] [PubMed]
Coker MA, Huisingh CE, McGwin GJr, et al. Rehabilitation referral for patients with irreversible vision impairment seen in a public Safety-Net Eye Clinic. JAMA Ophthalmol. 2018; 136: 400–408. [CrossRef] [PubMed]
American Academy of Ophthalmology. Vision rehabilitation preferred practice pattern. 2017. Available at: https://www.aao.org/preferred-practice-pattern/vision-rehabilitation-ppp-2017. Accessed February 2, 2021.
Goldstein JE, Guo X, Boland MV, Swenor BK. Low vision care - out of site. Out of mind. Ophthalmic Epidemiol. 2020; 27: 252–258. [CrossRef] [PubMed]
Kumar H, Monira S, Rao A. Causes of missed referrals to low-vision rehabilitation services: causes in a tertiary eye care setting. Semin Ophthalmol. 2016; 31: 452–458. [PubMed]
Brunner J, Chuang E, Goldzweig C, Cain CL, Sugar C, Yano EM. User-centered design to improve clinical decision support in primary care. Int J Med Inform. 2017; 104: 56–64. [CrossRef] [PubMed]
Schumacher RM, Lowry SZ. NIST guide to the processes approach for improving the usability of electronic health records. Gaithersburg, MD: US Department of Commerce, National Institute of Standards and Technology; 2010.
Singh H, Ash JS, Sittig DF. Safety assurance factors for electronic health record resilience (SAFER): study protocol. BMC Med Inform Decis Mak. 2013; 13: 46. [CrossRef] [PubMed]
Kawamoto K, McDonald CJ. Designing, conducting, and reporting clinical decision support studies: recommendations and call to action. Ann Intern Med. 2020; 172: S101–S109. [CrossRef] [PubMed]
Goldstein JE, Guo X, Smith K, Boland MV, Swenor BK. Using an electronic health record advisory to identify patients for referral to vision rehabilitation services. Invest Ophthalmol Vis Sci. 2019; 60: 4045–4045.
International Organization for Standardization. Ergonomics of human-system interaction — Part 210: Human-centred design for interactive systems. 2019. Available at: https://www.iso.org/standard/77520.html. Accessed February 2, 2021.
Agency for Healthcare Research and Quality. Clinical Decision Support (CDS). Available at: https://digital.ahrq.gov/ahrq-funded-projects/current-priorities/clinical-decision-support-cds. Accessed February 2, 2021.
Zazove P, McKee M, Schleicher L, et al. To act or not to act: responses to electronic health record prompts by family medicine clinicians. J Am Med Inform Assoc. 2017; 24: 275–280. [CrossRef] [PubMed]
Gulla J, Neri PM, Bates DW, Samal L. User requirements for a chronic kidney disease clinical decision support tool to promote timely referral. Int J Med Inform. 2017; 101: 50–57. [CrossRef] [PubMed]
Das AV, Rath S, Naik MN, Ali MJ. The incidence of lacrimal drainage disorders across a tertiary eye care network: customization of an indigenously developed electronic medical record system - eyeSmart. Ophthalmic Plast Reconstr Surg. 2019; 35: 354–356. [CrossRef] [PubMed]
Ruppel H, De Vaux L, Cooper D, Kunz S, Duller B, Funk M. Testing physiologic monitor alarm customization software to reduce alarm rates and improve nurses' experience of alarms in a medical intensive care unit. PLoS One. 2018; 13: e0205901. [CrossRef] [PubMed]
Chaparro JD, Hussain C, Lee JA, Hehmeyer J, Nguyen M, Hoffman J. Reducing interruptive alert burden using quality improvement methodology. Appl Clin Inform. 2020; 11: 46–58. [CrossRef] [PubMed]
Shellum JL, Nishimura RA, Milliner DS, Harper CMJr., Noseworthy JH. Knowledge management in the era of digital medicine: a programmatic approach to optimize patient care in an academic medical center. Learn Health Syst. 2017; 1: e10022. [CrossRef] [PubMed]
Castaneda C, Nalley K, Mannion C, et al. Clinical decision support systems for improving diagnostic accuracy and achieving precision medicine. J Clin Bioinforma. 2015; 5: 4. [CrossRef] [PubMed]
Chen JH, Fang DZ, Tim Goodnough L, Evans KH, Lee Porter M, Shieh L. Why providers transfuse blood products outside recommended guidelines in spite of integrated electronic best practice alerts. J Hosp Med. 2015; 10: 1–7. [CrossRef] [PubMed]
Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003; 10: 523–530. [CrossRef] [PubMed]
Makam AN, Nguyen OK, Auerbach AD. Diagnostic accuracy and effectiveness of automated electronic sepsis alert systems: a systematic review. J Hosp Med. 2015; 10: 396–402. [CrossRef] [PubMed]
Thompson M, Van den Bruel A, Verbakel J, et al. Systematic review and validation of prediction rules for identifying children with serious infections in emergency departments and urgent-access primary care. Health Technol Assess. 2012; 16: 1–100. [CrossRef]
Chen H, Butler E, Guo Y, et al. Facilitation or hindrance: physicians' perception on best practice alerts (BPA) usage in an electronic health record system. Health Commun. 2019; 34(9): 942–948. [CrossRef] [PubMed]
Roshanov PS, Misra S, Gerstein HC, et al. Computerized clinical decision support systems for chronic disease management: a decision-maker-researcher partnership systematic review. Implement Sci. 2011; 6: 92. [CrossRef] [PubMed]
Kilsdonk E, Peute LW, Jaspers MWM. Factors influencing implementation success of guideline-based clinical decision support systems: a systematic review and gaps analysis. Int J Med Inform. 2017; 98: 56–64. [CrossRef] [PubMed]