Open Access
Articles | March 2019
Validation of StrabisPIX, a Mobile Application for Home Measurement of Ocular Alignment
Author Affiliations & Notes
  • Warachaya Phanphruk
    Department of Ophthalmology, Boston Children's Hospital, Boston, MA, USA
    Department of Ophthalmology, Faculty of Medicine, Khon Kaen University, Khon Kaen, Thailand
  • Yingna Liu
    Department of Ophthalmology, Boston Children's Hospital, Boston, MA, USA
    Department of Ophthalmology, Harvard Medical School, Boston, MA, USA
  • Katharine Morley
    Department of Medicine, Massachusetts General Hospital, Boston, MA, USA
  • Jacqueline Gavin
    Department of Ophthalmology, Boston Children's Hospital, Boston, MA, USA
  • Ankoor S. Shah
    Department of Ophthalmology, Boston Children's Hospital, Boston, MA, USA
    Department of Ophthalmology, Harvard Medical School, Boston, MA, USA
  • David G. Hunter
    Department of Ophthalmology, Boston Children's Hospital, Boston, MA, USA
    Department of Ophthalmology, Harvard Medical School, Boston, MA, USA
  • Correspondence: David G. Hunter, Boston Children's Hospital, Department of Ophthalmology, 300 Longwood Ave, Fegan 4, Boston, MA 02115, USA. e-mail: david.hunter@childrens.harvard.edu 
Translational Vision Science & Technology March 2019, Vol.8, 9. doi:https://doi.org/10.1167/tvst.8.2.9

      Warachaya Phanphruk, Yingna Liu, Katharine Morley, Jacqueline Gavin, Ankoor S. Shah, David G. Hunter; Validation of StrabisPIX, a Mobile Application for Home Measurement of Ocular Alignment. Trans. Vis. Sci. Tech. 2019;8(2):9. https://doi.org/10.1167/tvst.8.2.9.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Purpose: StrabisPIX is a smartphone application that allows clinicians to prescribe a series of self-obtained images of head position and eye alignment in nine positions of gaze that are uploaded onto a secure platform for clinician review. This study evaluates the clinical utility of this application.

Methods: In this prospective, nonmasked, cross-sectional study, 30 strabismus patients aged ≥2 years were evaluated. Participants received standardized instructions, used StrabisPIX to obtain images as prompted, and completed a satisfaction survey. During the same visit, an orthoptist obtained standard clinical images with a professional camera. All 60 image sets were evaluated by three observers.

Results: StrabisPIX image quality was similar to that of clinic photographs. Clinic photographs had significantly higher acceptability for horizontal versions (81% vs. 67%), vertical versions (76% vs. 60%), and head posture (93% vs. 81%). Abnormalities were detected at a similar rate for versions, head posture, eyelid position, and pupil size. StrabisPIX had significantly higher detection of alignment abnormalities (89% vs. 77% for clinical photos). Interrater/intrarater agreements were moderate to high (κ = 0.44–1.00) for all parameters except pupil abnormality, which had poor to fair agreement. Most patients reported that StrabisPIX was easy to learn and use.

Conclusions: Overall, StrabisPIX images had similar quality and were as useful as images obtained in the clinic in detecting abnormalities.

Translational Relevance: The StrabisPIX application will enhance the use of telemedicine by allowing physicians to prescribe self-obtained images documenting strabismus.

Introduction
Strabismus affects up to 3.6% of the population,1 and early recognition and intervention can optimize outcomes. The gold-standard diagnosis of strabismus includes assessment of ocular motility combined with prism-and-cover measurement of deviations in various positions of gaze. Unfortunately, not all patients have access to expert strabismus care, and ongoing management may pose logistical difficulties with travel, scheduling, and time out of work. With the increasing availability of smartphones, some clinicians and families try to expedite care by using handheld devices to generate self-obtained images and forward them to clinicians via text message or email for review; however, these images are often incomplete and out of focus, and they may not be compliant with privacy regulations. Patients rarely understand the concept of obtaining images in nine gaze positions, and when they do, the received images must be manually formatted into a 3 × 3 grid prior to review. 
To address these concerns, we collaborated with the Boston Children's Hospital (BCH) Innovation and Digital Health Accelerator (IDHA) to develop StrabisPIX, a smartphone application that guides patients through the process of obtaining images of their ocular alignment in nine positions of gaze as well as any preferred head position. The patient (or a family member or friend) follows a smartphone-guided, step-by-step process to take photographs of the eyes in a structured manner (these steps are outlined in Supplementary Figs. S1A, S1B). These photographs are then formatted into a panel of nine images and transmitted to their ophthalmologist using a HIPAA-compliant mechanism that is integral to the application. (A sample of the clinician dashboard is provided in Fig. 1A.) The clinical concept underlying the application was to allow for home-based, qualitative assessment of ocular motility to be reviewed by a clinician to make decisions about triage related to current or prospective patients with known or suspected strabismus. We therefore wished to determine how effectively this or similar technology can be applied once it is in the hands of patients. 
Figure 1. Clinician dashboard of StrabisPIX application.
The purpose of this study is to assess objectively the quality and interpretability of StrabisPIX-generated images and to characterize the patient and clinician assessment of the usability of the technology. [The application may be previewed by downloading from http://bit.ly/StrabisPIXi (Apple) or http://bit.ly/StrabisPIXa (Android) and logging in using email strabispixtest@childrens.harvard.edu and access code 038514. (Note that images obtained using this account will, if submitted for review, be forwarded to the senior author.)] 
Patients and Methods
Study Design
This is a prospective clinical study comparing the quality of ocular alignment images taken by patients using the StrabisPIX application with professional strabismus photographs obtained using a digital SLR camera in the ophthalmology clinic at BCH to characterize the potential clinical utility of this new application. The primary outcome of this study is the clinical diagnostic quality of the images obtained by StrabisPIX; the secondary outcome is the acceptability of clinical assessment to detect suspected specific ocular abnormalities when present. 
Selection and Recruitment of Patients
Patients were prospectively recruited during routine ophthalmology clinic visits at BCH. Patients aged 2 years and above who had a diagnosis of strabismus qualified for inclusion in the study. The family must have had access to a smartphone. Patients who were unable or unwilling to follow instructions to fixate on targets in different directions were excluded. A sample size of 30 was selected. 
This research adhered to the tenets of the Declaration of Helsinki. Informed consent was obtained from the caregiver or patient, as appropriate. The study was approved by the Institutional Review Board of BCH and was compliant with the United States Health Insurance Portability and Accountability Act. A StrabisPIX study brochure was provided to subjects at the beginning of the clinic visit and was used to help them learn about the study and decide whether they wished to participate. 
Baseline data collected from patient's medical record during enrollment included age, sex, diagnosis of strabismus, angle of deviation, history and type of strabismus surgery, and presence of any co-existing ocular diagnoses. 
Enrollment and Training of Patients
Enrolled patients underwent a standard eye examination as part of the scheduled visit, including visual acuity, ocular fixation preference, and angle of deviation. Strabismus measurements were obtained in the eye clinic for all patients using the alternate prism-and-cover test at 1/3 m without correction. Ductions and versions were assessed per standard clinical practice. Professional eye alignment photographs were obtained using the BCH standard protocol for obtaining clinical images for strabismus, which includes an initial “head-and-shoulders” photograph to evaluate head position followed by close-up images of the eyes in the nine diagnostic positions of gaze. The latter nine images were manually formatted using Q-Image software (http://www.ddisoftware.com/qimage-u/) and converted to a PDF document. A representative set of clinical images is provided in Figure 1B. 
For the StrabisPIX images, subjects were provided with a link to download the application along with a unique access code. Family members were asked to obtain eye-alignment images in a private examination room during the office visit prior to pupil dilation. If this was not possible, images were obtained from home and submitted to the research team members using the application. Once the images were submitted, the application formatted and uploaded the images to the StrabisPIX dashboard, deleted all images from the smartphone, and notified the research coordinator that the dashboard images were available for review. A representative set of StrabisPIX images from the same patient shown in Figure 1B is provided in Figure 1C. 
After using the application, subjects completed a StrabisPIX Patient Satisfaction Survey (Fig. 2). The patient experience was self-evaluated in eight categories on a scale from 1 to 10. 
Figure 2. Patient satisfaction survey. Users were asked, “Please rate your satisfaction using the StrabisPIX application with regard to the following statements” as “Strongly agree,” “Agree,” “Neutral,” “Disagree,” “Strongly disagree.”
Assessment of Quality of the Images
Once the determined number of images had been collected, the research coordinator printed each image set onto a single sheet of paper using a color laser printer. Anonymized image sets were then shuffled and placed into a binder for assessment. Three professional readers (two ophthalmologists and one orthoptist) were provided with the image binder for scoring. A StrabisPIX Clinician Assessment form (Fig. 3) was used to ensure all readers followed the same criteria for evaluating the images. Images were assessed for head posture, binocular alignment, horizontal and vertical versions, eyelid position, and pupil size. The grading scale of clinical diagnostic quality of the images (grading score 1–5) was defined as “acceptable” (excellent [5] or good [4]) and “not acceptable” (fair [3]/poor [2]/unacceptable [1]). 
Figure 3. StrabisPIX Clinician Assessment form.
Statistical Analysis
Statistical analyses were performed using SPSS software to assess each aspect of image quality (as determined by questionnaires), interrater agreement (calculated among three raters), and patient satisfaction (as determined by survey responses). A generalized estimating equations population-averaged model was used to compare the professional grading of clinical diagnostic quality between StrabisPIX images and clinic images, using the acceptability categories defined above. Statistical significance was defined as P < 0.05. Cohen's κ was used to test interrater and intrarater reliability, determining the consistency among the three readers. The strength of agreement (κ) was interpreted as follows2: poor agreement (κ < 0), slight agreement (κ 0.00–0.20), fair agreement (κ 0.21–0.40), moderate agreement (κ 0.41–0.60), substantial agreement (κ 0.61–0.80), and almost perfect agreement (κ 0.81–1.00). 
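As a minimal illustration of the agreement statistic (the study's analyses were performed in SPSS; the reader data below are hypothetical), Cohen's κ for two raters can be computed directly from its definition, κ = (pₒ − pₑ)/(1 − pₑ), where pₒ is the observed agreement and pₑ is the agreement expected by chance:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items.

    p_o: fraction of items on which the raters agree.
    p_e: chance agreement, from each rater's marginal label frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected (chance) agreement from marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / n**2
    if p_e == 1.0:
        return 1.0  # both raters used a single identical label throughout
    return (p_o - p_e) / (1 - p_e)

# Hypothetical "abnormality present?" calls from two readers on 10 image sets
reader_1 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
reader_2 = [1, 1, 0, 1, 1, 1, 0, 0, 0, 1]
print(round(cohens_kappa(reader_1, reader_2), 2))  # → 0.58
```

By the Landis and Koch scale quoted above, κ ≈ 0.58 for these hypothetical readers would fall in the "moderate agreement" band (0.41–0.60).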
Results
The study population (Supplementary Table S1) included 30 strabismus patients (18 males, 12 females; age range, 3 to 81 years [mean, 31.3 years]). Of these, 12 (40%) had an intermittent strabismus (six with intermittent exotropia, five with intermittent esotropia, and one with intermittent hypertropia). Complex strabismus was present in two patients (6.7%), including one with third nerve palsy and one with Duane retraction syndrome (Figs. 1B, 1C). The angle of deviation was ≤10 PD in six patients (20%). A total of 60 ocular alignment image sets (30 StrabisPIX sets and 30 formatted clinical photo sets) were provided to reviewers. 
Image quality was similar between StrabisPIX images and clinic images in terms of focus, field of view, ocular alignment, eyelid position, and pupil size abnormality. However, clinic images had a significantly (P < 0.05) higher rate of acceptability for horizontal versions (67% StrabisPIX vs. 81% clinical images), vertical versions (60% vs. 76%), and head posture (81% vs. 93%) (Fig. 4A). 
Figure 4. Comparison of StrabisPIX Image and Clinic Image. (A) Average percent acceptability (as defined in Fig. 3) of image quality between StrabisPIX application image and professional camera-based clinic image for all three readers across different evaluation categories. (B) Intra-rater agreement of clinical assessment to detect ocular abnormalities with StrabisPIX images (κ, three readers). (C) Inter-rater agreement of clinical assessment to detect specific ocular abnormalities (κ, three graders; two sets of images [StrabisPIX image versus Clinic image]). (D) Clinical assessment for presence of suspected specific ocular abnormalities (N = 90; all 3 readers). FOV, field of view; Oc Align, ocular alignment; Horiz Ver, horizontal versions; Vert Ver, vertical versions; Head Pos, head posture; Lid Pos, eyelid position; Pup Sz, pupil size; Align Abn, alignment abnormality; Horiz Ver Abn, horizontal versions abnormality; Vert Ver Abn, vertical versions abnormality; AHP, anomalous head posture; Lid Pos Abn, eyelid position abnormality; Pup Sz Abn, pupil size abnormality; OIA, overall image acceptability; Retake Pics, retake pictures.
Intrarater agreements ranged from moderate to almost perfect (κ 0.49–1.00) for all parameters except pupil size abnormality, which had fair agreement (κ 0.23) (Fig. 4B). Interrater agreements were moderate to almost perfect (κ 0.44–1.00) for all parameters except pupil size abnormality, which had poor to fair agreement (κ (−0.01)–0.31) (Fig. 4C). Abnormalities were detected at a similar rate for horizontal and vertical versions, head posture, eyelid position, and pupil size; however, alignment abnormalities were detected at a significantly higher rate in the StrabisPIX images (89% vs. 77%; P < 0.05). All other assessments were similar in the two groups (Fig. 4D). 
Overall, patients were satisfied with the experience of using StrabisPIX (Fig. 5). The parents of one 3-year-old subject stated that although their child seemed too anxious to fully cooperate for imaging (or strabismus measurements) in the clinic, once the child returned home they were able to obtain images with the StrabisPIX application. In addition, one 12-year-old child with intermittent exotropia happened to have good control during the office visit, with no strabismus manifest in clinical images, but the family repeated the StrabisPIX images at home when the deviation was manifest. 
Figure 5. StrabisPIX patient satisfaction survey.
Discussion
The StrabisPIX application was designed in an effort to reduce barriers to quality healthcare by allowing clinicians to “prescribe” an imaging protocol that could then be carried out by the patient with the assistance of a smartphone. It allows patients or parents to obtain images of head position and eye alignment in nine positions of gaze and upload these onto a secure platform for clinician review. We are not aware of any similar patient-facing applications, though at least one clinician tool facilitating image formatting in nine positions is available (http://bit.ly/9Gaze). 
Other studies have evaluated the quality and accuracy of photographs in an effort to document and quantify strabismus. In the office, the clinician views the eyes with a coaxial light source to qualitatively assess strabismus by evaluating the relative centration of the corneal light reflexes. Ideally, photographs that document strabismus in primary position should exactly reproduce this familiar image.3 Hunter et al.3 evaluated the centration of the corneal light reflex of the flash relative to the camera lens and proposed that the flash unit be held adjacent to, and directly below, the camera lens, with the subject fixating on a target placed one fifth of the distance from the center of the flash to the center of the camera lens.3 This positioning is not required with a smartphone, where the flash is generally in close proximity to the lens. Others have evaluated the sensitivity and specificity of corneal light reflex centration for determining the presence of strabismus in the context of vision screening,4–6 including the use of deep neural networks to achieve automated strabismus detection.7 Our goal, however, was not to automate strabismus detection in populations but to allow for qualitative assessment of motility and alignment in self-obtained images to facilitate triage or ongoing evaluation of patients with suspected or known strabismus. 
In this study, we demonstrate that images obtained by patients were similar in quality and clinical utility to those obtained using a 35-mm SLR digital camera by a professional. Patients and family members generally found the application easy to use, and in two cases, the ability to obtain images at home may have allowed for superior imaging. Interrater/intrarater agreements were moderate to high (κ = 0.44–1.00) for all parameters except pupil abnormality, which had poor to fair agreement. The variability in pupil assessment could be a result of variability in illumination provided by the camera; while patients were encouraged to keep the flash on for all images, some were bothered by the flash and turned it off to facilitate photography. Pupil size anomalies are generally easier to detect with lighter iris color, especially when lighting is suboptimal. 
Our study does have some important limitations. Because of the distinctive nature of the output of the StrabisPIX application and the clinical photographs, it was not possible to mask clinicians/readers as to the source of the images. This may have allowed for bias in scoring, though we tried to mitigate this by using multiple readers. The study is also limited in its generalizability; specifically, we enrolled only cooperative patients to determine feasibility of this concept. Hence, the elimination of uncooperative patients and the use of this device in an office environment may not necessarily apply to how it would be used in general practice. In general, we have found that young children, especially those under the age of 3, are unable to cooperate sufficiently to allow successful image acquisition in all gaze directions. The application is not intended to be used as a vision screening tool; therefore, we did not attempt to assess its accuracy for detecting intermittent versus constant deviations or microstrabismus. The intent is to allow a clinician to understand within a particular clinical context how to determine the timing of follow-up and which resources to devote to future in-person evaluation; the focus of the study was on usability and a qualitative approach to whether the application would be useful for triage (e.g., “When should this patient come into the clinic, and what resources will be required for the evaluation?”). We did not attempt to correlate the size or nature of the abnormality with detection, nor to assess the sensitivity or specificity of detecting particular abnormalities. While the StrabisPIX application is available to all patients, the dashboard providing access to images is currently available only to clinicians at BCH. 
Overall, StrabisPIX allowed clinicians to prescribe a sequence of images that were obtained by patients but had image quality comparable to those obtained in the office. Patients reported good satisfaction with the experience, and clinicians in the study found the images useful for detecting abnormalities in ocular motility and alignment. The concept of self-obtained, clinician-prescribed imaging could be extended to other specialties such as craniofacial surgery, where sequences of staged images at a single point in time are of diagnostic value. The concept of HIPAA-compliant images provided and managed via a dashboard that interacts with the electronic health record to allow appropriate documentation and follow-up could be extended even to the acquisition of single images. While photographs are no substitute for an in-person clinical examination, applications such as StrabisPIX will provide clinicians with images that may supplement other forms of telemedicine. Such innovative mobile software can improve access to care in remote settings, enhance the patient care experience, and provide clinicians additional clinical data when an in-person examination is not feasible. 
Acknowledgments
This study was entirely conducted at Boston Children's Hospital, Boston, Massachusetts. 
Supported by Children's Hospital Ophthalmology Foundation, Inc., Boston, MA (DGH, AS); Department of Ophthalmology, Faculty of Medicine, Khon Kaen University, Khon Kaen, Thailand (WP). 
Funding and software development: Innovation and Digital Health Accelerator at BCH, with particular thanks to Nitin Gujral, Matt Murphy, Devin Nadar, and Lakshmi Yajurvedi. 
The study and its findings were presented at ARVO 2017 Annual Meeting (Baltimore, MD). 
No relevant conflicting relationship exists for the authors. The StrabisPIX application is free to use and the authors have no intellectual property or financial interest in the application. 
Disclosure: W. Phanphruk, None; Y. Liu, None; K. Morley, None; J. Gavin, None; A.S. Shah, None; D.G. Hunter, None 
References
McKean-Cowdin R, Cotter SA, Tarczy-Hornoch K, et al. Prevalence of amblyopia or strabismus in Asian and non-Hispanic white preschool children: multi-ethnic pediatric eye disease study. Ophthalmology. 2013; 120: 2117–2124.
Landis JR, Koch GG. An application of hierarchical kappa-type statistics in the assessment of majority agreement among multiple observers. Biometrics. 1977; 33: 363.
Hunter DG, Guyton DL. Vertical location of the corneal light reflex in strabismus photography. Arch Ophthalmol. 1998; 116: 767–771.
Barry JC, Effert R, Kaupp A, Burhoff A. Measurement of ocular alignment with photographic Purkinje I and IV reflection pattern evaluation. Invest Ophthalmol Vis Sci. 1994; 35: 4219–4235.
Duangsang S, Tengtrisorn S. The central corneal light reflex ratio from photographs derived from a digital camera in young adults. J Med Assoc Thail. 2012; 95: 699–703.
Maor R, Holland J, Tailor V, et al. Rate of strabismus detection on digital photographs increases by using off-center near target. J Pediatr Ophthalmol Strabismus. 2017; 54: 90–96.
Lu J, Feng J, Fan Z, Huang L, Zheng C, Li W. Automated strabismus detection based on deep neural networks for telemedicine application. Knowledge-Based Systems. https://www.groundai.com/project/automated-strabismus-detection-based-on-deep-neural-networks-for-telemedicine-applications/. Accessed March 13, 2019.
Supplement 1
Supplement 2
Supplement 3