January 2024
Volume 13, Issue 1
Open Access
Low Vision Rehabilitation  |   January 2024
Field Evaluation of a Mobile App for Assisting Blind and Visually Impaired Travelers to Find Bus Stops
Author Affiliations & Notes
  • Shrinivas Pundlik
    Schepens Eye Research Institute of Mass Eye & Ear, Harvard Medical School, Boston, MA, USA
  • Prerana Shivshanker
    Schepens Eye Research Institute of Mass Eye & Ear, Harvard Medical School, Boston, MA, USA
  • Tim Traut-Savino
    The Carroll Center for the Blind, Newton, MA, USA
  • Gang Luo
    Schepens Eye Research Institute of Mass Eye & Ear, Harvard Medical School, Boston, MA, USA
  • Correspondence: Shrinivas Pundlik, Schepens Eye Research Institute of Mass Eye & Ear, 20 Staniford St., Boston, MA 02114, USA. e-mail: shrinivas_pundlik@meei.harvard.edu 
Translational Vision Science & Technology January 2024, Vol.13, 11. doi:https://doi.org/10.1167/tvst.13.1.11
Shrinivas Pundlik, Prerana Shivshanker, Tim Traut-Savino, Gang Luo; Field Evaluation of a Mobile App for Assisting Blind and Visually Impaired Travelers to Find Bus Stops. Trans. Vis. Sci. Tech. 2024;13(1):11. https://doi.org/10.1167/tvst.13.1.11.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Purpose: GPS location-based navigation apps are insufficient to aid blind and visually impaired (BVI) travelers in micro-navigation tasks, such as finding the exact location of bus stops. The resulting large gaps could lead to BVI travelers missing their bus. We evaluated the ability of a signage detection mobile app, All_Aboard, to guide BVI travelers precisely to bus stops, compared with Google Maps alone.

Methods: The All_Aboard app detected bus stop signs in real time via the smartphone camera using a deep neural network model and provided distance-coded audio feedback to help localize the detected sign. BVI individuals used the All_Aboard and Google Maps apps to localize 10 bus stops each in downtown and suburban Boston, Massachusetts. For each bus stop, the subjects used both apps to navigate as close as possible to the physical bus stop sign, starting from 30 to 50 meters away. The outcome measures were success rate and gap distance between the app-indicated location and the actual physical location of the bus stop.

Results: The study involved 24 legally blind participants (mean age [SD] = 51 [14] years; 11 [46%] women). The success rate of the All_Aboard app (91%) was significantly higher than that of Google Maps (52%; P < 0.001). The gap distance with the All_Aboard app was significantly lower (mean = 1.8 meters, 95% confidence interval [CI] = 1.2–2.3 meters) than with Google Maps alone (mean = 7.0 meters, 95% CI = 6.5–7.5 meters; P < 0.001).

Conclusions: The All_Aboard micro-navigation app guided BVI travelers to bus stops more accurately and reliably than a location-based macro-navigation app alone.

Translational Relevance: The All_Aboard app together with a macro-navigation app can potentially help BVI individuals independently access public transportation.

Introduction
Blind and visually impaired (BVI) people often rely on public transportation, such as buses and subways, to travel for employment, leisure, and other needs.1,2 Geolocation and transportation information accessed through smartphones has greatly facilitated macro-navigation. Navigation apps make up one of the major groups of vision assistance mobile apps available in the App Store and Play Store.3 Using smartphone apps, one can plan a route and get detailed instructions on mobile devices for point-to-point navigation via public transit. On the other hand, micro-navigation, navigating precisely to the desired destination at any stage of the journey, remains a largely unsolved issue for BVI individuals.4,5 Regional transit agencies are required to comply with the Americans with Disabilities Act (1990) regarding the accessibility of transit infrastructure.6 In the context of vision disabilities, the requirements include placing large-print signage at bus stops, providing braille and tactile information within transit stations, and making stop announcements inside transit vehicles at main points, among others. However, the lack of cues accessible from a distance is one of the main barriers to equal access to public transportation (for example, tactile bus stop signs are accessible only when one touches them).7–12
Systematic inaccuracy in GPS-based location services is one of the underlying problems that lead to the micro-navigation challenges faced by BVI people. This is also referred to as the last-30-feet or last-10-meter problem in wayfinding. For example, when navigating to a bus stop, a blind person following a GPS-based navigation app may arrive at the app-indicated location with a considerable gap (typically a few to 10 meters) from the actual bus stop due to localization error in the GPS service. For perspective, a 10-meter gap is almost the length of an entire standard bus. According to feedback from blind travelers, sometimes even a small gap can be large enough for them to miss the bus, because the bus drivers misunderstand their intention and do not stop for them.13–16 Weather and environment (for example, the density of tall buildings in downtown areas) can further degrade GPS-based localization. In the worst-case scenario, especially in crowded cities, the GPS localization may be off by more than a block, making macro-navigation apps essentially useless in pedestrian mode.17 In addition to localization error, there is the possibility of mapping errors (sometimes very large) in the stop locations made publicly available by transit agencies. In our survey of 174 bus stop locations in the Boston metro area, about 23% were mapped more than 2 bus lengths away.18 Some large mapping errors are typically due to outdated mapping data for relocated bus stops.
Mapping and localization errors together make purely location-based services unreliable for micro-navigation tasks, such as finding bus stops. Making matters worse, a bus stop sign can be one of many signs on a typical urban street (among traffic, parking, and street signs), so finding it becomes a visual search task in addition to a plain geolocation task. Because visual search performance is known to be significantly degraded in people with low vision,19,20 a micro-navigation aid with visual search capabilities is needed that can work together with macro-navigation apps, especially in the last 10 to 15 meters to the destination.
One of the conventional wayfinding solutions is the use of Bluetooth beacons or WiFi access points to provide high-accuracy micro-location information about nearby landmarks.4,21–25 The scalability and applicability of this approach in outdoor environments are restricted by the high cost of infrastructure modification and maintenance. On the other hand, smartphones could allow rapid scaling of accessibility. A few smartphone apps have been developed, tested, or released to help BVI people access public transportation specifically, or to navigate to destinations in general.26–28 Apps such as BlindSquare and Lazarillo use GPS and location-based data to help users navigate, providing them with information about nearby points of interest, public transportation stops, and bus and train schedules. Because these apps are primarily GPS-based, they are still subject to the limitations of GPS-based navigation systems detailed above. To achieve more accurate localization, some apps combine location information with landmark recognition.15,16,29,30 For example, the BlindWays app guides blind bus riders with crowdsourced clues that describe recognizable and permanent landmarks near the bus stop, such as a tree, a fire hydrant, or a mailbox. However, landmark maps around the various locations have to be built, maintained, and made widely available prior to use. Combining signage information in the General Transit Feed Specification (GTFS) with optical character recognition could provide a viable micro-navigation solution.31 For the purpose of bus stop navigation, a purely visual approach can work well if combined with a typical macro-navigation app.
We have developed a micro-navigation mobile app, All_Aboard, which recognizes bus stop signs to help users navigate to within a short range of the physical location of the sign.32 In real-world usage, the All_Aboard app is intended to be used together with a macro-navigation app such as Google Maps. When users arrive in the vicinity of a bus stop, they can scan the surroundings with the All_Aboard app to find the bus stop sign. Our preliminary testing of the All_Aboard app indicated its superior localization performance compared with the Google Maps app alone.17 The goal of this study was to evaluate whether All_Aboard could help BVI transit users detect bus stop signs in real-world conditions and close the gap left by macro-navigation aids. Our primary hypothesis was that localization with the All_Aboard app would be significantly better than with a conventional navigation app (Google Maps) alone, in terms of both the distance to the desired bus stop location and the rate of successful localizations. Given that GPS-based localization typically suffers in densely built downtown areas, we further hypothesized that All_Aboard might be more beneficial in these locations than in more sparsely populated suburban areas.
Methods
Participants
The inclusion criteria were: vision status of legal blindness, independent mobility without assistance from a sighted guide, physical ability to walk over a distance of about 1 mile at a given time, and familiarity with smartphone/mobile devices. Participants for this study were recruited via referrals from the Carroll Center for the Blind (Newton, Massachusetts), practitioners at vision rehabilitation clinics, and via a pool of volunteers who had participated in prior studies. The study protocol was approved by the Institutional Review Board at Mass Eye & Ear. The study followed the tenets of the Declaration of Helsinki and written informed consent was obtained from all the participants. The participants were reimbursed for travel to the study sites and for their time. 
All_Aboard App
One of the underlying ideas behind the All_Aboard app is that bus stop signage is unique (different from other road signs), uniform in appearance (typically, but not always), and standardized across the entire area of a transit agency (Fig. 1A). Moreover, because the bus stop signs of a given transit agency have the same known physical size, the approximate distance can be estimated from the image size of the detected sign (the farther the distance, the smaller the image, and vice versa). Therefore, it is feasible for computer vision algorithms to learn the appearance of the bus stop signs, detect them in the images captured by smartphone cameras, and estimate the distance to the actual sign. Bus stop detection in All_Aboard is performed in real time using the MobileNetV2 deep neural network,33 trained on about 10,000 images of bus stops collected for a given city/region. Images of bus stop signs were collected from Google Street View imagery based on the publicly available bus stop coordinates in the GTFS standard. The stop signs were manually labeled by placing a bounding box in the collected images. The trained model runs locally on the smartphone (no cloud processing).
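The size-to-distance relationship described above can be sketched with the pinhole camera model. This is a minimal illustration: the sign height and focal length below are assumed values, not the app's actual calibration constants.

```python
# Illustrative sketch of distance estimation from a detected sign's
# bounding box, using the pinhole camera model. The constants below
# are assumptions for illustration only.

SIGN_HEIGHT_M = 0.45        # assumed physical height of the sign plate
FOCAL_LENGTH_PX = 1500.0    # assumed camera focal length in pixels

def estimate_distance(bbox_height_px: float) -> float:
    """Approximate distance to the sign: the farther the sign,
    the smaller its image, and vice versa."""
    if bbox_height_px <= 0:
        raise ValueError("bounding box height must be positive")
    return SIGN_HEIGHT_M * FOCAL_LENGTH_PX / bbox_height_px

# Under these assumptions, a sign imaged 90 px tall is 7.5 m away.
```

Because the bounding box size shrinks in inverse proportion to distance, a single known sign size is enough to recover an approximate range without any depth sensor.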
Figure 1.
 
(A) Typical MBTA bus stop signs among the selected trial locations in downtown Boston. The signage on the left is the most recent version, however, older versions (on the right) with slightly different appearance but similar shapes persist throughout the coverage area. Very few bus stops in the area covered by the transit agency are sheltered, and the distinctive sign is often the only visual identification of the bus stop. (B) Operation of the All_Aboard app in the general vicinity of a bus stop. The lower inset shows the app detecting the bus stop sign (the bounding box is drawn around the detected sign in the camera view displayed on the smartphone screen). The percentage value around the bounding box indicates the confidence of detection. The upper inset shows a successful detection at night in low light conditions. The app works for both the old and new versions of signage for MBTA. Additionally, the app detection is robust to faded, partially damaged, or warped MBTA bus stop signs. A demonstration video of the All_Aboard app in action can be accessed online: https://www.youtube.com/watch?v=VUVpqEw1_2k. Screenshots are available at: https://apps.apple.com/ca/app/all-aboard/id1580638469.
Another key idea behind the operation of All_Aboard is that it supplements a macro-navigation app and thus only needs to be operational in the general vicinity of the bus stop. In a typical usage scenario (Fig. 1B), the user launches All_Aboard when a macro-navigation app (such as Google Maps) indicates that the user is near the desired bus stop location. The All_Aboard app senses phone orientation and searches only when the device is held in an upright position (i.e. the phone camera is facing forward, not too low). The app can detect a stop sign from a distance of 30 to 50 feet (about 10–15 meters). The user can then scan with the phone camera in an arc to first determine the angular direction of the sign. A positive detection triggers a beeping sound, and a true positive detection is typically indicated by a series of continuous auditory tones. Once the sign is in the camera field of view, the auditory tones change in frequency as the user gets nearer the bus stop sign (similar to a homing signal). Thus, the app can help users gauge the relative distance to the sign and adjust their approach. Although the distance to the bus stop is a continuous measure, it was mapped to 4 levels of audio tones in the app to indicate the approximate distance to the bus stop sign, with the highest frequency indicating that the detected sign was within 2 meters (approximately 6 feet). After it is launched, the app works without needing any active intervention from the user, as long as the device is held upright.
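The quantization of the continuous distance estimate into four audio levels might look like the following sketch. Only the 2-meter threshold for the highest-frequency tone comes from the text; the other band boundaries are hypothetical.

```python
# Hedged sketch of mapping a continuous distance estimate to four
# discrete audio feedback levels. Only the 2 m threshold is from the
# paper; the 5 m and 10 m boundaries are hypothetical.

def tone_level(distance_m: float) -> int:
    """Return an audio level from 1 (farthest) to 4 (closest)."""
    if distance_m <= 2.0:      # highest frequency: within ~2 m (6 ft)
        return 4
    elif distance_m <= 5.0:    # hypothetical band boundary
        return 3
    elif distance_m <= 10.0:   # hypothetical band boundary
        return 2
    else:                      # detections out to ~15 m
        return 1
```

Discretizing the feedback into a few levels trades precision for interpretability: users only need to distinguish a handful of tones rather than track a continuously varying pitch.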
The All_Aboard app is available to the public for free for iOS version 10 or later (Apple App Store link: https://apps.apple.com/ca/app/all-aboard/id1580638469), and it is capable of recognizing bus stops in 10 major cities/regions around the world. The app was custom trained on pictures of the bus stop signs of these 10 cities/regions, and new cities/regions can be added by repeating this training procedure in the future. Once the app is installed, users can download any of the available trained neural network models for the corresponding transit agencies (at the time of first launch or thereafter on demand). In this study, the All_Aboard app was loaded with the model trained to detect bus stop signs of the Massachusetts Bay Transportation Authority (MBTA), which operates public transit in the Boston metro area.
Study Design
The field study involved two visits at two separate study sites: downtown Boston (city) and near the campus of the Carroll Center for the Blind in Newton, Massachusetts (suburb). Each study site involved navigating to 10 bus stops along a specific route (Fig. 2). These were all the bus stops along the selected routes; no stop was deliberately selected or removed to favor either app. For each bus stop, performance with All_Aboard and Google Maps was evaluated with both apps running simultaneously. During the study, the participants were accompanied by a certified orientation and mobility specialist (COMS), who provided directions along the route and ensured the safety of the participants. Members of the study team, who administered the study and recorded measurements, also accompanied the participant and the COMS. For all trials, a preconfigured Android smartphone was provided to the participants. Participants were generally unfamiliar with the bus stop locations.
Figure 2.
 
Routes at the two study sites: downtown Boston (left) and in Newton (right). Each site had 10 bus stop locations (indicated by numbers). The route is indicated by the dashed black line. The bus stops in Newton were on the opposite side of the street such that the route was a loop that was traversed in the direction shown by the arrows.
Before starting the study, each participant was given oral instructions and hands-on training in using the All_Aboard app at a practice location. First, they were given a brief oral description of the app, including its purpose, its functionalities, how it is supposed to work, and how to interpret its audio tones. Then, the phone was handed to the participants, and they were asked to walk toward the bus stop using their mobility aid and experience the working of the app. At the same time, the experimenter and the COMS, walking with the participants, provided oral feedback as needed. Participants could repeat this process until they were comfortable using the app.
At the start of the trial at each bus stop location, the participant was guided by the COMS to a spot about 30 to 50 meters (approximately 100 to 150 feet) away from the bus stop sign along the direction of travel, approximately straight ahead of it. The starting distance from the stop sign was varied at each stop location to dissuade participants from guessing the stop location by step counting. The path from the starting location to the bus stop sign did not involve crossing streets, except for one bus stop location in Newton, where the stop sign was affixed very close to an intersection. At the starting point for each trial, the experimenter launched the Google Maps app (operating in pedestrian directions mode), performed Live View calibration (a feature of the Google Maps app that uses live camera imagery to geolocate more accurately), and set the mapped location of the bus stop as the destination. Then, the All_Aboard app was launched so that both apps were running simultaneously (Fig. 3A).
Figure 3.
 
Running the All_Aboard app and the Google Maps app simultaneously to find bus stops. (A) Screenshot of the device at the start point. Both the All_Aboard app and the Google Maps app (inset) were launched and run simultaneously. (B) A user holds the phone upright with the rear camera facing straight ahead. (C) Screenshot of the device when the Google Maps app indicates arrival at the destination. The All_Aboard app indicates that the physical bus stop sign is still some distance ahead.
Then, the smartphone was handed to the participants. From the starting location, the participants were instructed to navigate as close to the bus stop sign as possible. They were also instructed to hold the smartphone upright with its rear camera pointing ahead (Fig. 3B). From this point, the participants walked on their own, relying on their habitual mobility aid (long cane or guide dog) and their residual vision, if present, along with the auditory feedback from the All_Aboard app. Meanwhile, the Google Maps app provided intermittent voice instructions, including the distance to the destination in feet. Participants using guide dogs followed the same steps as the rest of the participants. Guide dogs aided in obstacle avoidance (if needed); the guide dogs involved in this study were not specifically trained to recognize bus stops and could not recognize them during the trial. No optical or other visual aids were used during the trials.
As the participants walked, they scanned the phone camera from side to side to determine the relative direction of the bus stop sign from their walking trajectory. At the end of the trial at a bus stop location, the participants stopped and notified the experimenters when they judged that they were closest to the bus stop sign. This judgment was primarily based on the audio feedback of the All_Aboard app: when the audio tone frequency and pitch were at the highest level, the detected sign was very close. A few participants could use their residual vision from this point onward to get even closer. The distance from where they stopped to the actual stop sign was measured with a tape measure; this was the localization distance for the All_Aboard app. The Google Maps app also indicated via auditory feedback when it determined that the participant had arrived at the destination (Fig. 3C). The distance between the bus stop sign and the arrival point according to Google Maps was also measured with the tape measure.
At the time of their first study visit, the participants answered a brief survey that collected basic demographic information, vision status, use of vision aids, and their preferred transit options (public transit, rideshare, or private vehicle with a family member driving). The level of vision was recorded either as self-reported Snellen visual acuity, or as light perception or no light perception (in the case of completely blind individuals).
Outcome Measures
The two main outcome measures, obtained separately for each app (All_Aboard and Google Maps), were the localization error (gap distance) in meters and the rate of successful localizations (success rate). As mentioned above, the gap distance was obtained by directly measuring the distance between the app-indicated/subject-determined location of the bus stop and the physical location of the bus stop sign. When the All_Aboard app guided participants close enough for them to identify the bus stop sign (via their residual vision) and touch the sign post with their hand or long cane, the gap distance was marked as 0. If the app-indicated location was beyond the physical bus stop sign with respect to the direction of travel, the measured gap distance was recorded as negative. A trial instance was deemed a failure if the gap distance was >25 meters. The success rate for each app was defined as the percentage of locations for which a valid measurable distance along the travel path was available.
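The outcome-measure bookkeeping above can be illustrated with a minimal sketch, under the assumption that trials with no valid measurable distance are recorded as missing (`None`):

```python
# Minimal sketch of the outcome-measure definitions: gaps past the
# sign are negative, a gap > 25 m is a failure, and the success rate
# is the fraction of trials with a valid measurable gap. The use of
# None for unmeasurable trials is an assumption for illustration.

FAILURE_THRESHOLD_M = 25.0

def success_rate(gap_distances_m):
    """gap_distances_m: measured gaps in meters (negative = past the
    sign), with None where no valid measurement was possible."""
    trials = list(gap_distances_m)
    successes = [g for g in trials
                 if g is not None and abs(g) <= FAILURE_THRESHOLD_M]
    return len(successes) / len(trials)

# e.g. success_rate([0.0, -1.2, 3.5, None, 30.0]) -> 0.6
```

Note that a negative gap (stopping past the sign) still counts as a success as long as its magnitude stays within the 25-meter threshold.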
At any given bus stop location, trial failures could occur for various reasons. In the case of the Google Maps app, incorrect mapping of the bus stops was one reason. Such failures were predictable and repeated across all subjects because the location of the bus stop in the map was fundamentally incorrect; for instance, the mapped location could be more than 30 meters away from the physical bus stop. Another cause of trial failure with the Google Maps app was large inaccuracies in geolocation leading to incorrect navigation directions, for example, when the app directed the users to cross streets, double back, or go around a corner, which would lead them to completely miss the bus stop. These failures were more prevalent in the downtown Boston area with tall buildings and/or on overcast/rainy days.
In the case of the All_Aboard app, trial failure could occur because of missed detection by the app or deficient technique by the subjects while using the app. Shadows and occlusions could cause the app to fail to recognize a bus stop sign, as could signage slanting substantially away from the sidewalk direction. On other occasions, the subjects did not scan sufficiently while walking toward the bus stop sign and lost the audio signal. If the bus stop sign was initially detected but then left the camera field of view as the subject approached, the continuous audio cue suddenly stopped. This indicated to the subjects that they had either passed the sign or were too close to it. They were allowed to retrace their steps and try once more to zero in on or confirm the presence of the stop sign in the near vicinity. If major intervention by the COMS or the experimenter was needed to reorient the subject after an initial failure to detect, the trial was considered a failure for the All_Aboard app at the given location, even if the sign was successfully detected in subsequent tries.
We did not consider time to complete a trial as an outcome measure because the primary goal was to reduce the gap distance. In this study, walking with the All_Aboard app meant that the subjects would take longer, because they kept walking closer to the bus stop after the Google Maps app announced arrival. Trial time would be a valid outcome measure when comparing two micro-navigation solutions, which was not in the scope of this study.
Statistical Analysis
Potential factors of interest affecting the outcome measures were the app used (All_Aboard or Google Maps), the environment (city vs. suburb), and the vision status (with or without residual vision). Completely blind subjects without light perception were categorized as without residual vision; the rest were categorized as with residual vision, based on vision in the better eye. The association of absolute gap distance with these potential factors was analyzed within subject via a linear model in a repeated-measures framework. The success rate was analyzed using binary logistic regression. In addition to the main effects, the interactions among the three factors were also examined. Estimated marginal means with their 95% confidence intervals (CIs) and contrasts are reported for gap distance. The estimated marginal mean probability of success and its 95% CI are reported from the logistic regression model for success rate. P values <0.05 were considered statistically significant. Statistical analysis was performed using statistical packages in R (version 4.0.4).34–39
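As a rough illustration of the kind of summary reported (a mean with a 95% CI), the following sketch computes a normal-approximation interval. This is not the paper's actual analysis, which used estimated marginal means from repeated-measures models in R.

```python
# Illustrative sketch only: a sample mean with a normal-approximation
# 95% confidence interval. The study's reported values came from
# estimated marginal means of repeated-measures models, not this
# simple formula.
import math
import statistics

def mean_ci95(values):
    """Return (mean, (lower, upper)) using mean +/- 1.96 * SE."""
    m = statistics.mean(values)
    se = statistics.stdev(values) / math.sqrt(len(values))
    return m, (m - 1.96 * se, m + 1.96 * se)
```

The 1.96 multiplier assumes a large-sample normal approximation; a repeated-measures model additionally accounts for the correlation of trials within the same subject, which this sketch ignores.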
Results
A total of 25 subjects were recruited, of whom 24 had data available for both study sites (see Table 1 for a summary of subject characteristics). One subject was dropped from the trial after the first visit due to concerns about the overall physical fitness required to complete the study. Eleven participants (46%) were women. A variety of conditions affected the vision of the participants. Although all were legally blind in the United States, their vision ranged from complete blindness to 20/200 visual acuity. The majority of the subjects walked with a long cane. Public transit was the most preferred transit option, followed by rideshare and private vehicle.
Table 1.
 
Study Participant Characteristics
Table 2 shows the trial instances and other data for both apps at each study site. Across 24 subjects, there were 480 planned trials. However, over the course of the study, some bus stops were skipped due to construction or missing bus stop signs, resulting in 48 instances with missing data. Therefore, each app was evaluated in a total of 432 instances. The overall success rate and gap distance measures were substantially better with the All_Aboard app than with the Google Maps app.
Table 2.
 
Cumulative Statistics for Trial Data
In Table 2, successful instances with each app are listed independently of each other. When compared pairwise at each bus stop instance (Table 3), there were only a handful of instances where both apps failed (18 out of 432, or about 4%). There were 13 (3%) instances where the All_Aboard app failed but the Google Maps app succeeded, and 189 (44%) instances where the Google Maps app failed but the All_Aboard app succeeded. For the former cases, the average gap distance with Google Maps was 9.3 (SD = 5) meters; for the latter, the average gap distance with the All_Aboard app was 1.6 (SD = 1.4) meters. Of the 225 successful instances with the Google Maps app, the arrival location was mapped past the bus stop sign along the direction of travel in 60 instances (about 27%), with an average gap distance of 7.2 (SD = 2.9) meters.
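A pairwise tally of this kind can be sketched as a 2 × 2 count over per-instance success flags; the example flags below are illustrative, not the study's data.

```python
# Sketch of tallying paired success/failure counts per bus stop
# instance, as summarized in a 2 x 2 table. Example inputs are
# illustrative only.
from collections import Counter

def joint_table(all_aboard_ok, google_ok):
    """Each argument is a sequence of booleans, one per instance.
    Returns counts keyed by (All_Aboard success, Google success)."""
    return Counter(zip(all_aboard_ok, google_ok))

# joint_table([True, True, False], [True, False, False]) counts one
# instance in each of the (True, True), (True, False), and
# (False, False) cells.
```

Pairing the outcomes per instance, rather than comparing overall rates, is what makes the comparison within-trial: each bus stop attempt serves as its own control.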
Table 3.
 
The 2 × 2 Tables Showing Joint Successes or Failures of the All_Aboard App and the Google Maps App Over all 432 Bus Stop Instances
There was no significant effect of subject age, gender, or mobility aid used on the gap distance or on the success rate. The results and discussion are mostly related to three key factors: the app used, study site, and subject group based on their vision status. 
Gap distance (in meters), averaged over vision status and study sites, was significantly smaller with the All_Aboard app (mean = 1.8, 95% CI = 1.3–2.3) than with the Google Maps app (mean = 7.0, 95% CI = 6.5–7.5; P < 0.001). Gap distance with the All_Aboard app was significantly smaller than with the Google Maps app in both the city and the suburb, as well as in subjects with or without residual vision (Fig. 4A). With the Google Maps app in the city, the gap distance was significantly larger in the completely blind group (mean = 8.4, 95% CI = 7.3–9.5) than in those with residual vision (mean = 5.4, 95% CI = 4.7–6.1; P < 0.001). No significant effect of vision status on the gap distance with the All_Aboard app was observed. A significant effect of study site was seen only with the Google Maps app in subjects with residual vision, where the gap distance in the suburb (mean = 6.8, 95% CI = 6.1–7.5) was significantly larger than that in the city (mean = 5.4, 95% CI = 4.7–6.1; P = 0.022). Again, no significant effect of study site was observed for gap distance with the All_Aboard app.
Figure 4.
 
Outcome measures by app, location, and vision status. (A) The gap distance with the All_Aboard app was significantly smaller than the Google Maps app in both sites and in both subject groups. Those with residual vision achieved significantly smaller gap distance compared to completely blind with the Google Maps app in the city. Gap distance in the city was significantly lower than the suburbs in the case of subjects with residual vision with the Google Maps app. (B) The success rate with the All_Aboard app was significantly higher in both sites and in subjects with and without residual vision. Completely blind individuals when using the All_Aboard app in the suburbs had a significantly lower success rate compared to those with residual vision. For all panels, the error bars show 95% confidence interval of the mean; significance levels: *** : P < 0.001, ** : P = 0.001–0.01, and * : P = 0.01–0.05; P value adjustment: BH method for 4 tests.
The success rate with the All_Aboard app was significantly higher than with the Google Maps app across both study sites and both subject groups (Fig. 4B). The overall success rate with the All_Aboard app (mean = 0.91, 95% CI = 0.87–0.94), averaged over study sites and subject groups, was much higher than with the Google Maps app (mean = 0.52, 95% CI = 0.47–0.58, P < 0.001). When using the All_Aboard app in the suburban location, the success rate for completely blind subjects (mean = 0.80, 95% CI = 0.69–0.90) was slightly but statistically significantly lower than for those with residual vision (mean = 0.95, 95% CI = 0.91–0.98, P < 0.001). There was no significant difference in success rates for any other condition. 
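The pairwise comparisons in Figure 4 report P values adjusted by the Benjamini-Hochberg (BH) method for four tests. The authors' analysis used R packages (emmeans) for this; purely as an illustrative sketch, the BH step-up adjustment can be written as:

```python
def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values (step-up FDR procedure)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices sorted by p
    adjusted = [0.0] * m
    running_min = 1.0
    # Walk from the largest p-value down, enforcing monotonicity:
    # adjusted p = min over larger ranks of (p * m / rank).
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        adjusted[i] = running_min
    return adjusted

# Four raw p-values from hypothetical tests:
print(bh_adjust([0.005, 0.02, 0.2, 0.5]))
```

For these hypothetical inputs the adjusted values are 0.02, 0.04, 0.2667, and 0.5, matching R's `p.adjust(..., method = "BH")`.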
Discussion
When people (normally sighted or BVI) take buses in areas like the Boston metro region, where most bus stops are indicated only by a sign on a post, standing even a short distance away from the sign may cause buses to not stop for them. This accessibility challenge can diminish independence and discourage adoption of affordable transportation by BVI travelers.1,2,40,41 It is just one of the unmet "last 10 meters" navigation assistance needs of BVI individuals. In this study, we evaluated the ability of the All_Aboard app, relative to the Google Maps navigation app, to guide BVI travelers accurately to bus stop locations in urban and suburban settings. The rate of successful localization of bus stops was substantially higher, and the gap distance much smaller, with the All_Aboard app than with Google Maps navigation. On average, the All_Aboard app guided subjects to within about 2 meters (approximately 6 feet) of the bus stop sign, whereas with the Google Maps app they ended up about 7 meters (approximately 23 feet) away. The large effect of the All_Aboard app on both the success rate of localization and the gap distance was observed irrespective of the testing location, the subjects' vision status, other demographic characteristics, and the kind of mobility aid used. Our findings thus suggest that the All_Aboard app could help BVI travelers by accurately detecting bus stop signs and guiding users close enough to the designated stop, greatly reducing their chance of missing buses because they are standing too far away. Importantly, this study indicates that computer vision-based object recognition can complement, and add real-world benefit to, purely mapping-based macro-navigation services. 
Prior to the study, based on our previous preliminary study,17 we expected the All_Aboard app's advantage to differ between the city and suburban environments: it could largely outperform the Google Maps app in the city, but perhaps not by much in the suburbs, where GPS accuracy is supposed to be better. Instead, the All_Aboard app significantly outperformed the Google Maps app in both the city and the suburb: the environment had no significant effect on the success rate with the Google Maps app, and its effect on gap distance was relatively modest (a slight but statistically significant difference was seen only in subjects with residual vision). The primary reason for the lack of an environment effect with the Google Maps app was that localization error in the city was counteracted by large mapping errors in the suburbs. Despite the better GPS localization accuracy in the suburb, large mapping errors caused a number of unsuccessful trials or large gaps. For instance, some of the bus stop signs at the Newton site had been moved onto nearby electric poles from their original locations, but the new locations were not updated in the Google Maps app. 
The study design, featuring simultaneous use of both apps in the vicinity of bus stops, reflects the intended real-world use of the All_Aboard app – as a complementary micro-navigation aid to be used together with a macro-navigation app. The results should therefore not be interpreted as a head-to-head comparison of the All_Aboard and Google Maps apps. We compared the two apps in the data analyses because our goal was not only to quantify the localization performance of a micro-navigation app, but also to emphasize the need for micro-navigation solutions. When a GPS-based macro-navigation app like Google Maps says "you have arrived at your destination," users still have to search further to reach the exact destination. Our study shows that if they use the All_Aboard app for that final search, they can expect a 1.8-meter gap on average and a 9% failure rate – a substantial improvement. It should be noted that residual vision may have further aided localization performance when using the All_Aboard app. This is a positive finding, as users can be expected to draw on all available modes of information in real-world usage, and they are likely to localize better with the All_Aboard app than without it. 
Although the gap distance and success rate quantify localization performance, this study cannot fully inform about the actual impacts of the All_Aboard app in real life because the needs and abilities of individual users may differ considerably, and a baseline cannot be established in this initial study. Furthermore, this study did not establish an “end point,” as in a preset distance to which one should navigate to ensure that they will be “close enough” to the bus stop. In real-world conditions, a gap distance of 0 versus 1 meter may not matter much in terms of getting on the bus, but at what distance it starts becoming an adverse factor is not clear. Therefore, the benefit of the All_Aboard app needs to be studied in actual bus riding activities, which is planned in future work. 
Because we enrolled participants with a wide range of visual abilities – from completely blind to visual acuity of 20/200 – we analyzed the effect of the presence of residual vision on performance with the navigation apps. If the All_Aboard app could successfully guide low vision travelers close to the bus stop sign, they might then use their residual vision (even if restricted to shape or form perception) to navigate even closer to the sign. We indeed observed this behavior in a few participants. As a whole, however, gap distance was not significantly different between those with and without residual vision – indicating that the app already guided subjects close enough to the bus stop sign that any further shortening of the gap due to residual vision was minor. 
Trial success rate was affected by the presence of residual vision in the suburban location, where completely blind subjects experienced significantly more failures with the All_Aboard app than those with residual vision (success rate 0.80 vs. 0.95). A possible reason for the higher failure rate in the suburb is that some of the bus stop signs at that site were not properly placed: some were occluded by trees, slanted toward the street instead of the sidewalk, or set back from the edge of the street curb. In these situations, scanning with the app across a sufficiently wide arc is crucial. However, completely blind subjects tended to scan across a narrow range, or not scan at all, because the complete loss of visual input gave them no help with orientation. They were therefore more likely to miss signs that were not in the usual direction. More training and practice in scanning skills might improve their success rate in these situations. 
The field of smartphone navigation aids, specifically for transit access for BVI people, has seen tremendous growth over the past few years with the introduction of a large number of apps.35 Whereas location-based navigation has its own advantages and limitations, object detection/recognition technology integrated with location services may be more effective in guiding BVI travelers.31 In this regard, the All_Aboard app, used together with macro-navigation apps, could potentially solve the "last 30 feet" problem frequently faced by smartphone location-based apps for the task of navigating to bus stops. The All_Aboard app could provide a clear and tangible benefit to BVI travelers who rely on public transportation (this was evident in our sample, where the vast majority rated it as their first or second transportation option). 
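For context on the micro-navigation feedback itself: the All_Aboard app conveys proximity to a detected sign through distance-coded audio feedback. The sketch below is not the authors' implementation – the function name, linear mapping, and thresholds are invented for illustration – but it shows the general idea of mapping an estimated sign distance to a beep interval, with faster beeps as the user approaches:

```python
def beep_interval_s(distance_m,
                    min_interval=0.1,   # fastest beeping when at the sign
                    max_interval=1.5,   # slowest beeping at the edge of range
                    max_range_m=30.0):  # assumed detection range
    """Map an estimated sign distance (meters) to a beep interval (seconds).

    Illustrative only: a linear mapping from distance to interval,
    with distance clamped to [0, max_range_m]. Closer sign -> faster beeps.
    """
    d = max(0.0, min(distance_m, max_range_m))
    return min_interval + (max_interval - min_interval) * (d / max_range_m)
```

A guidance loop would then play a tone, wait `beep_interval_s(d)` seconds for the current distance estimate `d`, and repeat, so the beep rate alone tells the user whether they are getting closer.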
Our study has some limitations. First, whereas the All_Aboard app works in multiple regional transit areas, our evaluation was limited to the Boston metro area. However, we believe the accuracy of the app may not differ significantly in other regions, given that the computer vision model training is fundamentally the same for all locations. Second, although all participants were verified as legally blind, we did not measure their vision and relied on self-reported vision levels. We therefore categorized participants based on the presence of residual vision, which was roughly verified by observing their behavior during the course of the study. For a field trial with a relatively small sample this categorization was appropriate, but it could be refined in future studies with larger samples. Finally, this field trial only provides an estimate of the potential benefit of the All_Aboard app; a real understanding of its impact on the travel patterns of BVI individuals will require a long-term study. 
Although our results indicate that the All_Aboard app could guide users close to bus stops with a higher overall success rate than the Google Maps app alone – irrespective of vision status, demographic characteristics, location, and the kind of mobility aid used – the actual benefit for a given user depends on that individual's baseline capability and need for public transportation. The All_Aboard app would be especially useful for those who need to ride buses frequently but have insufficient wayfinding skills. Bus stop signs in the real world come in a wide variety of conditions: facing the street, set back from the curb, partially occluded by tree leaves, far from the mapped location, and so on. Training on how to handle these unexpected situations will help maximize the utility of the app. In addition, an improved user interface that reminds users to scan widely enough, optical character recognition for reading bus route numbers, and an explicit notification when no sign has been detected after a certain amount of time could further improve the user experience. 
Acknowledgments
The All_Aboard app development was funded in part by a Microsoft AI4A award. The authors would like to thank Nick Corbett and Dina Rosenbaum from the Carroll Center for the Blind for their help with participant recruitment and coordination. 
Commercial Relationships: The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article. The All_Aboard app evaluated in this study is released to the public for free. There is no revenue from app sales or in-app advertisements. 
Disclosure: S. Pundlik, None; P. Shivshanker, None; T. Traut-Savino, None; G. Luo, None 
References
Crudden A, McDonnall MC, Hierholzer A. Transportation: an electronic survey of persons who are blind or have low vision. J Visual Impair Blindness. 2015; 109: 445–456. [CrossRef]
Crudden A. Transportation and vision loss: where are we now? J Am Soc Ophthalmic Registered Nurses. 2018; 43: 19–24.
Pundlik S, Shivshanker P, Luo G. Impact of apps as assistive devices for visually impaired persons. Ann Rev Vis Sci. 2023; 9: 12.11–12.20.
Parker AT, Swobodzinski M, Wright JD, Hansen K, Morton B, Schaller E. Wayfinding tools for people with visual impairments in real-world settings: a literature review of recent studies. Front Educ. 2021; 28(6): 1–23.
Swobodzinski M, Parker AT. A comprehensive examination of electronic wayfinding technology for visually impaired travelers in an urban environment: final report. NITC-RR-1177. Portland, OR: Transportation Research and Education Center (TREC); 2019.
US Department of Justice Civil Rights Division. ADA standards for accessible design. Available at: https://www.ada.gov/law-and-regs/design-standards/2010-stds/. 2010.
Marston JR, Golledge RG. The hidden demand for participation in activities and travel by persons who are visually impaired. J Vis Impair Blindness. 2003; 97: 475–488. [CrossRef]
Park J, Chowdhury S. Investigating the barriers in a typical journey by public transport users with disabilities. J Transport Health. 2018; 10: 361–368. [CrossRef]
Visnes Øksenholt K, Aarhaug J. Public transport and people with impairments – exploring non-use of public transport through the case of Oslo, Norway. Disabil Society. 2018; 33: 1280–1302. [CrossRef]
Wong S. Traveling with blindness: a qualitative space-time approach to understanding visual impairment and urban mobility. Health Place. 2018; 49: 85–92. [CrossRef] [PubMed]
Low W-Y, Cao M, De Vos J, Hickman R. The journey experience of visually impaired people on public transport in London. Transp Policy (Oxf). 2020; 97: 137–148. [CrossRef]
Jonnalagedda A, Pei L, Saxena S, et al. Enhancing the Safety of Visually Impaired Travelers in and around Transit Stations. Pittsburgh, PA: The Robotics Institute Carnegie Mellon University; Technical Report CMU-RI-TR-14-28; 2014.
Golledge RG, Marston JR, Costanzo CM. Attitudes of visually impaired persons toward the use of public transportation. J Vis Impair Blindness. 1997; 91: 446–459. [CrossRef]
Azenkot S, Prasain S, Borning A, Fortuna E, Ladner RE, Wobbrock JO. Enhancing independence and safety for blind and deaf-blind public transit riders. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Vancouver, BC, Canada: Association for Computing Machinery; 2011: 3247–3256.
Hara K, Azenkot S, Campbell M, et al. Improving public transit accessibility for blind riders by crowdsourcing bus stop landmark locations with Google Street View: an extended analysis. ACM Trans Access Comput. 2015; 6: Article 5.
Perkins School for the Blind. BlindWays: a crowdsourced bus stop location app. Available at: https://www.perkins.org/resource/blindways-crowdsourced-bus-stop-location-app/. (last accessed August 2023).
Jiang E, Ma Z, Singh A, et al. Field testing of All Aboard, an AI app for helping blind individuals to find bus stops (abstract). Invest Ophthalmol Vis Sci. 2021; 62: 3529.
Luo G, Pundlik S. Widespread errors in bus stop location mapping is an accessibility barrier for passengers who are blind or have low vision. J Vis Impair Blindness. 2023; 117(5): 396–398. [CrossRef]
Kuyk TK, Liu L, Fuhr PS. Feature search in persons with severe visual impairment. Vis Res. 2005; 45: 3224–3234. [CrossRef] [PubMed]
Luo G, Satgunam P, Peli E. Visual search performance of patients with vision impairment: effect of Jpeg image enhancement. Ophthalmic Physiol Opt. 2012; 32: 421–428. [CrossRef] [PubMed]
Sáez Y, Muñoz J, Canto F, García A, Montes H. Assisting visually impaired people in the public transport system through RF-communication and embedded systems. Sensors. 2019; 19: 1282. [CrossRef] [PubMed]
Alvarado A, Chong A, Kojitani Y, et al. RouteMe2: a cloud-based infrastructure for assisted transit. Transportation Research Board 97th Annual Meeting. Washington DC, United States, January 07–11, 2018. Available at: https://escholarship.org/uc/item/8wx760m2.
Chen H-E, Lin Y-Y, Chen C-H, Wang I-F. BlindNavi: a navigation app for the visually impaired smartphone user. Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. Seoul, Republic of Korea: Association for Computing Machinery; 2015: 19–24.
El-taher FE-Z, Taha A, Courtney J, Mckeever S. A systematic review of urban navigation systems for visually impaired people. Sensors. 2021; 21: 3103. [CrossRef] [PubMed]
Flores G, Manduchi R. Experiments with a public transit assistant for blind passengers. In: Miesenberger K, Bühler C, Penaz P (eds), Computers Helping People with Special Needs. Cham, Switzerland: Springer International Publishing; 2016: 43–50.
Campbell M, Bennett C, Bonnar C, Borning A. Where's my bus stop? Supporting independence of blind transit riders with StopInfo. Proceedings of the 16th international ACM SIGACCESS conference on Computers & accessibility. Rochester, New York, USA: Association for Computing Machinery; 2014: 11–18.
BlindSquare. Available at: https://www.blindsquare.com/about/. Accessed August 2023.
Lazarillo: inclusive navigation and digital maps. Accessed August 2023.
Lock JC, Cielniak G, Bellotto N. A Portable Navigation System with an Adaptive Multimodal Interface for the Blind. AAAI Spring Symposia; 2017.
Saha M, Fiannaca AJ, Kneisel M, Cutrell E, Morris MR. Closing the gap: designing for the last-few-meters wayfinding problem for people with visual impairments. Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility. Pittsburgh, PA, USA: Association for Computing Machinery; 2019: 222–235.
Feng J, Beheshti M, Philipson M, Ramsaywack Y, Porfiri M, Rizzo JR. Commute booster: a mobile application for first/last mile and middle mile navigation support for people with blindness and low vision. IEEE J Transl Eng Health Med. 2023; 11: 523–535. [CrossRef] [PubMed]
Massachusetts Eye & Ear Infirmary. All_Aboard. Find bus stops for the blind. Available at: https://apps.apple.com/us/app/all-aboard/id1580638469. Accessed August 2023.
Sandler M, Howard A. MobileNetV2: the next generation of on-device computer vision networks. Available at: https://ai.googleblog.com/2018/04/mobilenetv2-next-generation-of-on.html. 2018.
Bates D, Mächler M, Bolker B, Walker S. Fitting linear mixed-effects models using lme4. J Stat Softw. 2015; 67: 1–48. [CrossRef]
Brooks ME, Kristensen K, van Benthem KJ, et al. glmmTMB balances speed and flexibility among packages for zero-inflated generalized linear mixed modeling. The R Journal. 2017; 9: 378–400. [CrossRef]
Lenth RV. emmeans: Estimated Marginal Means, aka Least-Squares Means. R package version 1.7.2. Available at: https://CRAN.R-project.org/package=emmeans. 2022.
Lüdecke D, Ben-Shachar M, Patil I, Waggoner P, Makowski D. Performance: an R package for assessment, comparison and testing of statistical models. J Open Source Softw. 2021; 6: 3139. [CrossRef]
Hartig F. DHARMa: residual diagnostics for hierarchical (multi-level/mixed) regression models. R package version 0.4.1. Available at: https://CRAN.R-project.org/package=DHARMa. 2021.
Wickham H. ggplot2: Elegant Graphics for Data Analysis New York, NY: Springer-Verlag; 2016.
Bleach K, Fairchild N, Rogers P, Rosenblum LP. Improving transportation systems for people with vision loss. American Foundation for the Blind, Public Policy and Research Institute; March 2020. Available at: https://www.afb.org/sites/default/files/2020-03/Improving-Transportation-Systems-People-Vision-Loss.pdf.
O'Day B, Chanes-Mora P, Roth M, Lauckner M. Project VISITOR. American Foundation for the Blind; March 2020. Available at: https://www.afb.org/sites/default/files/2020-04/Project-VISITOR-Phase-Two-Report-Final.pdf.
Figure 1.
 
(A) Typical MBTA bus stop signs among the selected trial locations in downtown Boston. The signage on the left is the most recent version, however, older versions (on the right) with slightly different appearance but similar shapes persist throughout the coverage area. Very few bus stops in the area covered by the transit agency are sheltered, and the distinctive sign is often the only visual identification of the bus stop. (B) Operation of the All_Aboard app in the general vicinity of a bus stop. The lower inset shows the app detecting the bus stop sign (the bounding box is drawn around the detected sign in the camera view displayed on the smartphone screen). The percentage value around the bounding box indicates the confidence of detection. The upper inset shows a successful detection at night in low light conditions. The app works for both the old and new versions of signage for MBTA. Additionally, the app detection is robust to faded, partially damaged, or warped MBTA bus stop signs. A demonstration video of the All_Aboard app in action can be accessed online: https://www.youtube.com/watch?v=VUVpqEw1_2k. Screenshots are available at: https://apps.apple.com/ca/app/all-aboard/id1580638469.
Figure 2.
 
Routes at the two study sites: downtown Boston (left) and in Newton (right). Each site had 10 bus stop locations (indicated by numbers). The route is indicated by the dashed black line. The bus stops in Newton were on the opposite side of the street such that the route was a loop that was traversed in the direction shown by the arrows.
Figure 3.
 
Running the All_Aboard app and the Google Maps app simultaneously to find bus stops. (A) Screenshot of the device at the start point. Both the All_Aboard app and the Google Maps app (inset) were launched and run simultaneously. (B) A user holds the phone upright with the rear camera facing straight ahead. (C) Screenshot of the device when the Google Maps app indicate arrival at the destination. The All_Aboard app indicates the physical bus stop sign is still some distance ahead.
Table 1.
 
Study Participant Characteristics
Table 2.
 
Cumulative Statistics for Trial Data