
How Augmented Reality Helps People With a Visual Impairment

3 Definitive Approaches For Each Level Of Vision

Research, User Experience

Many people think of blindness as a total lack of vision, a state of being without any perception of light whatsoever. But the majority of people who are classified as “legally blind” have some sight, and even those who are viewed as totally blind may have some light perception.

Visual impairment needs to be seen as a continuum, ranging from mild vision loss to total lack of vision. Since 1972, the World Health Organization (WHO) has recognized six levels of visual impairment.

  1. 20/30 to 20/60 – near-normal vision or mild vision loss;
  2. 20/70 to 20/160 – moderate visual impairment, or moderate low vision;
  3. 20/200 to 20/400 – severe visual impairment, or severe low vision;
  4. 20/500 to 20/1000 – profound visual impairment, or profound low vision;
  5. Below 20/1000 – near-total visual impairment, or near total blindness; and
  6. No light perception – total visual impairment, or total blindness.

(READ MORE: Look at this white paper on digital assistive wayfinding and navigation (DAWN) technologies.)

In some countries, different levels of visual impairment are based on loss of the visual field. In the United States, any person with best-corrected visual acuity below 20/200 or a visual field smaller than 20° in their better-seeing eye is considered “legally blind.” All of these definitions are under discussion within the WHO and may be revised shortly.
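To make these thresholds concrete, here is a minimal Python sketch of the classification just described. The cut-offs come from the WHO list and the U.S. definition above; the function names are our own, and total blindness (which is defined by light perception rather than acuity) is left out.

```python
def acuity_ratio(acuity: str) -> float:
    """Convert a Snellen acuity string such as '20/70' into a decimal ratio."""
    numerator, denominator = acuity.split("/")
    return float(numerator) / float(denominator)


def who_level(acuity: str) -> str:
    """Map best-corrected visual acuity to the WHO categories listed above."""
    ratio = acuity_ratio(acuity)
    if ratio >= 20 / 60:
        return "near-normal vision or mild vision loss"
    if ratio >= 20 / 160:
        return "moderate low vision"
    if ratio >= 20 / 400:
        return "severe low vision"
    if ratio >= 20 / 1000:
        return "profound low vision"
    return "near-total blindness"


def is_legally_blind_us(acuity: str, visual_field_degrees: float) -> bool:
    """U.S. definition: acuity below 20/200 or a visual field under 20 degrees."""
    return acuity_ratio(acuity) < 20 / 200 or visual_field_degrees < 20


print(who_level("20/250"))                # "severe low vision"
print(is_legally_blind_us("20/250", 45))  # True
```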

Given that the vast majority of people with a visual impairment have some degree of usable sight, efforts are underway in many locations worldwide to use various technologies to help people who are visually impaired make better use of the vision they already have, or to enrich their experience of the world by adding new information to their perceptual mix through an emerging technology known as “augmented reality” (AR).

What is Augmented Reality?

Because the term “augmented reality” was first coined by Boeing researcher Thomas Caudell in 1990 in the context of how head-mounted displays worked, early definitions of AR tended to emphasize the visual aspects of reality. For example, the Oxford Dictionary defines AR as “a technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view.”

But I prefer a broader view of AR, such as “…a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.” This definition goes beyond superimposing computer-generated images on a person’s view of the immediate environment: whatever useful visual perception a person might have can be supplemented with computer-generated information in the form of additional visuals, sound, vibration, touch, smell, or even taste. As computers develop their own senses and enhance ours, we can create new perceptual sensations that help us better understand the world around us. Many of these new digital technologies can be applied to build better assistive devices for people with visual impairments.


AR and Visual Impairment

Digital technologies for improving devices to help people with visual impairments learn about, and navigate through, the world can be classified into three groups:

Enhancement Of A Person’s Existing Vision

Handheld digital magnifiers such as the new Pebble HD or the SmartLux Digital can be tailored for specific low vision needs. Users can try different color combinations, change the magnification level, or modify the contrast or brightness of an image at will.  
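Under the hood, these adjustments are straightforward image processing. The sketch below illustrates the general idea in Python using the Pillow imaging library; it is not the actual software of the Pebble HD or SmartLux Digital, and the zoom, contrast, and brightness values are arbitrary examples.

```python
from PIL import Image, ImageEnhance, ImageOps


def magnify_for_low_vision(path: str, zoom: float = 3.0, contrast: float = 1.8,
                           brightness: float = 1.2, invert: bool = True) -> Image.Image:
    """Apply the kinds of adjustments a handheld digital magnifier offers."""
    img = Image.open(path).convert("RGB")

    # Magnification: scale the image up by the chosen zoom factor.
    width, height = img.size
    img = img.resize((int(width * zoom), int(height * zoom)), Image.LANCZOS)

    # Contrast and brightness boosts, similar to the device's on-screen controls.
    img = ImageEnhance.Contrast(img).enhance(contrast)
    img = ImageEnhance.Brightness(img).enhance(brightness)

    # A high-contrast inverted color scheme, a common low-vision option.
    if invert:
        img = ImageOps.invert(img)

    return img


# Example: enlarge a page of text threefold with boosted contrast.
# magnify_for_low_vision("page.jpg", zoom=3.0).show()
```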

Information Augmentation

Limited access to visual information leaves room for processing additional information that may be useful to a person at any given time. Increasing the information density of everyday life by augmenting the information available about the environment can improve interactions with the people and things around us. AR technologies for feeding additional information to a person while they navigate the world include augmented reality glasses, object recognition technologies, and automatic mapping engines. These can be combined to produce new apps that help people with a visual impairment navigate better in their environment, both indoors and outdoors.
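As a purely illustrative sketch of how such pieces might fit together, the outline below pairs a hypothetical object detector (detect_objects is a stand-in, not a real library call) with the pyttsx3 text-to-speech package to speak what the camera sees.

```python
import pyttsx3  # offline text-to-speech engine (pip install pyttsx3)


def detect_objects(frame):
    """Hypothetical stand-in for an object-recognition model.
    Expected to return a list of (label, direction) pairs, e.g. ("door", "ahead")."""
    raise NotImplementedError("plug a real detector in here")


def announce_surroundings(frame) -> None:
    """Turn recognized objects into short spoken prompts for the user."""
    engine = pyttsx3.init()
    for label, direction in detect_objects(frame):
        engine.say(f"{label} {direction}")  # e.g. "door ahead", "stairs to the left"
    engine.runAndWait()


# In a real app, frames would come from a phone or headset camera:
# announce_surroundings(camera_frame)
```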

Sensory Substitution Or Addition

A third approach to augmenting information for people with a visual impairment is “sensory substitution” or “sensory addition.” Here researchers use alternative sensory channels, such as sound, vibration, or touch, to feed information to a person with a disability as a way of augmenting their normal everyday experiences.
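One well-known pattern in this area is converting images into sound, with a pixel’s vertical position mapped to pitch and its brightness to loudness. The NumPy sketch below is our own simplified illustration of that idea rather than code from any particular project; the resulting samples could be written to a WAV file or played through any audio library.

```python
import numpy as np

SAMPLE_RATE = 22050      # audio samples per second
COLUMN_DURATION = 0.05   # seconds of sound per image column


def sonify_column(column: np.ndarray) -> np.ndarray:
    """Turn one column of a grayscale image (0-255, top row first) into sound:
    higher pixels produce higher pitch, brighter pixels produce louder tones."""
    t = np.linspace(0, COLUMN_DURATION, int(SAMPLE_RATE * COLUMN_DURATION), endpoint=False)
    sound = np.zeros_like(t)
    n = len(column)
    for row, brightness in enumerate(column):
        # Map the bottom of the image to 200 Hz and the top to 2,000 Hz.
        freq = 200 + (n - 1 - row) * (2000 - 200) / max(n - 1, 1)
        sound += (brightness / 255.0) * np.sin(2 * np.pi * freq * t)
    return sound / max(n, 1)  # keep the combined amplitude in a sensible range


def sonify_image(gray_image: np.ndarray) -> np.ndarray:
    """Scan a grayscale image left to right, one column at a time."""
    return np.concatenate([sonify_column(gray_image[:, c]) for c in range(gray_image.shape[1])])
```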

A critical issue is designing augmentation devices so that they do not draw unwanted attention to the user. This is why virtual reality goggles, for example, might be the wrong approach to providing new sensory experiences for people with disabilities. A panel at the National Center for Biotechnology Information issued this warning about new technologies for people with disabilities:

The social awkwardness of the bulky headwear devices is another barrier to acceptance on the market. Like hearing aids, patients do not want to advertise their disability by wearing a signpost on their heads. The goggles also cut off socially essential eye contact. Even the low-vision patient would like to be able to look other people “in the eyes.”

We are working on many of these concepts at Float. We use a human-centered design approach in all our development work, and believe that our success in producing new assistive technologies has general applicability for new enterprise mobility apps.

Gary Woodill is a senior analyst with Float Mobile Learning, as well as CEO of i5 Research. Gary conducts research and market analyses, as well as assessments and forecasting for emerging technologies. He is the co-editor of “Mastering Mobile Learning,” author of “The Mobile Learning Edge,” and co-author of “Training and Collaboration with Virtual Worlds.” He also presents at conferences and is the author of numerous articles and research reports on emerging learning technologies. Gary holds a doctor of education degree from the University of Toronto.

