For those of us with good vision, it is easy to forget how fortunate we are to live in a world filled with technologies designed with us in mind. For the roughly 12 million Americans with a visual impairment, everyday life offers constant reminders that the designers of electronic gadgets and websites often give little thought to their needs, or poorly understand what those needs actually are. As the global population ages, this problem will only grow in scope, so designers of new technologies will need to be more aware of the access barriers they inadvertently introduce during development.
As the saying goes, the best way to understand another person is to walk a mile in their shoes. In that spirit, several technologies have been developed to help a person with good vision experience the world as someone with a visual impairment does. One approach uses virtual reality, which can render a scene as it would appear with any number of visual acuity problems. However, this does not let users see the actual world, or the actual technology they have developed, but only a digital representation of it. Moreover, this method displays only 2D images, which ignores how the eye changes focus when looking at objects at different distances.
Low-tech options also exist, such as uninstrumented glasses with some sort of filter permanently fixed over their lenses. These solutions do not track eye movements, which limits the types of conditions they can simulate, and they generally provide a poor representation of what someone with a visual impairment actually experiences. An innovative alternative, recently reported by a research team led at Keio University in Japan, uses a pair of eyeglasses with a programmable level and pattern of opacity in the lenses. The system also tracks eye movement, allowing the pattern of lens occlusion to shift in real time to simulate what would be seen with various vision problems.
The glasses use two monochrome 2.9 inch, 128 x 128 pixel LCD displays, one covering each eye. When all of the pixels are turned off, the panels are almost totally transparent, allowing the wearer to see normally through them. Each pixel can be individually controlled, which allows different visual impairments to be simulated, and the contrast level of each pixel can also be set, which allows the severity of a condition to be simulated as well. A SparkFun ESP32 Thing Plus microcontroller development board was chosen to drive the LCD screens. A pair of 200 Hz, 192 x 192 pixel infrared cameras are pointed toward the eyes, and Pupil Core eye-tracking software was used to extract eye movement information from that data. All processing was done on a laptop computer with an Intel i7 CPU and 16 GB of RAM.
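To make the per-pixel control concrete, the logic of gaze-following occlusion can be sketched as below. This is not the team's published code; the scotoma radius, the "central" versus "peripheral" modes, and the 8-bit opacity range are illustrative assumptions layered on the article's 128 x 128 panel resolution.

```python
import numpy as np

W = H = 128  # per-eye LCD resolution, per the article


def occlusion_mask(gaze_x, gaze_y, radius, mode="central", max_level=255):
    """Return an H x W array of per-pixel opacity levels (0 = fully clear).

    mode="central" darkens a disc around the gaze point, roughly mimicking
    a central scotoma; mode="peripheral" darkens everything outside that
    disc, roughly mimicking tunnel vision. The radius and 0-255 contrast
    range are assumptions for illustration, not values from the design.
    """
    ys, xs = np.mgrid[0:H, 0:W]
    dist = np.hypot(xs - gaze_x, ys - gaze_y)
    inside = dist <= radius
    if mode == "central":
        mask = np.where(inside, max_level, 0)
    else:  # peripheral vision loss
        mask = np.where(inside, 0, max_level)
    return mask.astype(np.uint8)


# Simulate central vision loss around a gaze point near the panel's center:
mask = occlusion_mask(64, 64, radius=20, mode="central")
print(mask[64, 64], mask[0, 0])  # gaze pixel occluded, far corner clear
```

Because the mask is recomputed from the latest gaze estimate on every frame, the simulated blind region follows the eye as it moves, which is exactly what fixed-filter glasses cannot do.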
The hardware was mounted on a pair of resin frames to complete the prototype. This implementation lets wearers quickly check their designs for low-vision friendliness, and turn the visual impairment simulation on or off as needed. A study with 14 participants was conducted to validate the glasses and assess user acceptance. Participants were given a pair of the glasses, then asked to search for a hidden personal item while either central or peripheral vision loss was simulated. Questionnaires completed before and after the task showed a statistically significant increase in awareness of, and empathy for, visual acuity problems after using the glasses.
In its current state, the device is not quite fast enough to keep up with rapid eye movements. Between the refresh rate of the LCD and the sampling rate of the pupil-tracking software, a lag of up to 100 milliseconds can occur, which is noticeable to the user. Further work will be needed to speed up this refresh rate to improve the experience and make it as true to life as possible.
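The worst-case lag is roughly the sum of each pipeline stage's worst-case delay, which is why both the LCD and the tracker matter. In the sketch below, only the 5 ms camera interval follows from the article's stated 200 Hz cameras; the other stage figures are illustrative assumptions chosen to show how such delays can plausibly add up to the ~100 ms the article mentions.

```python
# Back-of-the-envelope latency budget for the gaze-to-display pipeline.
# Only the camera interval (1000 ms / 200 Hz = 5 ms) comes from the
# article; every other figure is an assumption for illustration.
stages_ms = {
    "camera sampling (200 Hz)": 1000 / 200,
    "pupil tracking + host processing": 30,  # assumed
    "link from laptop to ESP32": 5,          # assumed
    "LCD refresh / pixel response": 60,      # assumed; passive LCDs switch slowly
}
total = sum(stages_ms.values())
print(f"worst-case lag ~= {total:.0f} ms")
```

A budget like this makes clear that shaving the LCD response time, not just the tracking rate, is where most of the improvement would have to come from.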
Hackster.io, an Avnet Community © 2022
Try to See It From My Perspective – Hackster.io