New eyewear could help the visually impaired
Ultrasound could spot obstacles in someone’s path, while other sensors could help identify colors
By Sid Perkins
PITTSBURGH, Pa. — About 253 million people — nearly one in every 30 worldwide — have some type of visual impairment. Roughly 36 million of them are totally blind. All of these people face daily challenges, from navigating safely to telling one color from another (which is important when your currency is color-coded). But two teen inventors have just unveiled new technologies to ease such problems.
Axel Toro Vega, 15, is a 10th grader at Dr. Carlos González High School in Aguada, Puerto Rico. Jane Hantanto, 18, is a senior at Smak Penabur Gading Serpong school. It’s in Kabupaten Tangerang, Indonesia. These teens live half a world apart. But earlier this month, both arrived here to unveil their new adaptive technologies at the Intel International Science and Engineering Fair (ISEF).
They were among nearly 1,800 students from 81 nations, regions and territories who competed for about $5 million in prizes and scholarships.
One of the biggest problems for people with visual impairments is safely getting from one place to another. To avoid bumping into or tripping over objects, many blind people use a cane. And in previous years, ISEF finalists have invented a number of “smart” canes to navigate around obstacles. Axel took a different route. His navigation band instead alerts wearers to objects in their path.
In effect, this device uses sound to help people see. Ultrasound, actually. It’s a range of high frequencies that people typically can’t hear. (Ultra is a Latin prefix that means “beyond” or “above.”) Few people can hear frequencies above 20,000 hertz, or cycles per second. Axel’s new device broadcasts at 40,000 hertz. And there’s a good reason for that, the teen adds. He doesn’t want the device to audibly distract whoever wears it (or anyone nearby).
The teen mounted the device’s two ultrasound emitters and two receivers in small holes on its frame. Future versions, he notes, can be custom-printed to fit the dimensions of a wearer’s head.
Axel modeled a prototype device at Intel ISEF. He wore the narrow bar, which he 3-D printed, atop a regular pair of glasses at eyebrow level. But if he were totally blind, he could wear it in place of glasses at eye level, much like the adaptive eyewear worn by Lt. Cmdr. Geordi La Forge (LeVar Burton’s TV character on Star Trek: The Next Generation).
How it works
Key to the new device is a postage-stamp-sized computer chip. It controls the ultrasonic emitters (one over each eye). It directs these to regularly send out a brief pulse of sound. Each pulse lasts about 10 millionths of a second. The sound waves travel out, then bounce back to the system’s receivers (also positioned with one over each eye). The computer calculates how long it takes for an echo to return. The quicker the echo comes back, the closer the object it bounced off.
The computer beeps a warning when it detects an object within 1.3 meters (4 feet) of the wearer. That warning can be heard through earphones, such as those used to listen to musical devices. If an object is closer than 50 centimeters (20 inches), the glasses actually vibrate.
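For readers who like to see the math, here is a minimal sketch of that time-of-flight calculation, written in Python. It is not Axel’s actual code: the speed-of-sound value and the function names are assumptions, while the two alert distances are the ones described above.

SPEED_OF_SOUND = 343.0  # meters per second in air at room temperature (assumed value)

def distance_from_echo(echo_time_s):
    # The pulse travels out to the object and back, so the one-way distance
    # is half the round-trip time multiplied by the speed of sound.
    return SPEED_OF_SOUND * echo_time_s / 2.0

def warning_for(distance_m):
    # Map a measured distance to the two alert levels described above.
    if distance_m < 0.5:    # closer than 50 centimeters: vibrate
        return "vibrate"
    if distance_m < 1.3:    # within 1.3 meters: beep through the earphones
        return "beep"
    return "none"

# Example: an echo that returns after 4 thousandths of a second
d = distance_from_echo(0.004)   # about 0.69 meters
print(warning_for(d))           # prints "beep"

By the same arithmetic, an echo that takes longer than about 7.6 thousandths of a second means the object is more than 1.3 meters away, so it triggers no warning at all.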
Each emitter and receiver can detect objects located in a cone that is about 80 degrees wide. Together, both emitters and receivers, which are mounted side-by-side, can “see” objects in a swath about 140 degrees wide. That will include almost everything that’s in front of its wearer. But the system isn’t perfect, Axel admits. If a person is looking straight ahead and level, the system can only detect obstacles that are at least knee high. To see something lower than that, a wearer would need to periodically scan downward, too.
Six volunteers helped test Axel’s prototype. Most wore blindfolds during the trials. But one social worker at Axel’s school, who is blind, also took part in the tests, which included 16 obstacles. Axel’s system ended up spotting about 88 percent of them.
Listening to color
Of all visual impairments, colorblindness is relatively benign. Still, it can pose serious challenges. People with the red-green form can’t distinguish red from green (which can make interpreting traffic light signaling difficult). They also have trouble identifying some shades of purple. Red-green colorblindness is the most common type. It afflicts more than one in every 20 males and fewer than one in 200 females. But Jane notes that this isn’t the only form of colorblindness. For instance, some people can’t tell blue from yellow.
In all its forms, colorblindness can interfere with education because many charts and graphs are color-coded. It can keep people from properly matching the colors of their clothes. It even can pose a problem to shoppers, Jane points out. That’s because many products come in different versions with labels that differ largely by color. And in Indonesia, Jane’s homeland, the paper currency also differs by color.
Jane decided to invent a device to help discriminate between colors. It mounts on a pair of glasses. Like Axel’s, her system uses sensors and a computer. (Her computer is slightly larger and heftier, about the size of a cell phone.) It can easily be worn in a small pouch on an armband.
A small gadget that can fit on a fingertip contains three sensors and a small white light-emitting diode (LED). When the gadget is held against an object, the LED shines light on it. That light reflects back to the sensors. One sensor detects red light and grades its intensity on a scale from 1 to 256. The other two sensors use the same scale to measure green and blue. Together, these scores represent the unique color of the object. (Any color can be represented in such a red-green-blue, or RGB, color system, Jane notes.)
Jane’s computer uses those three values to create a musical tone. The RGB scores modify, among other things, the tone’s frequency and volume. In general, lighter colors create a louder tone, says Jane. In theory, Jane’s RGB scores could represent more than 16.7 million colors. In practice, the teen notes, people can’t really tell the subtle differences between that many sounds.
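One possible way to picture that conversion is sketched below in Python. The article does not give Jane’s exact formulas, so this mapping is purely illustrative: it simply makes lighter colors louder, as she describes, and spreads different RGB mixes across an assumed range of pitches.

def rgb_to_tone(r, g, b):
    # Each score runs from 1 to 256, as measured by the three color sensors.
    brightness = (r + g + b) / (3 * 256)   # 0 means dark, 1 means light
    volume = brightness                    # lighter colors sound louder
    # Spread different color mixes over an audible pitch range.
    # The 200-to-2,000 hertz range and the weighting are illustrative
    # choices, not values taken from Jane's project.
    frequency = 200 + (4 * r + 2 * g + b) / (7 * 256) * 1800
    return frequency, volume

# A bright red and a dark blue produce clearly different tones:
print(rgb_to_tone(256, 30, 30))   # higher-pitched and fairly loud
print(rgb_to_tone(20, 20, 120))   # lower-pitched and quiet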
Nevertheless, when she tested her system with blind volunteers, they quickly learned to identify a few colors. Some of her tests involved the light reflected off of fruit. In about one-third of those trials, volunteers were able to correctly match the “sound” of the fruit with the tone generated by a similar-colored sheet of origami paper.
In a different set of tests, Jane tested her classmates (who were blindfolded) to see if they could match a particular tone they heard with the one created by a specific color they had memorized earlier. In these trials, her classmates were, on average, able to learn seven different colors in about 5 minutes.
Many people with visual impairments might benefit from Jane’s system. Indeed, this system would work for people who are totally blind, she notes.
Society for Science & the Public created the ISEF competition and has been running it since 1950. (The Society also runs Science News for Students and this blog.) Intel sponsored this year’s event.