Helping the blind use sound to “see”
A scientist and a student team have developed the Colorophone system, which translates colour into sound.
I close my eyes to test a new type of “glasses” for the blind. The glasses are connected to a headset that interprets the environment for me.
When I move my head from side to side, my ears pick up sounds at different frequencies via the headphones. The different sounds correspond to different colours. Little by little, and surprisingly quickly, I form an approximate inner image of the surroundings.
This is the Colorophone, a portable sensor system that can help the blind orient themselves in everyday life.
The technology was developed by Dominik Osinski, an assistant professor at NTNU’s Department of Electrical Engineering and Renewable Energy.
“The idea came to me on the bus. I sat looking out the window, seeing all the great colours streaming by, and thinking about how blind people can’t see this. Things began to ‘percolate’ in my head, and two days later I had a prototype ready,” says Osinski.
Students create new design
The prototype did not look particularly hip or user friendly, however. Osinski decided to launch the idea of product development for his undergraduates, and one of the student teams got fired up about the task. They have been developing the technology and created a new design for the Colorophone hardware in cooperation with the Norwegian Association of the Blind and Partially Sighted.
The students’ task was to design and build the hardware part of the system, and they also programmed the myRIO (Real-Time FPGA system) processing unit that transforms light and colour into sound waves.
The project was among the six finalist entries for the Northern European Student Design Contest, an annual competition held by National Instruments to reward students’ most innovative technical application projects. The students, Sindre Bjørsvik, Kawan Kandili, Jørgen Kapstad and Edwin de Pano, qualified for the finals in stiff competition with student projects from 30 other European countries.
Seeking solutions for over 100 years
Translating colours to sound is a challenge that many researchers have taken on. Sir Isaac Newton is among those who developed a theory about the relationship between colour and sound. Newton defined seven primary colours: red, orange, yellow, green, blue, indigo and violet. He connected these seven colours with seven notes. The problems with this method were that it only worked for people with perfect pitch, and that it couldn’t encode and render nuanced colours.
“The first electronic aid for the blind was developed as early as 1898. Today, more than 100 years later, not a single widely recognized electronic aid exists for the blind. The contrast between the development of consumer electronics for most people and the development of available electronic aids for individuals with special needs is huge,” says Osinski.
One of the existing solutions is the bionic eye implant, in which an array of electrodes is connected directly to the optic nerve. But this kind of prosthesis is very expensive and requires surgical intervention, and because 90 per cent of the world’s blind population live in poor countries, very few people have access to it.
Sensory substitution gives hope
Systems of sensory substitution, like the Colorophone, can be an inexpensive alternative. Sensory substitution means that information is sent to the brain through an alternative sensory channel, in this case through the ears instead of the eyes.
“New research in neuroscience shows that the brain is more task oriented than sensory oriented. The brain is more flexible than we used to think. In a sense, it’s our ‘plug and play unit’ and can adapt to analysing incoming information from different senses. We can activate the brain’s visual centre, the visual cortex, by sending visual information encoded as sound,” Osinski says.
The biggest challenge with sensory substitution systems that change images into sound is the difference in the amount of information that is transmitted through sight and hearing. The brain gets around 100 times more information through the eyes than through the ears.
“So that’s why we have to transmit what’s most important, which are colours,” says Osinski.
In addition, the Colorophone system conveys distance as ticking sounds, which improves the user’s ability to orient.
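One plausible way to realize such a distance-to-ticking mapping is to shorten the interval between ticks as an object gets closer. The sketch below illustrates the idea; the sensor range and tick rates are assumed values, not the Colorophone’s published parameters.

```python
# Illustrative sketch: map a proximity-sensor reading (in metres) to the
# interval between ticks, so that nearby objects tick faster. The range
# limits and rates here are assumptions for demonstration only.
def distance_to_tick_interval(distance_m, min_d=0.2, max_d=3.0,
                              fastest=0.05, slowest=1.0):
    """Return seconds between ticks: `fastest` at `min_d`, `slowest` at
    `max_d`, interpolated linearly in between."""
    d = max(min_d, min(max_d, distance_m))   # clamp to the sensor's range
    frac = (d - min_d) / (max_d - min_d)     # 0 = closest, 1 = farthest
    return fastest + frac * (slowest - fastest)
```

With these assumed parameters, an object 20 cm away would tick twenty times a second, while one three metres away would tick once a second.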
Quick and easy to learn
Osinski’s method and invention were inspired by the human visual system. The Colorophone prototype consists of a pair of glasses with a built-in camera, a proximity sensor, AfterShokz headphones and a processor (myRIO).
The camera takes a picture and sends digital RGB values (colours) to the processing device. The different RGB values are used to create a waveform, which is sent to the headphones. The user needs to learn to interpret sound frequencies in order to visualize the surroundings.
“Our colour perception comes from comparing the responses of different colour-sensitive cells called cones. We’re equipped with three different cone types that have high sensitivity to a range of light wavelengths. We perceive those wavelengths as red, green and blue,” says Osinski.
The Colorophone method links the colour red to a high-frequency sound, the colour green to a mid-frequency sound, blue to a low-frequency sound and white to a hushed noise. The encoding method is based on a psychological analysis of intuitive associations between colour and sound. The other colours in the colour spectrum are created from the three basic colours red, green and blue (RGB), and this method makes it possible to represent all the possible colours with easily recognizable sounds.
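The encoding described above can be sketched as mixing three sine tones whose loudness follows the R, G and B values, with the shared (white) component rendered as soft noise. The specific frequencies and the mixing rule below are illustrative assumptions, not the Colorophone’s published implementation.

```python
# A minimal sketch of an RGB-to-sound encoding along the lines described
# in the article. Frequencies and levels are assumed, illustrative values.
import numpy as np

SAMPLE_RATE = 44_100            # audio samples per second
FREQS = {"red": 1600.0,         # assumed high-frequency tone for red
         "green": 800.0,        # assumed mid-frequency tone for green
         "blue": 400.0}         # assumed low-frequency tone for blue

def rgb_to_waveform(r, g, b, duration=0.1):
    """Mix three sine tones whose amplitudes follow the R, G, B values
    (0-255). The component common to all three channels is treated as
    white and rendered as soft noise ('hushed noise' in the article)."""
    t = np.linspace(0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    white = min(r, g, b)                        # achromatic (white) component
    r, g, b = r - white, g - white, b - white   # chromatic remainders
    wave = (r / 255 * np.sin(2 * np.pi * FREQS["red"] * t)
            + g / 255 * np.sin(2 * np.pi * FREQS["green"] * t)
            + b / 255 * np.sin(2 * np.pi * FREQS["blue"] * t)
            + white / 255 * np.random.uniform(-0.3, 0.3, t.size))
    peak = np.max(np.abs(wave))
    return wave / peak if peak > 0 else wave    # normalize to [-1, 1]

# Pure red yields only the high tone; grey yields only the soft noise.
samples = rgb_to_waveform(255, 0, 0)
```

Separating the white component before mixing is one way to make bright, desaturated colours sound noticeably different from saturated ones, which the pure three-tone mix alone would not do.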
“This is a new way to experience colours. With this system we’re able to listen to a wide range of colours, without the user having to learn a wide range of frequencies,” says Osinski.
During the testing of the method and equipment, test subjects were able to identify 14 colours after 5 minutes of training time with an efficiency of 98.6 per cent.
Cheap to produce
The World Health Organization (WHO) has estimated that there are 285 million visually impaired people in the world. Of these, more than 39 million are blind and many of them live in poor countries.
“We hope that the Colorophone project will help towards developing a technology that will be available and affordable for many people, including those in poor countries,” says Bjørsvik, one of the students on the Colorophone team.
Equipment components are reasonably priced
Osinski says he wants to develop a system that is not only user-friendly and improves quality of life, but that also seamlessly integrates into the user’s everyday life. “You can compare it with reading glasses hanging around your neck after you’re done reading,” he says.
The next step will be to develop a new design of the glasses, with integrated blink control and an application that can be used on mobile phones. Osinski hopes to create an interdisciplinary project at NTNU related to the new technology, both in terms of research – to understand more of what happens in the brain when colours are transformed into sound – as well as design and usability.
He has already established a collaborative framework with psychology researchers from the UK and Poland who have built a laboratory for researching the Colorophone.
“Research-wise it will be very interesting to see if we can improve cognition through long-term use of the system,” Osinski says.