
Visually impaired hear the world thanks to radar

Created by Nils Pohl

Perceiving the world like a bat: a new technology is intended to make this possible for blind and visually impaired people.

Scanned by radar and translated into an acoustic image: this is how the environment is to be presented to blind and visually impaired people in the future so that they can orient themselves better. Researchers at Ruhr-Universität Bochum (RUB) are developing the corresponding sensor system together with partners from the field in the Ravis-3D project. The project is being funded for three years by the European Union and the state of North Rhine-Westphalia with a total of 1.8 million euros, of which 1.36 million euros go to RUB.

High-tech should help in everyday life

Until now, guide dogs and canes have been the main aids that help visually impaired people find their way around. The partners in the Ravis-3D project now want to improve everyday assistance with high-tech. Their plan: special sensors worn on the head or body scan the surroundings by radar. "Thanks to state-of-the-art radar technology, similar to that which will enable autonomous driving in the future, the sensor technology also works in poor lighting conditions and in the rain," explains project partner Prof. Dr. Nils Pohl from the Chair for Integrated Systems at RUB. The environment is then translated into audio signals in real time, creating a 3D audio environment that is conveyed to the user via a hearing aid.

Image: The rotating radar in the RUB researchers' anechoic chamber.
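
To make the principle tangible, here is a minimal sketch, not the project's actual processing: a single radar detection, given as range and azimuth, is mapped to a simple acoustic cue. The specific mapping (closer obstacle means higher pitch and louder sound, azimuth means stereo panning) and all names are illustrative assumptions.

import math

def detection_to_cue(range_m: float, azimuth_deg: float,
                     max_range_m: float = 10.0) -> dict:
    """Map one radar detection to pitch, loudness and stereo pan (illustrative only)."""
    proximity = max(0.0, 1.0 - range_m / max_range_m)   # 0 = far away, 1 = very close
    pitch_hz = 300.0 + 1200.0 * proximity               # closer obstacles sound higher
    loudness = 0.2 + 0.8 * proximity                    # closer obstacles sound louder
    pan = math.sin(math.radians(azimuth_deg))           # -1 = fully left, +1 = fully right
    return {"pitch_hz": pitch_hz, "loudness": loudness, "pan": pan}

print(detection_to_cue(range_m=2.0, azimuth_deg=-30.0))  # obstacle 2 m away, front-left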

The system is designed to represent obstacles and movements through different sound properties, such as pitch or audio frequency, coming from the correct direction. "To make the 3D audio environment usable, a very small but fast computing system is needed," says Prof. Dr. Michael Hübner of the Chair for Embedded Systems. "It has to process the radar data in real time and take into account movements of the user as well as rotations of the head. This allows it to generate a freely rotating 3D environment and output it via the hearing aid."
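
The head-rotation compensation Hübner describes can be illustrated with a small sketch, again not the project's implementation: obstacle positions measured in a world-fixed frame are rotated into the current head frame before rendering, so the acoustic scene stays anchored to the surroundings while the listener turns their head. The use of numpy, the 2D simplification and the frame convention (x forward, y left) are assumptions.

import numpy as np

def to_head_frame(points_world: np.ndarray, head_yaw_rad: float) -> np.ndarray:
    """Rotate Nx2 obstacle positions (x forward, y left) by the negative head yaw."""
    c, s = np.cos(-head_yaw_rad), np.sin(-head_yaw_rad)
    rotation = np.array([[c, -s],
                         [s,  c]])
    return points_world @ rotation.T

obstacles = np.array([[3.0, 0.0]])               # one obstacle, 3 m straight ahead
print(to_head_frame(obstacles, np.deg2rad(90)))  # after a 90-degree head turn the cue moves to the side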

The technology should make it possible for the user to recognize obstacles, estimate distances and move relatively naturally in the environment. SNAP GmbH, another project partner, will test, among other things, whether audio signals via a hearing aid are sufficient for orientation, or whether tactile signals, i.e. a brief tap on the skin, can provide additional assistance.

Interdisciplinary collaboration

The development of this innovative aid is possible because three chairs of the RUB Faculty of Electrical Engineering and Information Technology and numerous partners from industry are pooling their expertise. The team from the Chair for Integrated Systems is responsible for developing the radar sensor technology; it aims to further miniaturize the sensors and make them comfortable to wear. The team from the Chair for Embedded Systems is taking care of processing the radar data in real time and building a virtual 3D map from it. In close cooperation with the Institute of Communication Acoustics, real-time capable algorithms are to be developed that present this virtual map of the environment to the user via an acoustic 3D display in a pleasant, high-resolution form.
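
This division of labour can be pictured as three cooperating stages. The following sketch is purely illustrative and uses invented class and function names: a radar front end delivers detections, an embedded stage keeps a small 3D map up to date, and a rendering stage turns that map into per-obstacle cues.

from dataclasses import dataclass

@dataclass
class Detection:
    x: float   # metres, forward
    y: float   # metres, left
    z: float   # metres, up

class ObstacleMap:
    """Tiny stand-in for the real-time 3D map built from the radar data."""
    def __init__(self) -> None:
        self.obstacles: list[Detection] = []

    def update(self, detections: list[Detection]) -> None:
        # A real system would filter, cluster and track over time; here we just replace.
        self.obstacles = detections

def render_frame(obstacle_map: ObstacleMap) -> list[str]:
    """Turn the current map into human-readable cue descriptions."""
    cues = []
    for obstacle in obstacle_map.obstacles:
        distance = (obstacle.x**2 + obstacle.y**2 + obstacle.z**2) ** 0.5
        side = "left" if obstacle.y > 0 else "right" if obstacle.y < 0 else "ahead"
        cues.append(f"obstacle {distance:.1f} m, {side}")
    return cues

scene = ObstacleMap()
scene.update([Detection(2.0, -0.5, 0.0), Detection(5.0, 1.5, 0.0)])
print(render_frame(scene))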

"Low latency and individual acoustic adaptation are the presumed key factors here for the perceived quality of virtual acoustics," says Dr. habil. Gerald Enzner. The researchers attach particular importance to the fact that the system can also be used intuitively and that the sound presentation does not generate any stress.
