People with visual impairment cannot see the things around them, but they can smell scents, hear sounds, and explore their surroundings by touch. What if there were a new way for them to perceive the beauty of the world? Music Bond combines bone-conduction audio (the ears stay free), 3D audio, eye- and head-position tracking, object-distance recognition and augmented reality. All of these technologies already exist, but this is the first time they have been brought together for the visually impaired. An IR emitter and sensor scan the surrounding space in 3D, recognising obstacles, their shape and their distance. The higher the obstacle, the higher the pitch of the sound it produces; the nearer the object, the louder the sound. A special algorithm combines the notes into tonal chords, in the same way that computers help contemporary musicians turn algorithms into music.
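The height-to-pitch and distance-to-loudness mapping described above can be sketched in a few lines. This is only an illustrative model: the frequency range, maximum range and the function names (`obstacle_tone`, `chord`) are assumptions, not Music Bond's published algorithm.

```python
# Hypothetical sketch of the sonification mapping described above:
# higher obstacles -> higher pitch, nearer obstacles -> louder tone.
MIN_FREQ = 220.0   # Hz, lowest obstacle tone (assumed)
MAX_FREQ = 880.0   # Hz, highest obstacle tone (assumed)
MAX_RANGE = 5.0    # metres; beyond this an obstacle is silent (assumed)

def obstacle_tone(height_m: float, distance_m: float,
                  max_height_m: float = 2.5) -> tuple[float, float]:
    """Map one obstacle to a (frequency_hz, amplitude_0_to_1) pair."""
    h = min(max(height_m / max_height_m, 0.0), 1.0)
    freq = MIN_FREQ * (MAX_FREQ / MIN_FREQ) ** h   # exponential pitch scale
    amp = max(0.0, 1.0 - distance_m / MAX_RANGE)   # linear fade with distance
    return freq, amp

def chord(obstacles: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Combine several (height_m, distance_m) obstacles into one chord."""
    return [obstacle_tone(h, d) for h, d in obstacles]

# A tall, near wall and a low, distant kerb produce a two-note chord:
print(chord([(2.0, 1.0), (0.2, 4.0)]))
```

Playing each (frequency, amplitude) pair as a sine tone would then render the whole scene as a single chord that changes as the wearer moves.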
With the help of phase-shifted sound and the binaural effect, 3D audio images of the objects around the wearer can be created. Using sensors (electrodes placed around the eyes), Music Bond reads the movement of the wearer's eyes, which strengthens the feeling of being able to "see". If a person is not able to move their eyes, the eye-tracking function can be disabled. A photo camera and voice-assistance technology help Music Bond recognise objects: simply press a special button on the device and wait for the voice assistant to say what the object is. By placing a palm on the interactive surface and using simple gestures, the wearer can easily control functions such as volume, sensitivity and voice-assistance activation.
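The binaural effect mentioned above rests on a simple principle: a sound reaches the nearer ear slightly earlier than the farther one, and the brain reads that delay as direction. The sketch below uses Woodworth's textbook approximation of the interaural time difference (ITD); the head radius and function names are assumptions for illustration, not Music Bond's actual implementation.

```python
import math

HEAD_RADIUS = 0.0875    # metres, average adult head (assumed)
SPEED_OF_SOUND = 343.0  # metres per second in air

def interaural_time_difference(azimuth_deg: float) -> float:
    """Woodworth's approximation of the ITD for a source at the given
    azimuth (0 deg = straight ahead, 90 deg = fully to one side)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def binaural_samples(freq_hz: float, azimuth_deg: float,
                     t: float) -> tuple[float, float]:
    """Return (left, right) samples of a sine tone at time t, with the
    far ear's signal delayed by the ITD (a pure phase shift)."""
    itd = interaural_time_difference(abs(azimuth_deg))
    near = math.sin(2 * math.pi * freq_hz * t)
    far = math.sin(2 * math.pi * freq_hz * (t - itd))
    # Positive azimuth = source on the right, so the right ear leads.
    return (far, near) if azimuth_deg >= 0 else (near, far)

# A source 90 degrees to the right arrives roughly 0.66 ms earlier
# at the right ear than at the left:
print(f"{interaural_time_difference(90.0) * 1000:.2f} ms")
```

Applying this delay (often together with a level difference between the ears) to each obstacle's tone is what lets a stereo or bone-conduction headset place sounds around the listener.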