An AI breakthrough for the visually impaired
A Chinese team has developed a prototype of connected eyewear that continuously analyzes the environment in which the wearer is moving, powered by AI trained through machine learning. The device is highly responsive: it scans the surrounding environment at a rate of one image every 250 milliseconds (four images per second). Based on what the AI perceives, the device then relays information to the wearer through auditory alerts (voice commands) delivered via a bone conduction system. The same team has also developed skin patches that vibrate as the wearer approaches an obstacle.
These connected glasses have been tested on robots and visually impaired participants in both virtual and real environments. The results, recently published in the journal Nature Machine Intelligence, showed the device to be highly reliable across a variety of tests, such as navigating a maze or grasping an object. Because the device was tested on only a small number of patients, these results will need to be reproduced in a larger population.
Whether to help diagnose eye diseases or to provide visual assistance to patients, AI is likely to be used more and more. Such solutions could be a real revolution for visually impaired people: they promise greater independence and confidence, ultimately enabling users to move around more easily in unfamiliar environments.