Remote Virtual Companion via Tactile Codes and Voices for People with Visual Impairment
Authors: GE Song, HUANG Xuan-Tuo, LIN Yan-Ni, LI Yan-Cheng, DONG Wen-Tian, DANG Wei-Min, XU Jing-Jing, YI Ming, XU Sheng-Yong

Affiliations:
1) Key Laboratory for the Physics & Chemistry of Nanodevices, School of Electronics, Peking University, Beijing 100871, China
2) School of Electronics Engineering and Computer Science, Peking University, Beijing 100871, China
3) Peking University Sixth Hospital, Peking University Institute of Mental Health, NHC Key Laboratory of Mental Health (Peking University), National Clinical Research Center for Mental Disorders (Peking University Sixth Hospital), Beijing 100191, China
4) School of Microelectronics, Shandong University, Jinan 250100, China
5) Key Laboratory for Neuroscience, School of Basic Medical Sciences, Neuroscience Research Institute, Department of Neurobiology, School of Public Health, Peking University, Beijing 100191, China

Fund Project: This work was supported by a grant from the National Key Research and Development Program of China (2017YFA0701302).

Abstract:

Objective: Existing artificial vision devices fall into two types, implanted and extracorporeal, each with drawbacks. The former requires surgical implantation, which may cause irreversible trauma; the latter suffers from relatively simple instructions, limited application scenarios, and an over-reliance on the judgment of artificial intelligence (AI) that cannot guarantee sufficient safety. Here we propose a system that supports voice interaction and converts information about the surrounding environment into tactile commands delivered to the head and neck. Compared with existing extracorporeal devices, our device provides a larger information capacity, costs less, carries lower risk, and suits a wide variety of life and work scenarios.

Methods: Using the latest long-range wireless communication and chip technologies, the microelectronic devices, cameras and sensors worn by the user, and the large databases and computing power available in the cloud, backend staff can gain full insight into the user's scenario, environmental parameters and status remotely (for example, from across the city) in real time. Meanwhile, by querying the cloud and in-memory databases, aided by AI-assisted recognition and manual analysis, they can quickly formulate the most reasonable action plan and send instructions to the user. The backend staff can also provide humanistic care and emotional support through voice dialogs.

Results: This study introduces the concept of a "remote virtual companion" and demonstrates the related hardware and software, together with test results. The system not only achieves basic guide functions, for example helping a person with visual impairment shop in a supermarket, find a seat in a café, walk on the street, assemble complex puzzles, and play cards, but also meets the demands of fast-paced daily tasks such as cycling.

Conclusion: Experimental results show that this "remote virtual companion" is applicable to a wide range of scenarios and needs. It can assist blind people with travel, shopping and entertainment, or accompany elderly people on outings, wilderness explorations, and trips.
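To make the tactile-command idea above concrete, the Python sketch below shows one possible way to map a small set of navigation instructions onto pulse patterns for a head/neck vibration-motor array. This is a minimal sketch under stated assumptions: the four-motor layout, the Command set, the PATTERNS table, and the drive_motor callback are all hypothetical names introduced here for illustration, not the authors' actual coding scheme, which is described in the full text.

    # A minimal sketch, assuming a 4-motor neck-worn array; motor layout,
    # command set, and pulse timings are illustrative assumptions.
    import time
    from enum import Enum

    class Command(Enum):
        GO_STRAIGHT = "go_straight"
        TURN_LEFT = "turn_left"
        TURN_RIGHT = "turn_right"
        STOP = "stop"

    # Hypothetical motor indices: 0 = front, 1 = left, 2 = right, 3 = back.
    # Each pattern is a list of (motor_index, pulse_duration_seconds).
    PATTERNS = {
        Command.GO_STRAIGHT: [(0, 0.2)],            # one short pulse, front
        Command.TURN_LEFT:   [(1, 0.2), (1, 0.2)],  # two short pulses, left
        Command.TURN_RIGHT:  [(2, 0.2), (2, 0.2)],  # two short pulses, right
        Command.STOP:        [(3, 0.6)],            # one long pulse, back
    }

    def send_tactile(command, drive_motor):
        """Play the pulse pattern for `command`; `drive_motor(index, seconds)`
        stands in for the real actuator driver fed by the backend link."""
        for motor_index, duration in PATTERNS[command]:
            drive_motor(motor_index, duration)
            time.sleep(0.1)  # gap so consecutive pulses stay distinguishable

    if __name__ == "__main__":
        # Dummy driver that logs instead of vibrating, for a quick check.
        send_tactile(Command.TURN_LEFT,
                     lambda i, d: print(f"motor {i} on for {d:.1f}s"))

Keeping the command-to-pulse mapping in a small lookup table is one plausible design choice for such a wearable: it keeps each pattern short enough to learn by touch and easy to re-map per user.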

Get Citation

GE Song, HUANG Xuan-Tuo, LIN Yan-Ni, LI Yan-Cheng, DONG Wen-Tian, DANG Wei-Min, XU Jing-Jing, YI Ming, XU Sheng-Yong. Remote Virtual Companion via Tactile Codes and Voices for People with Visual Impairment[J]. Progress in Biochemistry and Biophysics, 2024, 51(1): 158-176

History
  • Received: February 22, 2023
  • Revised: December 06, 2023
  • Accepted: April 13, 2023
  • Online: January 19, 2024
  • Published: January 20, 2024