AI-on-skin: On-body AI inference for artificial skin interfaces

Neural-network-based inference is increasingly being combined with artificial skin sensors in applications ranging from robotics to health monitoring. However, because current skin sensors lack on-skin compute, they must offload inference to off-body devices such as phones and cloud servers, which significantly slows down real-time response.

For instance, for a person with a prosthetic arm, on-skin neural network compute can enable the skin interface integrated with the prosthetic to respond to touch, pressure, heat, and other stimuli in real time. We thus see a need for a fast, low-power on-skin AI compute engine (neural network accelerator) integrated with current artificial skins. In addition, the on-skin AI compute engine has to be highly configurable, so that it can run diverse neural networks and support multiple applications.

We developed AI-on-skin, a wearable artificial skin interface integrated with a neural network hardware accelerator that can be reconfigured across diverse neural network models and applications. AI-on-skin is designed to scale to the entire body, comprising tiny, low-power accelerators distributed across the body.
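To give a rough sense of the kind of computation such an on-skin accelerator performs, the sketch below runs a tiny quantized two-layer network over a frame of skin-sensor readings in NumPy. This is an illustrative sketch only: the layer shapes, weights, quantization scheme, and sensor layout are made up for illustration and are not the models, API, or hardware pipeline from the AI-on-skin paper.

# Illustrative sketch only: a tiny int8 two-layer network of the kind an
# on-skin accelerator could execute. All shapes, weights, and the sensor
# layout below are hypothetical, not taken from the AI-on-skin prototype.
import numpy as np

def quantized_dense(x_q, w_q, bias_q, scale):
    # Integer multiply-accumulate followed by rescaling, as a fixed-point
    # accelerator would compute a fully connected layer.
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32) + bias_q
    return np.clip(np.round(acc * scale), -128, 127).astype(np.int8)

# Hypothetical example: 64 quantized sensor readings -> 26 letter classes.
rng = np.random.default_rng(0)
sensors = rng.integers(-128, 127, size=64, dtype=np.int8)    # one sensor frame
w1 = rng.integers(-128, 127, size=(64, 32), dtype=np.int8)   # layer 1 weights
b1 = rng.integers(-1000, 1000, size=32, dtype=np.int32)      # layer 1 biases
w2 = rng.integers(-128, 127, size=(32, 26), dtype=np.int8)   # layer 2 weights
b2 = rng.integers(-1000, 1000, size=26, dtype=np.int32)      # layer 2 biases

hidden = quantized_dense(sensors, w1, b1, scale=2**-7)
logits = quantized_dense(hidden, w2, b2, scale=2**-7)
print("predicted class:", int(np.argmax(logits)))

Keeping the arithmetic in low-precision integers is what makes this kind of model a good fit for a small, low-power accelerator, since it avoids floating-point hardware entirely.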

We built a prototype of AI-on-skin that covers the entire forearm (17 by 10 cm), based on off-the-shelf FPGAs. Our electronic-skin prototype can perform (a) handwriting recognition with 96% accuracy, (b) gesture recognition with 95% accuracy, and (c) handwritten word recognition with 93.5% accuracy. AI-on-skin achieves 20X and 35X speedups over off-body inference via Bluetooth and on-body microcontroller-based inference, respectively. To the best of our knowledge, AI-on-skin is the first wearable prototype to demonstrate skin interfaces with on-body neural network inference.

GitHub code, AI-on-skin prototypes, and the real-time applications supported by AI-on-skin will be released soon!


Handwritten English alphabet recognition (96%)
Handwritten four-letter English word recognition (95%)
Gesture and drawing recognition (93.5%)

Video

The 5-minute video presentation at CHI 2021.

Video preview at CHI 2021.

Our Paper

AI-on-skin: Enabling On-body AI Inference for Wearable Artificial Skin Interfaces (Slides)
CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, May 2021
Ananta Narayanan Balaji and Li-Shiuan Peh
@inproceedings{10.1145/3411763.3451689,
  author = {Balaji, Ananta Narayanan and Peh, Li-Shiuan},
  title = {AI-on-Skin: Enabling On-Body AI Inference for Wearable Artificial Skin Interfaces},
  year = {2021},
  isbn = {9781450380959},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3411763.3451689},
  doi = {10.1145/3411763.3451689},
  booktitle = {Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems},
  articleno = {358},
  numpages = {7},
  keywords = {Artificial Skin interfaces, Artificial intelligence accelerators, Neural networks, Wearable Computing, Handwritten text recognition},
  location = {Yokohama, Japan},
  series = {CHI EA '21}
  }

Team

The team is with the Department of Electrical and Computer Engineering and the Department of Computer Science at the National University of Singapore.

Acknowledgements

The authors acknowledge the support from the Singapore National Research Foundation: NRF-RSS2016-005.