
UbiComp 2016: Enabling Fine-Grained Hand Gesture Detection by Decoding Echo Signals

Collaborating with researchers from Princeton University and Tsinghua University, we present AudioGest, a system that accurately decodes hand motion by turning the commodity (COTS) microphone and speaker of a mobile device into an active sonar. The paper has been accepted to UbiComp 2016.

 

AudioGest: Enabling Fine-Grained Hand Gesture Detection by Decoding Echo Signals. Wenjie Ruan, Quan Z. Sheng, Lei Yang, Tao Gu, Peipei Xu, and Longfei Shangguan. The 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2016). Heidelberg, Germany, 12-16 September, 2016.

Abstract: Hand gestures are becoming an increasingly popular means of interacting with consumer electronic devices such as mobile phones, tablets, and laptops. In this paper, we present AudioGest, a device-free gesture recognition system that can accurately sense in-air hand movements around a user's device. Compared to the state of the art, AudioGest needs only a single pair of built-in speaker and microphone, with no extra hardware, infrastructure support, or training, to achieve fine-grained hand detection. Our system can accurately recognize various hand gestures and estimate the hand's in-air time, average moving speed, and waving range. We achieve this by transforming the device into an active sonar system that transmits an inaudible audio signal and decodes the hand's echoes at the microphone. We address several challenges, including denoising the reflected sound signal, interpreting the echo spectrogram as hand gestures, decoding Doppler frequency shifts into hand waving speed and range, and remaining robust to environmental motion and signal drift. We extensively evaluated the system in four real-world scenarios with five users over more than two weeks. The experimental results show that AudioGest detects six hand gestures with high accuracy and, by distinguishing gesture attributes, can provide up to 162 control commands for various applications.
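The key decoding step rests on the Doppler effect: a hand moving toward or away from the device shifts the frequency of the reflected tone, and the size of that shift reveals the hand's speed. The Python sketch below illustrates the idea under assumed parameters (a 20 kHz pilot tone, 48 kHz sampling, a simple FFT peak search); it is a minimal illustration of how an echo's frequency shift maps to hand speed, not the paper's actual implementation.

```python
# Hypothetical sketch of an active-sonar Doppler pipeline: emit an inaudible
# tone, record the echo, find the dominant frequency shift near the carrier,
# and convert that shift to hand speed. All parameter values are assumptions.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature (assumption)
CARRIER_HZ = 20_000.0    # inaudible pilot tone (assumed value)
SAMPLE_RATE = 48_000     # common mobile audio sample rate (assumption)

def pilot_tone(duration_s: float) -> np.ndarray:
    """Generate the inaudible tone to play through the built-in speaker."""
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    return np.sin(2 * np.pi * CARRIER_HZ * t)

def doppler_shift(recorded: np.ndarray, win: int = 4096) -> float:
    """Estimate the echo's frequency shift (Hz) relative to the carrier,
    using the strongest FFT bin in a narrow band around the carrier."""
    windowed = recorded[:win] * np.hanning(win)
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(win, d=1.0 / SAMPLE_RATE)
    # Only search within +/- 200 Hz of the carrier to ignore ambient noise.
    band = (freqs > CARRIER_HZ - 200) & (freqs < CARRIER_HZ + 200)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq - CARRIER_HZ

def hand_speed(shift_hz: float) -> float:
    """With a co-located speaker and microphone, a reflector moving at speed v
    shifts the echo by about 2*v*f0/c, so v = shift * c / (2 * f0)."""
    return shift_hz * SPEED_OF_SOUND / (2 * CARRIER_HZ)
```

For example, a measured shift of about 40 Hz on a 20 kHz carrier corresponds to a hand speed of roughly 0.34 m/s under these assumptions; integrating such estimates over time gives the waving range.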

 
