TY - JOUR
T1 - An End to End Wearable Device and System for Indefinite, Continuous, Real Time Gesture Recognition of Directional and Shape-Based Arm Gestures
AU - Gambo, Abdurrahman Aliyu
AU - Ali, Emmanuel Yahi
AU - Arungbemi, David Adeshina
AU - Hanif, Mehwish
AU - Anefu, Praise Ngbede
AU - Ali, Nyangwarimam Obadiah
AU - Thomas, Sadiq
AU - Chinda, Francis Emmanuel
AU - May, Zazilah
AU - Qureshi, Saima
AU - Kashif, Muhammad
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2025
Y1 - 2025
N2 - Human–Computer Interfaces designate how individuals interact with digital devices and systems. Often these interfaces heavily involve fine motor control via keyboards, controllers, and touch screens. These modalities tend to exclude users with disabilities, such as amputees or those affected by carpal tunnel syndrome. This paper proposes the creation of a wearable gesture recognition system that enables hands-free device interaction through the detection of user movements and gestures. The system comprises a wearable device with an onboard inertial measurement unit (IMU), a complementary filter for orientation estimation, and a gesture classification pipeline that recognizes a set of directional, rotational, and shape-based gestures. We evaluate two models: an XGBoost classifier using IMU data preprocessed with a Mahony filter and Wavelet Transform feature extraction, and a Convolutional Neural Network (CNN) operating directly on raw IMU data with Mahony-based orientation estimation. On the benchmark 6DMG dataset, our models achieve accuracies of 99.55% and 99.18%, respectively, while our custom-collected dataset yields 94.81% and 96.28%. In real-time, continuous gesture tracking, the system achieves an average recognition accuracy of 92.33% with an average sensor-to-prediction latency of 20 milliseconds. Our system is actualized through a modular application programming interface (API) and graphical user interface (GUI), enabling real-time software interaction through movements and recognized gestures. Compared to existing methods, our approach achieves high accuracy, continuous real-time performance with low latency, and scalable integration with external software.
AB - Human–Computer Interfaces designate how individuals interact with digital devices and systems. Often these interfaces heavily involve fine motor control via keyboards, controllers, and touch screens. These modalities tend to exclude users with disabilities, such as amputees or those affected by carpal tunnel syndrome. This paper proposes the creation of a wearable gesture recognition system that enables hands-free device interaction through the detection of user movements and gestures. The system comprises a wearable device with an onboard inertial measurement unit (IMU), a complementary filter for orientation estimation, and a gesture classification pipeline that recognizes a set of directional, rotational, and shape-based gestures. We evaluate two models: an XGBoost classifier using IMU data preprocessed with a Mahony filter and Wavelet Transform feature extraction, and a Convolutional Neural Network (CNN) operating directly on raw IMU data with Mahony-based orientation estimation. On the benchmark 6DMG dataset, our models achieve accuracies of 99.55% and 99.18%, respectively, while our custom-collected dataset yields 94.81% and 96.28%. In real-time, continuous gesture tracking, the system achieves an average recognition accuracy of 92.33% with an average sensor-to-prediction latency of 20 milliseconds. Our system is actualized through a modular application programming interface (API) and graphical user interface (GUI), enabling real-time software interaction through movements and recognized gestures. Compared to existing methods, our approach achieves high accuracy, continuous real-time performance with low latency, and scalable integration with external software.
KW - Computer accessibility
KW - convolutional neural networks
KW - event-based application program interface
KW - gesture recognition
KW - human computer interaction
KW - user interfaces
KW - wearable devices
UR - https://www.scopus.com/pages/publications/105014773807
U2 - 10.1109/ACCESS.2025.3602871
DO - 10.1109/ACCESS.2025.3602871
M3 - Article
AN - SCOPUS:105014773807
SN - 2169-3536
VL - 13
SP - 153436
EP - 153463
JO - IEEE Access
JF - IEEE Access
ER -