Classification of human grasp forces in activities of daily living using a deep neural network
Jesus Fernando Padilla-Magaña, Isahi Sanchez-Suarez, Esteban Peña-Pitarch
Abstract
The study of human grasp forces is fundamental to the development of rehabilitation programs and the design of prosthetic hands aimed at restoring hand function. The purpose of this work was to classify multiple grasp types used in activities of daily living (ADLs) based on finger force data. To this end, we developed a deep neural network (DNN) model using finger forces recorded during six tests with a novel force sensing resistor (FSR) glove system. A study was carried out with 25 healthy, right-handed subjects (mean age: 35.4 ± 11.6 years). The DNN classifier showed high overall performance, obtaining an accuracy of 93.19%, a precision of 93.33%, and an F1-score of 91.23%. The DNN classifier, in combination with the FSR glove system, is therefore a valuable tool for physiotherapists and other health professionals to identify finger grasp force patterns. The DNN model will facilitate the development of tailored, personalized rehabilitation programs for subjects recovering from hand injuries and other hand diseases. In future work, prosthetic hand devices could be optimized to reproduce natural grasping patterns more accurately.
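The abstract does not specify the network architecture, the number of force channels, or how the reported metrics were averaged. The sketch below is a hypothetical illustration only: a small feed-forward classifier in Python/Keras, assuming one FSR channel per finger (NUM_SENSORS = 5) and one class per grasp test (NUM_CLASSES = 6); the layer sizes, optimizer, and macro-averaged precision/F1 are illustrative assumptions, not the paper's reported setup.

# Hypothetical sketch of a DNN grasp-type classifier. Inputs are
# per-sample vectors of FSR finger forces; outputs are grasp classes.
# All architecture choices here are assumptions for illustration.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.metrics import precision_score, f1_score

NUM_SENSORS = 5   # assumed: one FSR per finger
NUM_CLASSES = 6   # assumed: one class per grasp test

model = keras.Sequential([
    layers.Input(shape=(NUM_SENSORS,)),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.2),
    layers.Dense(32, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic stand-in data: real inputs would be the glove's force readings.
X = np.random.rand(1000, NUM_SENSORS).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=1000)
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2, verbose=0)

# Precision and F1 as in the abstract; macro averaging is an assumption.
y_pred = model.predict(X, verbose=0).argmax(axis=1)
print(precision_score(y, y_pred, average="macro"),
      f1_score(y, y_pred, average="macro"))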
Keywords
Deep learning; Finger force; Force sensors; Human grasp; Rehabilitation
DOI:
https://doi.org/10.11591/eei.v13i6.7181
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.