TY - GEN
T1 - Feature selection for hand pose recognition in human-robot object exchange scenario
AU - Rasines, Irati
AU - Remazeilles, Anthony
AU - Bengoa, Pedro M. Iriondo
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/1/8
Y1 - 2014/1/8
N2 - Vision-based hand gesture recognition relies on the extraction of features describing the hand, and the appropriate set of features is usually selected empirically. In this article we propose a systematic selection of the best features to consider. An iterative sequential forward feature selection (SFS) approach is proposed to combine the features yielding the highest recognition rate, using Gaussian Mixture Models trained with the Expectation-Maximization algorithm as the classification technique. This approach has been tested with two illustrative databases. The first is related to human-robot physical interaction, and the hand postures considered correspond to the key postures the human partner performs just before acquiring an object from the robot. The second database corresponds to the representation of the first 10 numbers of American Sign Language. In both cases, the recognition rate obtained, measured through the F1-score metric, is satisfactory (over 0.97), demonstrating that the proposed technique could be applied to a very wide range of applications.
AB - Vision-based hand gesture recognition relies on the extraction of features describing the hand, and the appropriate set of features is usually selected empirically. In this article we propose a systematic selection of the best features to consider. An iterative sequential forward feature selection (SFS) approach is proposed to combine the features yielding the highest recognition rate, using Gaussian Mixture Models trained with the Expectation-Maximization algorithm as the classification technique. This approach has been tested with two illustrative databases. The first is related to human-robot physical interaction, and the hand postures considered correspond to the key postures the human partner performs just before acquiring an object from the robot. The second database corresponds to the representation of the first 10 numbers of American Sign Language. In both cases, the recognition rate obtained, measured through the F1-score metric, is satisfactory (over 0.97), demonstrating that the proposed technique could be applied to a very wide range of applications.
KW - Feature selection
KW - Gaussian Mixture Models
KW - Human-Robot interaction
KW - SFS
KW - Vision-based hand static gesture recognition
UR - http://www.scopus.com/inward/record.url?scp=84946685486&partnerID=8YFLogxK
U2 - 10.1109/ETFA.2014.7005139
DO - 10.1109/ETFA.2014.7005139
M3 - Conference contribution
AN - SCOPUS:84946685486
T3 - 19th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2014
BT - 19th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2014
A2 - Martinez Garcia, Herminio
A2 - Grau, Antoni
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 19th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2014
Y2 - 16 September 2014 through 19 September 2014
ER -