TY - GEN
T1 - A signal processing perspective on human gait
T2 - 4th International Conference on Interactive Collaborative Robotics, ICR 2019
AU - Gregorj, Adrien
AU - Yücel, Zeynep
AU - Hara, Sunao
AU - Monden, Akito
AU - Shiomi, Masahiro
N1 - Funding Information:
Acknowledgments. This work was supported by JSPS KAKENHI Grant Numbers JP18K18168 and JP18H04121. We would like to thank S. Koyama, H. Nguyen, P. Supitayakul and T. Pramot for their help with annotation and F. Zanlungo for his invaluable discussions.
Publisher Copyright:
© Springer Nature Switzerland AG 2019.
PY - 2019/1/1
Y1 - 2019/1/1
AB - This study focuses on gesture recognition in mobile interaction settings, i.e., when the interacting partners are walking. This kind of interaction requires particular coordination, e.g., staying in the partner's field of view, avoiding obstacles without disrupting the group composition, and sustaining joint attention during motion. Various studies in the literature have shown that gestures play a key role in achieving such goals. Thus, a mobile robot moving in a group with human pedestrians has to identify such gestures to sustain group coordination. However, decoupling gestures from the oscillations inherent to walking is a major challenge for the robot. To that end, we adopt a signal processing approach and detect arm gestures performed by human-human pedestrian pairs in video data recorded in uncontrolled settings. Namely, we exploit the fact that gait induces an inherent oscillatory motion of the upper limbs, independent of the viewing angle or the distance of the subject to the camera. We identify arm gestures as disturbances of these oscillations. To do so, we use a simple pitch detection method from speech processing and assume that data exhibiting low-frequency periodicity are free of gestures. In testing, we employ a video data set recorded in uncontrolled settings and show that we achieve a detection rate of 0.80.
KW - Gesture
KW - Group coordination
KW - Pedestrian
KW - Social robot
UR - http://www.scopus.com/inward/record.url?scp=85071483996&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85071483996&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-26118-4_8
DO - 10.1007/978-3-030-26118-4_8
M3 - Conference contribution
AN - SCOPUS:85071483996
SN - 9783030261177
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 75
EP - 85
BT - Interactive Collaborative Robotics - 4th International Conference, ICR 2019, Proceedings
A2 - Rigoll, Gerhard
A2 - Ronzhin, Andrey
A2 - Meshcheryakov, Roman
PB - Springer Verlag
Y2 - 20 August 2019 through 25 August 2019
ER -