Adaptation of robot perception on fuzzy linguistic information by evaluating vocal cues for controlling a robot manipulator

A. G.B.P. Jayasekara, Keigo Watanabe, Kazuo Kiguchi, Kiyotaka Izumi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper proposes a method for adapting a robot's perception of fuzzy linguistic information by evaluating vocal cues. A robot's interpretation of fuzzy linguistic information such as "very little" depends on the environmental arrangement and the user's expectations. Therefore, the robot's perception of the corresponding environment is modified by acquiring the user's perception through vocal cues. Fuzzy linguistic information related to primitive movements is evaluated by a behavior evaluation network (BEN). A vocal cue evaluation system (VCES) is utilized to evaluate the vocal cues for modifying the BEN. The proposed system is implemented on a PA-10 robot manipulator.
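The core idea in the abstract — shifting the robot's interpretation of a fuzzy term such as "very little" toward the user's intent — can be illustrated with a minimal sketch. The triangular membership function, the `LinguisticTerm` class, the update rate, and the centimeter scale below are all illustrative assumptions; the paper's actual BEN/VCES mechanism is not reproduced here.

```python
# Hypothetical sketch: adapting a fuzzy membership function for the
# linguistic term "very little" from user feedback. The update rule
# and parameters are assumptions, not the paper's BEN/VCES method.

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

class LinguisticTerm:
    """A fuzzy term whose support shifts toward user-corrected values."""
    def __init__(self, a, b, c, rate=0.3):
        self.a, self.b, self.c = a, b, c
        self.rate = rate  # adaptation step size (assumed)

    def membership(self, x):
        return triangular(x, self.a, self.b, self.c)

    def adapt(self, corrected_value):
        # Shift the whole support toward the value the user indicated
        # via a vocal cue (e.g., "no, move only a tiny bit").
        shift = self.rate * (corrected_value - self.b)
        self.a += shift
        self.b += shift
        self.c += shift

# "very little" initially means ~5 cm of motion (distance scale assumed)
very_little = LinguisticTerm(0.0, 5.0, 10.0)
very_little.adapt(3.0)  # user feedback implies ~3 cm was intended
```

After the correction the peak of "very little" moves from 5.0 toward 3.0 (to 4.4 with a rate of 0.3), so subsequent commands using that term produce smaller motions — the same qualitative behavior the abstract describes for perception adaptation through vocal cues.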

Original language: English
Title of host publication: Proceedings of the 15th International Symposium on Artificial Life and Robotics, AROB 15th'10
Pages: 926-929
Number of pages: 4
Publication status: Published - Dec 1 2010
Event: 15th International Symposium on Artificial Life and Robotics, AROB '10 - Beppu, Oita, Japan
Duration: Feb 4 2010 - Feb 6 2010

Publication series

Name: Proceedings of the 15th International Symposium on Artificial Life and Robotics, AROB 15th'10

Other

Other: 15th International Symposium on Artificial Life and Robotics, AROB '10
Country/Territory: Japan
City: Beppu, Oita
Period: 2/4/10 - 2/6/10

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction
