Episodic Memory Multimodal Learning for Robot Sensorimotor Map Building and Navigation

Wei Hong Chin, Yuichiro Toda, Naoyuki Kubota, Chu Kiong Loo, Manjeevan Seera

Research output: peer-reviewed

7 citations (Scopus)

Abstract

In this paper, an unsupervised learning model of episodic memory is proposed. The proposed model, enhanced episodic memory adaptive resonance theory (EEM-ART), categorizes and encodes a robot's experiences of its environment and generates a cognitive map. EEM-ART consists of multilayer ART networks that extract novel events and encode spatio-temporal connections as episodes by incrementally generating cognitive neurons. The model links episodes to construct a sensorimotor map that allows the robot to continuously perform path planning and goal navigation. Experimental results on a mobile robot indicate that EEM-ART can process multiple sensory sources, learning events and encoding episodes simultaneously. The model overcomes perceptual aliasing and robot localization problems by recalling the encoded episodes with a new anticipation function, and it generates a sensorimotor map that connects episodes so that tasks can be executed continuously with little to no human intervention.
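
The abstract does not give the EEM-ART equations, but the core ART mechanism it builds on, vigilance-driven matching with incremental creation of category neurons for novel inputs, can be sketched as a generic Fuzzy ART layer. The sketch below is an illustrative assumption rather than the paper's implementation; the class name `FuzzyART` and the parameters `vigilance`, `alpha`, and `beta` follow standard Fuzzy ART conventions and are not taken from EEM-ART.

```python
import numpy as np

def complement_code(x):
    """Standard Fuzzy ART preprocessing: concatenate x with 1 - x."""
    x = np.asarray(x, dtype=float)
    return np.concatenate([x, 1.0 - x])

class FuzzyART:
    """Minimal Fuzzy ART layer: a new category neuron is committed whenever
    no existing category passes the vigilance test (novel-event detection)."""

    def __init__(self, vigilance=0.75, alpha=0.001, beta=1.0):
        self.rho = vigilance   # vigilance threshold in [0, 1]
        self.alpha = alpha     # choice parameter
        self.beta = beta       # learning rate (1.0 = fast learning)
        self.w = []            # one weight vector per category neuron

    def learn(self, x):
        """Present one input in [0, 1]^n; return the index of the resonant category."""
        i = complement_code(x)
        # Rank existing categories by the choice function T_j = |i ^ w_j| / (alpha + |w_j|).
        order = sorted(
            range(len(self.w)),
            key=lambda j: -np.minimum(i, self.w[j]).sum() / (self.alpha + self.w[j].sum()),
        )
        for j in order:
            match = np.minimum(i, self.w[j]).sum() / i.sum()
            if match >= self.rho:  # resonance: input matches a known category
                self.w[j] = self.beta * np.minimum(i, self.w[j]) + (1 - self.beta) * self.w[j]
                return j
        # Mismatch with every existing category: commit a new neuron for the novel input.
        self.w.append(i.copy())
        return len(self.w) - 1
```

Feeding a stream of normalized sensor vectors to `learn` yields a growing set of category neurons: in the abstract's terms, committing a new neuron corresponds to detecting a novel event, while resonance with an existing neuron corresponds to recalling a previously encoded one.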

Original language: English
Article number: 8488558
Pages (from-to): 210-220
Number of pages: 11
Journal: IEEE Transactions on Cognitive and Developmental Systems
Volume: 11
Issue number: 2
DOI
Publication status: Published - Jun 2019
Externally published: Yes

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

