Joint attention by gaze interpolation and saliency

Zeynep Yücel, Albert Ali Salah, Çetin Meriçli, Tekin Meriçli, Roberto Valenti, Theo Gevers

Research output: Contribution to journal › Article › peer-review

52 Citations (Scopus)


Joint attention, the ability to coordinate a common point of reference with a communicating party, emerges as a key factor in various interaction scenarios. This paper presents an image-based method for establishing joint attention between an experimenter and a robot. Precise analysis of the experimenter's eye region requires stable, high-resolution image acquisition, which is not always available. We investigate regression-based interpolation of the gaze direction from the head pose of the experimenter, which is easier to track. Gaussian process regression and neural networks are contrasted for interpolating the gaze direction. We then combine gaze interpolation with image-based saliency to improve the target point estimates, testing three different saliency schemes. We demonstrate the proposed method on a human-robot interaction scenario. Cross-subject evaluations, as well as experiments under adverse conditions (such as dimmed or artificial illumination, or motion blur), show that our method generalizes well and achieves rapid gaze estimation for establishing joint attention.

Original language: English
Pages (from-to): 829-842
Number of pages: 14
Journal: IEEE Transactions on Cybernetics
Issue number: 3
Publication status: Published - Jun 2013
Externally published: Yes


Keywords

  • Developmental robotics
  • Gaze following
  • Head pose estimation
  • Joint visual attention
  • Saliency
  • Selective attention

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering

