Preferred color reproduction based on personal histogram transformation

Kenji Hara, Atsuhiko Maeda, Hirohito Inagaki, Minoru Kobayashi, Masanobu Abe

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)


We propose a facial color beautification method for video communication. The video-captured face often disappoints viewers, and especially the subjects themselves, because the lighting environment is often poor. Inadequate lighting makes the face look depressed or tired. The best solution is to illuminate the face directly with good light, but a perfect environment is not always achievable in everyday situations. Our solution is to retouch the captured video. Our method identifies the skin-color distribution and the eye luminance of a detected face, and converts the color distribution of the entire image by histogram transformation. The method has three strengths: 1) it converts video in real time, 2) it beautifies facial color even when the face is captured under inadequate light, and 3) it creates a variety of face "styles" by controlling the transformation parameters. We conduct a computational performance evaluation, which shows that our method achieves real-time video processing. We also conduct a user study to reveal the diversity of user preferences and the method's effectiveness in terms of impression improvement. The user study indicates that our method is effective even though user preferences vary widely.
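The paper's core idea is remapping an image's color distribution by histogram transformation so a poorly lit face shifts toward a preferred distribution. As a minimal sketch of that idea (not the authors' exact algorithm, whose parameters come from detected skin color and eye luminance), the hypothetical function below linearly remaps one channel's histogram so its mean and standard deviation match a chosen target:

```python
import numpy as np

def beautify_channel(channel, target_mean, target_std):
    """Linearly remap a channel's histogram so its mean and standard
    deviation match a preferred target. This is an illustrative stand-in
    for the paper's histogram transformation, not its exact method."""
    c = channel.astype(np.float64)
    mean, std = c.mean(), c.std()
    if std < 1e-6:                      # guard against flat regions
        std = 1e-6
    out = (c - mean) * (target_std / std) + target_mean
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: brighten a dim, low-contrast synthetic "skin" patch.
rng = np.random.default_rng(0)
dim_skin = rng.normal(loc=90, scale=10, size=(64, 64)).clip(0, 255)
bright = beautify_channel(dim_skin, target_mean=150.0, target_std=20.0)
```

In the paper's setting, the target parameters would be derived per user from the detected face, and varying them yields the different face "styles" mentioned in the abstract.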

Original language: English
Pages (from-to): 855-863
Number of pages: 9
Journal: IEEE Transactions on Consumer Electronics
Issue number: 2
Publication status: Published - Aug 25 2009
Externally published: Yes


Keywords
  • Face beautification
  • Facial impression
  • Histogram transformation
  • Skin color
  • Video processing

ASJC Scopus subject areas

  • Media Technology
  • Electrical and Electronic Engineering


