Deep learning of cancer stem cell morphology using conditional generative adversarial networks

Saori Aida, Junpei Okugawa, Serena Fujisaka, Tomonari Kasai, Hiroyuki Kameda, Tomoyasu Sugiyama

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)


Deep-learning workflows for microscopic image analysis are well suited to handling contextual variation, as they work directly with biological samples across numerous tasks. Well-defined, annotated images are essential to such workflows. Cancer stem cells (CSCs) are identified by specific cell markers and have been extensively characterized in terms of stem cell (SC)-like gene expression and the proliferation mechanisms underlying tumor development; their morphological characterization, in contrast, remains elusive. This study investigates the segmentation of CSCs in phase contrast images using conditional generative adversarial networks (CGANs). An artificial intelligence (AI) model was trained on phase contrast images paired with fluorescence images of Nanog-green fluorescent protein, whose expression is maintained in CSCs. The trained model segmented the CSC regions in phase contrast images of both CSC cultures and a tumor model. Selecting images for training increased several measures of segmentation quality, and overlaying nucleus fluorescence on the phase contrast images was also effective in increasing these measures. We show the possibility of mapping CSC morphology to the undifferentiated state using deep-learning CGAN workflows.
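The abstract refers to "several measures of segmentation quality" without naming them here. As an illustrative sketch only (assuming standard pixel-wise measures such as the Jaccard index/IoU and the Dice coefficient; the function name `segmentation_scores` is hypothetical, not from the paper), such measures can be computed from binary masks as follows:

```python
import numpy as np

def segmentation_scores(pred: np.ndarray, truth: np.ndarray) -> dict:
    """Compute common pixel-wise segmentation-quality measures.

    `pred` and `truth` are boolean arrays of the same shape, where True
    marks pixels predicted (or annotated) as belonging to the target region.
    """
    tp = np.logical_and(pred, truth).sum()    # true-positive pixels
    fp = np.logical_and(pred, ~truth).sum()   # false-positive pixels
    fn = np.logical_and(~pred, truth).sum()   # false-negative pixels
    union = tp + fp + fn
    return {
        # Jaccard index (intersection over union)
        "iou": float(tp / union) if union else 1.0,
        # Dice coefficient (pixel-wise F1 score)
        "dice": float(2 * tp / (2 * tp + fp + fn)) if union else 1.0,
        "precision": float(tp / (tp + fp)) if (tp + fp) else 1.0,
        "recall": float(tp / (tp + fn)) if (tp + fn) else 1.0,
    }

# Toy example: the prediction recovers 2 of the 3 ground-truth pixels.
truth = np.zeros((4, 4), dtype=bool)
truth[1, 1] = truth[2, 1] = truth[1, 2] = True
pred = np.zeros((4, 4), dtype=bool)
pred[1, 1] = pred[2, 1] = True
scores = segmentation_scores(pred, truth)
# scores["dice"] → 0.8, scores["iou"] → 0.666…, scores["precision"] → 1.0
```

Measures of this kind let the effect of training-set selection be compared quantitatively across model variants, as the abstract describes.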

Original language: English
Article number: 931
Pages (from-to): 1-13
Number of pages: 13
Issue number: 6
Publication status: Published - Jun 2020


Keywords

  • Cancer stem cell
  • Conditional generative adversarial network
  • Green fluorescence protein
  • Phase contrast
  • Tumor

ASJC Scopus subject areas

  • Biochemistry
  • Molecular Biology


