3D conditional generative adversarial networks for high-quality PET image estimation at low dose

Yan Wang, Biting Yu, Lei Wang, Chen Zu, David S. Lalush, Weili Lin, Xi Wu, Jiliu Zhou, Dinggang Shen, Luping Zhou

Research output: Contribution to journal › Article

  • 1 Citation

Abstract

Positron emission tomography (PET) is a widely used imaging modality that provides insight into both the biochemical and physiological processes of the human body. Usually, a full-dose radioactive tracer is required to obtain PET images of sufficient quality for clinical use, which inevitably raises concerns about potential health hazards. On the other hand, reducing the dose increases noise in the reconstructed PET images, degrading image quality. In this paper, to reduce radiation exposure while maintaining high PET image quality, we propose a novel method based on 3D conditional generative adversarial networks (3D c-GANs) to estimate high-quality full-dose PET images from low-dose ones. Generative adversarial networks (GANs) comprise a generator network and a discriminator network that are trained simultaneously, each aiming to outperform the other. In the proposed 3D c-GANs, the model is conditioned on an input low-dose PET image and generates a corresponding full-dose PET image. Specifically, to capture the underlying information shared by the low-dose and full-dose PET images, a 3D U-Net-like deep architecture, which combines hierarchical features through skip connections, is designed as the generator network that synthesizes the full-dose image. To ensure that the synthesized PET image stays close to the real one, the generator is trained with an estimation-error loss in addition to the discriminator feedback. Furthermore, a progressive refinement scheme based on concatenated 3D c-GANs is proposed to further improve the quality of the estimated images. Validation was performed on a real human brain dataset including both normal subjects and subjects diagnosed with mild cognitive impairment (MCI). Experimental results show that the proposed 3D c-GANs method outperforms benchmark and state-of-the-art methods in both qualitative and quantitative measures.
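
The abstract describes the core training setup: a 3D U-Net-like generator conditioned on the low-dose volume, a discriminator that judges (low-dose, full-dose) pairs, and a generator objective combining an adversarial term with an estimation-error loss. The minimal Python/PyTorch sketch below illustrates that idea only; the layer counts, channel widths, and the loss weight lam are assumptions made for illustration and are not taken from the paper.

# Minimal sketch (not the authors' code) of a 3D c-GAN for low-dose to
# full-dose PET estimation, as described in the abstract. Depths, channel
# sizes, and the weight lam are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3x3 convolutions with batch norm and ReLU (an assumed building block).
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm3d(out_ch), nn.ReLU(inplace=True),
        nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm3d(out_ch), nn.ReLU(inplace=True),
    )

class UNet3DGenerator(nn.Module):
    """3D U-Net-like generator: encoder, bottleneck, decoder with skip connections."""
    def __init__(self, base=32):
        super().__init__()
        self.enc1 = conv_block(1, base)
        self.enc2 = conv_block(base, base * 2)
        self.bott = conv_block(base * 2, base * 4)
        self.pool = nn.MaxPool3d(2)
        self.up2 = nn.ConvTranspose3d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose3d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.out = nn.Conv3d(base, 1, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                                      # low-level features
        e2 = self.enc2(self.pool(e1))
        b = self.bott(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))    # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))   # skip connection
        return self.out(d1)                                    # estimated full-dose volume

class Discriminator3D(nn.Module):
    """Conditional discriminator: judges (low-dose, full-dose) volume pairs."""
    def __init__(self, base=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(2, base, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv3d(base, base * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv3d(base * 2, 1, 4, stride=1, padding=1),    # patch-wise real/fake scores
        )

    def forward(self, low_dose, full_dose):
        return self.net(torch.cat([low_dose, full_dose], dim=1))

def generator_loss(disc_fake_scores, fake_full, real_full, lam=100.0):
    # Adversarial term (fool the discriminator) plus the estimation-error (L1)
    # term that keeps the synthesized image close to the real full-dose image.
    adv = nn.functional.binary_cross_entropy_with_logits(
        disc_fake_scores, torch.ones_like(disc_fake_scores))
    return adv + lam * nn.functional.l1_loss(fake_full, real_full)

The abstract's progressive refinement scheme chains such c-GANs, feeding one stage's estimated full-dose image into the next; the sketch above covers a single stage only.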

Language: English (US)
Pages: 550-562
Number of pages: 13
Journal: NeuroImage
Volume: 174
DOI: 10.1016/j.neuroimage.2018.03.045
State: Published - Jul 1 2018

Fingerprint

Positron-Emission Tomography
Biochemical Phenomena
Radioactive Tracers
Physiological Phenomena
Benchmarking
Human Body
Noise
Health
Brain

Keywords

  • 3D conditional GANs (3D c-GANs)
  • Generative adversarial networks (GANs)
  • Image estimation
  • Low-dose PET
  • Positron emission tomography (PET)

ASJC Scopus subject areas

  • Neurology
  • Cognitive Neuroscience

Cite this

3D conditional generative adversarial networks for high-quality PET image estimation at low dose. / Wang, Yan; Yu, Biting; Wang, Lei; Zu, Chen; Lalush, David S.; Lin, Weili; Wu, Xi; Zhou, Jiliu; Shen, Dinggang; Zhou, Luping.

In: NeuroImage, Vol. 174, 01.07.2018, p. 550-562.


Wang, Yan; Yu, Biting; Wang, Lei; Zu, Chen; Lalush, David S.; Lin, Weili; Wu, Xi; Zhou, Jiliu; Shen, Dinggang; Zhou, Luping. 3D conditional generative adversarial networks for high-quality PET image estimation at low dose. In: NeuroImage. 2018; Vol. 174. pp. 550-562.
@article{43c7feec42584ad9aa36f37f64cd8d67,
title = "3D conditional generative adversarial networks for high-quality PET image estimation at low dose",
abstract = "Positron emission tomography (PET) is a widely used imaging modality, providing insight into both the biochemical and physiological processes of human body. Usually, a full dose radioactive tracer is required to obtain high-quality PET images for clinical needs. This inevitably raises concerns about potential health hazards. On the other hand, dose reduction may cause the increased noise in the reconstructed PET images, which impacts the image quality to a certain extent. In this paper, in order to reduce the radiation exposure while maintaining the high quality of PET images, we propose a novel method based on 3D conditional generative adversarial networks (3D c-GANs) to estimate the high-quality full-dose PET images from low-dose ones. Generative adversarial networks (GANs) include a generator network and a discriminator network which are trained simultaneously with the goal of one beating the other. Similar to GANs, in the proposed 3D c-GANs, we condition the model on an input low-dose PET image and generate a corresponding output full-dose PET image. Specifically, to render the same underlying information between the low-dose and full-dose PET images, a 3D U-net-like deep architecture which can combine hierarchical features by using skip connection is designed as the generator network to synthesize the full-dose image. In order to guarantee the synthesized PET image to be close to the real one, we take into account of the estimation error loss in addition to the discriminator feedback to train the generator network. Furthermore, a concatenated 3D c-GANs based progressive refinement scheme is also proposed to further improve the quality of estimated images. Validation was done on a real human brain dataset including both the normal subjects and the subjects diagnosed as mild cognitive impairment (MCI). Experimental results show that our proposed 3D c-GANs method outperforms the benchmark methods and achieves much better performance than the state-of-the-art methods in both qualitative and quantitative measures.",
keywords = "3D conditional GANs (3D c-GANs), Generative adversarial networks (GANs), Image estimation, Low-dose PET, Positron emission tomography (PET)",
author = "Yan Wang and Biting Yu and Lei Wang and Chen Zu and Lalush, {David S.} and Weili Lin and Xi Wu and Jiliu Zhou and Dinggang Shen and Luping Zhou",
year = "2018",
month = "7",
day = "1",
doi = "10.1016/j.neuroimage.2018.03.045",
language = "English (US)",
volume = "174",
pages = "550--562",
journal = "NeuroImage",
issn = "1053-8119",
publisher = "Academic Press Inc.",

}
