
Bidirectional Mapping Generative Adversarial Networks for Brain MR to PET Synthesis

Journal Article


Abstract


  • Fusing multi-modality medical images, such as magnetic resonance (MR) imaging and positron emission tomography (PET), can provide complementary anatomical and functional information about the human body. However, PET data are not always available, owing to factors such as high cost and radiation hazard. This paper proposes a 3D end-to-end synthesis network called Bidirectional Mapping Generative Adversarial Networks (BMGAN), in which image contexts and latent vectors are effectively used for brain MR-to-PET synthesis. Specifically, a bidirectional mapping mechanism is designed to embed the semantic information of PET images into a high-dimensional latent space. Moreover, a 3D Dense-UNet generator architecture and hybrid loss functions are constructed to improve the visual quality of the cross-modality synthetic images. Notably, the proposed method can synthesize perceptually realistic PET images while preserving the diverse brain structures of different subjects. Experimental results demonstrate that the proposed method outperforms competitive methods in quantitative measures, qualitative comparisons, and downstream classification metrics.
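The abstract describes a generator trained with a hybrid loss and a bidirectional (PET-to-latent) mapping constraint. The sketch below is a minimal, illustrative composition of such a hybrid generator loss, not the paper's published formulation: the specific terms (least-squares adversarial, voxel-wise L1 reconstruction, L1 latent consistency) and the weights `lambda_rec` and `lambda_lat` are assumptions chosen for illustration.

```python
import numpy as np

def hybrid_generator_loss(d_fake, pet_real, pet_fake, z_prior, z_enc,
                          lambda_rec=10.0, lambda_lat=0.5):
    """Illustrative BMGAN-style hybrid loss (weights are assumed, not the paper's).

    Combines three terms:
    - adversarial: push discriminator scores on synthetic PET toward 1
      (least-squares GAN form)
    - reconstruction: voxel-wise L1 between real and synthetic PET volumes
    - latent: L1 between the sampled prior code and the code re-encoded
      from the PET image, approximating the bidirectional mapping constraint
    """
    adv = np.mean((d_fake - 1.0) ** 2)          # adversarial realism term
    rec = np.mean(np.abs(pet_real - pet_fake))  # image-fidelity term
    lat = np.mean(np.abs(z_prior - z_enc))      # latent-consistency term
    return adv + lambda_rec * rec + lambda_lat * lat
```

In a full training loop, `d_fake` would come from the discriminator applied to the generator's output, and `z_enc` from an encoder network mapping real PET volumes back into the latent space; here they are plain arrays so the arithmetic is easy to check.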

Publication Date


  • 2022

Citation


  • Hu, S., Lei, B., Wang, S., Wang, Y., Feng, Z., & Shen, Y. (2022). Bidirectional Mapping Generative Adversarial Networks for Brain MR to PET Synthesis. IEEE Transactions on Medical Imaging, 41(1), 145-157. doi:10.1109/TMI.2021.3107013

Scopus Eid


  • 2-s2.0-85113861594

Web Of Science Accession Number


Start Page


  • 145

End Page


  • 157

Volume


  • 41

Issue


  • 1
