Aims and objectives
PET-CT has led to significant improvements in the detection and follow-up of cancer.
However, producing high-quality PET tracers is expensive and requires cyclotrons and transportation infrastructure capable of delivery within a few hours.
These factors significantly limit the impact of these scanners, particularly in developing countries. Machine learning, meanwhile, has significantly improved in its ability to learn to recreate images using unsupervised methods trained on vast sets of data.
Methods and materials
We construct a conditional GAN using the architectures described in the Image-to-Image Translation ('pix2pix') work.
The network breaks down into two sub-networks: a generative component and a discriminative component. Since these two sub-networks are trained against each other, they can be trained without any labels or physician input.
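To make this adversarial setup concrete, here is a minimal sketch of a pix2pix-style conditional GAN for CT-to-PET slice translation. The layer sizes, optimizer settings, and loss weight are illustrative assumptions, smaller and simpler than the Image-to-Image Translation architecture, and are not the exact configuration used in this work.

# Minimal sketch of a pix2pix-style conditional GAN for CT -> PET slice
# translation. Layer sizes, optimizers, and loss weights are illustrative
# assumptions, not the configuration used in this study.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Small encoder-decoder; the real pix2pix generator is a deeper U-Net."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
        )
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1),
        )
    def forward(self, ct):
        return self.dec(self.enc(ct))

class Discriminator(nn.Module):
    """PatchGAN-style critic conditioned on the CT slice (2 input channels)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
            nn.Conv2d(128, 1, 4, padding=1),  # per-patch real/fake logits
        )
    def forward(self, ct, pet):
        return self.net(torch.cat([ct, pet], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()

def train_step(ct, pet, lambda_l1=100.0):
    """One adversarial update: D learns real vs. synthesized PET, G tries to fool D."""
    fake_pet = G(ct)
    # Discriminator update on real and synthesized pairs
    d_real = D(ct, pet)
    d_fake = D(ct, fake_pet.detach())
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator update: adversarial term plus L1 fidelity to the measured PET
    d_fake = D(ct, fake_pet)
    loss_g = bce(d_fake, torch.ones_like(d_fake)) + lambda_l1 * l1(fake_pet, pet)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

# Example with random tensors standing in for normalized CT/PET slices.
ct_batch, pet_batch = torch.randn(4, 1, 64, 64), torch.randn(4, 1, 64, 64)
print(train_step(ct_batch, pet_batch))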
We take 210 PET/CT scans from an ensemble of data from the Soft-Tissue Sarcoma study on TCIA (The Cancer Imaging Archive, NIH) and our local database spanning diseases of soft tissue...
Results
The network could be quickly trained to achieve an MSE below 0.2 on the SUV scale from the CT images alone.
Errors were typically highest in regions with tumors or other anomalies, and lowest in features present in every patient, such as the bladder, heart, and brain, all of which have consistent locations and texture.
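For illustration, the error metric quoted above could be computed with a sketch like the following, assuming co-registered synthesized and measured PET volumes already expressed in SUV; the array names and shapes are hypothetical.

# Sketch of a voxel-wise mean squared error on the SUV scale, assuming the
# synthesized and measured PET volumes are already co-registered.
import numpy as np

def mse_suv(predicted_suv: np.ndarray, measured_suv: np.ndarray) -> float:
    """Mean squared error over all voxels, computed on the SUV scale."""
    return float(np.mean((predicted_suv - measured_suv) ** 2))

# Example with random volumes standing in for real PET data.
rng = np.random.default_rng(0)
pred = rng.uniform(0.0, 5.0, size=(64, 128, 128))
true = pred + rng.normal(0.0, 0.3, size=pred.shape)
print(f"MSE (SUV^2): {mse_suv(pred, true):.3f}")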
Conclusion
Radiology, a field with huge amounts of data but very little of it strongly labeled, is in need of different types of machine-learning approaches. GANs show a great deal of promise for using large PACS archives to quickly develop new algorithms. Furthermore, the ability to simulate multi-modal datasets offers possibilities for reducing dose and providing scans where they would otherwise be impossible.
Expanding to techniques like PET-MRI and DWI could offer even more opportunities to...
References
Goodfellow, I. J. et al. (2014) 'Generative Adversarial Networks'. Available at: http://arxiv.org/abs/1406.2661 (Accessed: 12 January 2018).
Ronneberger, O., Fischer, P. and Brox, T. (2015) 'U-Net: Convolutional Networks for Biomedical Image Segmentation'. Available at: http://arxiv.org/abs/1505.04597 (Accessed: 12 January 2018).