The current and upcoming era of radiomics in phaeochromocytoma and paraganglioma
Language: English; Country: Netherlands; Media: print-electronic
Document type: Journal Article, Review, Research Support, Non-U.S. Gov't
PubMed: 39227277
DOI: 10.1016/j.beem.2024.101923
PII: S1521-690X(24)00077-0
- Keywords
- deep learning, diagnostic imaging, machine learning, paraganglioma, pheochromocytoma, radiomics, texture analysis
- MeSH
- Diagnosis, Differential MeSH
- Pheochromocytoma * diagnostic imaging MeSH
- Humans MeSH
- Adrenal Gland Neoplasms * diagnostic imaging MeSH
- Paraganglioma * diagnostic imaging MeSH
- Tomography, X-Ray Computed * methods trends MeSH
- Image Processing, Computer-Assisted methods MeSH
- Radiomics MeSH
- Machine Learning MeSH
- Artificial Intelligence MeSH
- Check Tag
- Humans MeSH
- Publication type
- Journal Article MeSH
- Research Support, Non-U.S. Gov't MeSH
- Review MeSH
The diagnosis of phaeochromocytoma remains a highly relevant topic because of advances in laboratory diagnostics, genetics, and therapeutic options, as well as the continuing development of imaging methods. Computed tomography remains an essential tool in clinical practice, particularly for incidentally discovered adrenal masses: it allows morphological evaluation, including size, shape, necrosis, and unenhanced attenuation. More advanced post-processing tools for analysing digital images, such as texture analysis and radiomics, are currently under study. Radiomic features are calculated from digital image pixels and capture parameters and relations undetectable by the human eye; on the other hand, the volume of radiomic data demands substantial computing capacity. Radiomics, together with machine learning and artificial intelligence more broadly, has the potential to improve not only the differential diagnosis but also the prediction of complications and therapy outcomes of phaeochromocytomas. At present, however, the performance of radiomics and machine learning falls short of expectations, and their potential awaits fulfilment.
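To illustrate the kind of quantities the abstract refers to, the sketch below computes two classes of features commonly used in radiomics pipelines: first-order statistics (mean, variance, histogram entropy) and a grey-level co-occurrence texture measure (contrast). This is a minimal, self-contained illustration of the general technique, not the method of any specific study; the tiny hard-coded image, function names, and the single-offset co-occurrence computation are assumptions for demonstration only, and real pipelines operate on segmented CT volumes with standardised feature definitions.

```python
import math

def first_order_features(pixels):
    """Simple first-order radiomic features from a flat list of grey levels."""
    n = len(pixels)
    mean = sum(pixels) / n
    variance = sum((p - mean) ** 2 for p in pixels) / n
    # Histogram-based entropy (base 2) over the discrete grey levels.
    counts = {}
    for p in pixels:
        counts[p] = counts.get(p, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return {"mean": mean, "variance": variance, "entropy": entropy}

def glcm_contrast(image, dx=1, dy=0):
    """Grey-level co-occurrence contrast for one pixel offset (dx, dy)."""
    pairs = 0
    total = 0.0
    for y in range(len(image)):
        for x in range(len(image[0])):
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(image) and 0 <= nx < len(image[0]):
                total += (image[y][x] - image[ny][nx]) ** 2
                pairs += 1
    return total / pairs

# Toy 4x4 "image" of quantised grey levels (illustrative values only).
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 3, 3],
]
flat = [p for row in image for p in row]
print(first_order_features(flat))
print(glcm_contrast(image))
```

Two regions with identical means and variances can still differ in co-occurrence measures such as contrast, which is one way such features capture relations "undetectable by the human eye".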