Cell segmentation from telecentric bright-field transmitted light microscopy images using a Residual Attention U-Net: A case study on HeLa line
Language: English Country: United States Medium: print-electronic
Document type: Journal Article, Research Support (grant-funded)
PubMed: 35809410
DOI: 10.1016/j.compbiomed.2022.105805
PII: S0010-4825(22)00560-1
- Keywords
- Cell detection, Deep learning, Microscopy image segmentation, Neural network, Semantic segmentation, Tissue segmentation, Watershed segmentation
- MeSH
- Benchmarking MeSH
- Microscopy * MeSH
- Neural Networks, Computer MeSH
- Image Processing, Computer-Assisted * methods MeSH
- Semantics MeSH
- Publication type
- Journal Article MeSH
- Research Support (grant-funded) MeSH
Living cell segmentation from bright-field light microscopy images is challenging due to the image complexity and the temporal changes in living cells. Recently developed deep learning (DL)-based methods have become popular in medical and microscopy image segmentation tasks due to their success and promising outcomes. The main objective of this paper is to develop a deep learning, U-Net-based method to segment the living cells of the HeLa line in bright-field transmitted light microscopy. To find the most suitable architecture for our datasets, a residual attention U-Net was proposed and compared with an attention U-Net and a simple U-Net architecture. The attention mechanism highlights salient features and suppresses activations in irrelevant image regions; the residual mechanism overcomes the vanishing-gradient problem. The Mean-IoU score for our datasets reaches 0.9505, 0.9524, and 0.9530 for the simple, attention, and residual attention U-Net, respectively. The most accurate semantic segmentation results, in both the Mean-IoU and Dice metrics, were achieved by applying the residual and attention mechanisms together. The watershed method applied to this best (residual attention) semantic segmentation result then produced a segmentation with specific information for each individual cell.
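The two mechanisms named in the abstract can be illustrated with a minimal sketch, assuming PyTorch as the framework (the authors' actual implementation and layer choices are not given in this record): a residual convolutional block for the encoder/decoder stages and an additive attention gate on a U-Net skip connection.

```python
# Hypothetical building blocks, not the authors' code.
import torch
import torch.nn as nn

class ResidualConvBlock(nn.Module):
    """Two 3x3 convolutions with a shortcut, easing the vanishing-gradient problem."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        self.shortcut = nn.Conv2d(in_ch, out_ch, 1)  # match channel count for the skip
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + self.shortcut(x))

class AttentionGate(nn.Module):
    """Additive attention: weights encoder (skip) features by a decoder gating
    signal, suppressing activations in irrelevant image regions."""
    def __init__(self, skip_ch, gate_ch, inter_ch):
        super().__init__()
        self.w_skip = nn.Conv2d(skip_ch, inter_ch, 1)
        self.w_gate = nn.Conv2d(gate_ch, inter_ch, 1)
        self.psi = nn.Sequential(nn.Conv2d(inter_ch, 1, 1), nn.Sigmoid())

    def forward(self, skip, gate):
        # gate is assumed to be upsampled to the skip's spatial size beforehand
        alpha = self.psi(torch.relu(self.w_skip(skip) + self.w_gate(gate)))
        return skip * alpha  # attention coefficients in [0, 1]
```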
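The reported evaluation metrics, Mean-IoU and Dice, can be computed on binary cell/background masks as sketched below with NumPy; the exact averaging scheme used in the paper is an assumption.

```python
# Hypothetical metric helpers for binary masks.
import numpy as np

def iou(pred, target, eps=1e-7):
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (inter + eps) / (union + eps)

def dice(pred, target, eps=1e-7):
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)

def mean_iou(preds, targets):
    """Average IoU over a set of predicted/ground-truth mask pairs."""
    return float(np.mean([iou(p, t) for p, t in zip(preds, targets)]))
```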
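The final watershed step, which turns the network's semantic mask into per-cell labels, could look roughly like the following sketch using SciPy and scikit-image (assumed dependencies; the seed-selection parameters are illustrative only).

```python
# Hypothetical post-processing: binary semantic mask -> labelled individual cells.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def semantic_to_instances(mask, min_distance=20):
    """Label each cell in a binary mask via distance-transform watershed."""
    mask = mask.astype(bool)
    distance = ndi.distance_transform_edt(mask)
    # Seeds: local maxima of the distance map, roughly one per cell
    coords = peak_local_max(distance, min_distance=min_distance, labels=mask)
    markers = np.zeros(mask.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    # Flood the inverted distance map from the seeds, restricted to the mask
    return watershed(-distance, markers, mask=mask)
```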
Citation data provided by Crossref.org