UnMICST: Deep learning with real augmentation for robust segmentation of highly multiplexed images of human tissues.
Abstract
Upcoming technologies enable routine collection of highly multiplexed (20-60 channel), subcellular resolution images of mammalian tissues for research and diagnosis. Extracting single-cell data from such images requires accurate image segmentation, a challenging problem commonly tackled with deep learning. In this paper, we report two findings that substantially improve image segmentation of tissues using a range of machine learning architectures. First, we unexpectedly find that the inclusion of intentionally defocused and saturated images in training data substantially improves subsequent image segmentation. Such real augmentation outperforms computational augmentation (Gaussian blurring). In addition, we find that it is practical to image the nuclear envelope in multiple tissues using an antibody cocktail, thereby better identifying nuclear outlines and improving segmentation. The two approaches cumulatively and substantially improve segmentation on a wide range of tissue types. We speculate that the use of real augmentations will have applications in image processing outside of microscopy.
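The computational augmentation that the abstract compares against can be illustrated with a short sketch. This is not the authors' UnMICST pipeline; it is a minimal, assumed example of how defocus and saturation are commonly approximated in software: Gaussian blurring for defocus and intensity clipping for sensor saturation, applied to a single-channel image before training.

```python
# Illustrative sketch (not the UnMICST code): computational surrogates for
# the optical artifacts the paper captures with "real" augmentation.
import numpy as np
from scipy.ndimage import gaussian_filter


def augment(image, sigma=2.0, saturation_fraction=0.05):
    """Return blurred and saturated variants of a 2D intensity image.

    sigma               - standard deviation of the Gaussian blur (defocus surrogate)
    saturation_fraction - top fraction of intensities clipped (saturation surrogate)
    """
    # Defocus surrogate: Gaussian blur, the computational augmentation
    # the abstract says real defocused images outperform.
    blurred = gaussian_filter(image.astype(float), sigma=sigma)
    # Saturation surrogate: clip the brightest pixels to a quantile cutoff.
    cutoff = np.quantile(image, 1.0 - saturation_fraction)
    saturated = np.minimum(image, cutoff)
    return blurred, saturated


img = np.random.default_rng(1).random((64, 64))
blurred, saturated = augment(img)
```

The paper's point is that acquiring genuinely defocused and saturated images at the microscope, rather than synthesizing them as above, yields better downstream segmentation.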
| Reference Key | yapp2022unmicstcommunications |
|---|---|
| Authors | Yapp, Clarence; Novikov, Edward; Jang, Won-Dong; Vallius, Tuulia; Chen, Yu-An; Cicconet, Marcelo; Maliga, Zoltan; Jacobson, Connor A; Wei, Donglai; Santagata, Sandro; Pfister, Hanspeter; Sorger, Peter K |
| Journal | Communications Biology |
| Year | 2022 |
| DOI | 1263 |
| URL | |
| Keywords | |