Cellbow: a robust customizable cell segmentation program

Huixia Ren, Mengdi Zhao, Bo Liu, Ruixiao Yao, Qi Liu, Zhipeng Ren, Zirui Wu, Zongmao Gao, Xiaojing Yang, Chao Tang

Quant. Biol., 2020, Vol. 8, Issue 3: 245-255. DOI: 10.1007/s40484-020-0213-6
METHOD

Abstract

Background: Time-lapse live cell imaging of a growing cell population is routine in many biological investigations. A major challenge in image analysis is accurate segmentation, the process of defining cell boundaries from raw image data. Current segmentation methods that rely on a single boundary feature lack robustness when dealing with inhomogeneous foci, which invariably occur in cell population imaging.

Methods: We developed Cellbow, a neural-network-based segmentation algorithm combined with a multi-layer training set strategy.
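
As an illustration of the general approach, the sketch below shows a minimal fully convolutional segmentation network in TensorFlow/Keras that maps an input image to a per-pixel cell-probability map. The architecture and layer sizes here are assumptions for illustration only, not Cellbow's actual network or its multi-layer training set strategy.

```python
# Minimal sketch of a fully convolutional segmentation network (illustrative only).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_fcn(input_shape=(256, 256, 1)):
    inputs = layers.Input(shape=input_shape)
    # Downsampling path: extract increasingly coarse features.
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(128, 3, padding="same", activation="relu")(x)
    # Upsampling path: recover full resolution for per-pixel prediction.
    x = layers.UpSampling2D(2)(x)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.UpSampling2D(2)(x)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    # One output channel: probability that each pixel belongs to a cell.
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model
```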

Results: Cellbow can achieve accurate and robust segmentation of cells in broad and general settings, and it facilitates long-term tracking of cell growth and division. To help users adopt Cellbow, we provide a website on which the software can be tested online, as well as an ImageJ plugin for visualizing its performance before installation.
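
For context, long-term tracking can be bootstrapped by linking segmentation masks between consecutive frames. The sketch below shows one simple overlap-based linking scheme, assumed here for illustration and not necessarily the method Cellbow uses; `labels_prev` and `labels_next` are hypothetical labelled mask arrays (0 = background, 1..N = cell IDs).

```python
# Minimal sketch of overlap-based frame-to-frame linking (illustrative only).
import numpy as np

def link_frames(labels_prev, labels_next, min_overlap=0.5):
    """Map each cell in the next frame to its best-overlapping predecessor."""
    links = {}
    for cell_id in np.unique(labels_next):
        if cell_id == 0:
            continue
        mask = labels_next == cell_id
        overlapping = labels_prev[mask]
        overlapping = overlapping[overlapping > 0]
        if overlapping.size == 0:
            links[int(cell_id)] = None  # no predecessor, e.g., a newborn cell
            continue
        # Choose the previous cell sharing the most pixels with this one.
        prev_ids, counts = np.unique(overlapping, return_counts=True)
        best = prev_ids[np.argmax(counts)]
        frac = counts.max() / mask.sum()
        links[int(cell_id)] = int(best) if frac >= min_overlap else None
    return links
```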

Conclusion: Cellbow is customizable and generalizable. It is broadly applicable to segmenting fluorescent images of diverse cell types with no further training needed. For bright-field images, only a small set of sample images of the specific cell type from the user may be needed for training.
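
For bright-field data, the retraining step can be as simple as fitting a pretrained model on the user's annotated samples. The sketch below assumes a Keras model and hypothetical `bf_images`/`bf_masks` arrays; it is an illustration, not Cellbow's actual training code.

```python
# Minimal fine-tuning sketch on a small user-annotated bright-field set (illustrative only).
import tensorflow as tf

def fine_tune(pretrained_model, bf_images, bf_masks, epochs=20):
    # bf_images, bf_masks: float32 arrays of shape (n, H, W, 1), values in [0, 1].
    pretrained_model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),  # small learning rate for fine-tuning
        loss="binary_crossentropy",
    )
    pretrained_model.fit(bf_images, bf_masks, batch_size=4, epochs=epochs)
    return pretrained_model
```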

Author summary

Using a microscope to study how cells grow and divide is a common task in a biological lab. However, taking pictures of the cells is only half the work. A challenging and often time-consuming task is to recognize, label and track each individual cell in the raw images. These images usually vary greatly in features and quality depending on the focal field, experimental conditions, cell types, labs, etc. Current methods are of limited use for these generic problems. Here we employed a machine learning method to develop robust software that is automated, flexible and customizable for this task.

Keywords

deep neural network / cell segmentation / fluorescent cell imaging / bright-field cell imaging / lineage tracking

Cite this article

Huixia Ren, Mengdi Zhao, Bo Liu, Ruixiao Yao, Qi Liu, Zhipeng Ren, Zirui Wu, Zongmao Gao, Xiaojing Yang, Chao Tang. Cellbow: a robust customizable cell segmentation program. Quant. Biol., 2020, 8(3): 245-255. https://doi.org/10.1007/s40484-020-0213-6

ACKNOWLEDGEMENTS

This work was supported by the Ministry of Science and Technology of China (2015CB910300), the National Key Research and Development Program of China (2018YFA0900700), and the National Natural Science Foundation of China (NSFC31700733). Part of the analysis was performed on the High Performance Computing Platform of the Center for Life Science.

COMPLIANCE WITH ETHICS GUIDELINES

The authors Huixia Ren, Mengdi Zhao, Bo Liu, Ruixiao Yao, Qi Liu, Zhipeng Ren, Zirui Wu, Zongmao Gao, Xiaojing Yang and Chao Tang declare that they have no conflict of interest.
This article does not contain any studies with human or animal subjects performed by any of the authors.

RIGHTS & PERMISSIONS

© 2020 Higher Education Press and Springer-Verlag GmbH Germany, part of Springer Nature