Antecedents
Understanding the mechanisms of cell motility and their regulation is a major challenge in biomedical research [1]. The ability of cells to exert forces on their environment and alter their shape as they move [2] is essential to various biological processes such as the immune response [3], embryonic development [4], and tumorigenesis [5]. The study of cell motility has traditionally been carried out in two dimensions using phase-enhancing microscopy techniques such as Phase Contrast or Differential Interference Contrast microscopy. Recent technological advances in three-dimensional (confocal, multiphoton, SPIM) fluorescence microscopy have given researchers the opportunity to examine these processes in three dimensions, within a living organism or in culture [6]. This requires accurately detecting the cells (Segmentation) and following their changes (Tracking) as they move through their environment. This includes not only traction-related changes but also the mitotic and apoptotic events that affect the cell lineages.
Manually segmenting and tracking cells is an extremely laborious task, due to the large amount of image data acquired during live-cell studies. Thus, the analysis of time-lapse experiments increasingly relies on automated image processing techniques. Most standard segmentation and tracking techniques do not perform well under the low-quality conditions (high cell density, inhomogeneous staining, lineage changes) typical of time-lapse video sequences [7], compounded by the nonlinear, depth-dependent response of the optical system in thick samples [8]. This has encouraged the development of a significant number of algorithms designed to overcome these problems.
Existing cell segmentation algorithms range from simple thresholding methods [9,10], hysteresis thresholding [11], edge detection [12], or shape matching [13,14], to more sophisticated approaches based on region growing [15-17], machine learning [18-21], or energy minimization [22-29]. For a more comprehensive analysis of the existing cell segmentation methods and their antecedents, see two reviews [30,31].
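As a purely illustrative sketch of the simplest end of this spectrum (not the method of any cited work), the following Python snippet segments a single 2D fluorescence frame by global Otsu thresholding followed by connected-component labeling; the smoothing sigma and minimum object size are arbitrary assumptions chosen for the example.

```python
# Purely illustrative sketch: global (Otsu) thresholding followed by
# connected-component labeling, the simplest family of methods above.
# The smoothing sigma and minimum object size are arbitrary assumptions.
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, measure, morphology

def segment_frame(frame: np.ndarray, sigma: float = 2.0, min_size: int = 50) -> np.ndarray:
    """Return a label image (0 = background, 1..N = detected objects) for one 2D frame."""
    smoothed = ndi.gaussian_filter(frame.astype(float), sigma=sigma)  # suppress noise
    mask = smoothed > filters.threshold_otsu(smoothed)                # global threshold
    mask = morphology.remove_small_objects(mask, min_size=min_size)   # drop spurious blobs
    return measure.label(mask)                                        # connected components
```

Such a basic pipeline quickly breaks down under the high cell density and inhomogeneous staining discussed above, which is precisely what the more sophisticated approaches [15-29] are designed to handle.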
The tracking methods can broadly be classified as tracking by detection [32-36] or tracking by model evolution [24-28,37-40]. The key idea of the former is to first detect all cells in the entire time-lapse video sequence and then associate the detected cells between successive frames, typically by optimizing a probabilistic objective function. The latter involves finding the cells in the first frame and updating their position, shape, and orientation throughout the entire time-lapse series by taking the result from the previous frame into consideration. Each cell to be tracked is represented by a model that is evolved in time to fit the particular cell in the subsequent frames. Active contours, based on either an explicit [41-43] or an implicit formulation [44-47], have been the first choice for the tracking-by-model-evolution approach for many years.
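As a concrete, hedged illustration of the tracking-by-detection idea (not the approach of any specific cited method), the sketch below links detected cell centroids between two consecutive frames by solving a minimum-cost one-to-one assignment on pairwise distances; the gating distance max_dist is an arbitrary assumption, and published methods [32-36] optimize much richer probabilistic objectives, often over whole sequences and including division and disappearance events.

```python
# Illustrative sketch of the frame-to-frame association step in tracking by
# detection. The gating distance is an arbitrary assumption; real methods
# optimize richer probabilistic objectives, often over whole sequences.
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_detections(prev_centroids: np.ndarray, curr_centroids: np.ndarray,
                    max_dist: float = 20.0) -> list[tuple[int, int]]:
    """Return (index in previous frame, index in current frame) pairs of linked cells."""
    # Pairwise Euclidean distances between the detections of the two frames.
    cost = np.linalg.norm(prev_centroids[:, None, :] - curr_centroids[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)  # globally optimal one-to-one matching
    # Reject links longer than the gating distance (appearing, disappearing, or dividing cells).
    return [(int(r), int(c)) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
```

Detections left unmatched by such a step are then interpreted as track births, deaths, or division candidates, which is where most of the complexity of practical tracking-by-detection pipelines lies.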
Nowadays, the prevailing approach to both segmentation and tracking is to apply machine learning techniques [48,49].
Cell Tracking Challenge History
In 2012, we launched the Cell Tracking Challenge with the aim of fostering the development of novel, robust cell tracking algorithms and of helping developers evaluate their new methods. In the first Cell Tracking Challenge, organized under the auspices of ISBI 2013 in San Francisco (CA, USA), we called for the submission of algorithms developed under either of the two tracking frameworks described above. Over sixty groups registered for the challenge, of which six finally submitted consistent results that were evaluated and compared in terms of accuracy (both shape segmentation and lineage tracking) and execution time, as they analyzed sequences of fluorescently labeled cells and nuclei moving in 2D and 3D environments, both real and computer generated. Given the varying nature of the data (2D or 3D, containing either nuclei or whole cells), the evaluation was done separately for each data type. Therefore, participants were allowed to submit more than one algorithm addressing the specific problems of the different datasets. A report covering the logistics, methods, and results of the challenge was published in Bioinformatics [50].
Given the success of the first edition of the challenge, measured by the number of registered participants and of attendees at the workshop held during the ISBI 2013 conference, a second Cell Tracking Challenge was organized under the auspices of ISBI 2014. To broaden the scope of the challenge and increase the interest of potential participants, new datasets were added to the ones used in the first edition, namely 3D developmental fluorescence microscopy data along with 2D phase contrast and differential interference contrast (DIC) microscopy data. We also extended the simulated data with new sequences produced by a more sophisticated version of our existing cell simulator [51]. Over sixty participants registered for the challenge, and eight of them submitted consistent results, which were evaluated, presented at the ISBI 2014 conference in Beijing (China), and posted on the challenge website. A thorough evaluation of the tracking performance measure used in this and subsequent editions of the challenge was published in PLoS One [52].
In the third edition, held under the auspices of the ISBI 2015 conference in Brooklyn (NY, USA), the challenge was consolidated by an increased number of participants and submissions, especially for the most challenging datasets. In addition, given the growing relevance of high-throughput, large-scale embryonic developmental data, we added a new dataset consisting of Drosophila melanogaster embryonic data imaged using light-sheet microscopy, to foster the development of automated tools for these extremely challenging datasets [53,54].
A comprehensive description of the first three editions of the challenge, along with the outstanding results obtained at that time, was published in Nature Methods in 2017 [55].
In October 2018, we launched the fourth fixed-deadline edition of the challenge with a new segmentation-centric benchmark, evaluating detection and segmentation results submitted for the existing and new challenge datasets. This new benchmark had a fixed initial deadline and was held under the auspices of ISBI 2019. The new challenge datasets consisted of two new types of image data, namely 3D cartographic projections of Tribolium castaneum embryonic data imaged using light-sheet microscopy, and single cells (both real and simulated) with filopodial protrusions that are difficult to segment. Thirteen participants submitted consistent results that were evaluated, presented at the ISBI 2019 conference in Venice (Italy), and posted on the challenge website.
The fifth edition of the challenge was organized as part of ISBI 2020, broadening its scope with two brightfield microscopy datasets and one fully 3D+time dataset of a developing Tribolium castaneum embryo. Furthermore, silver reference segmentation annotations were released for the training videos of nine existing datasets to facilitate the tuning of competing methods. The submissions were evaluated, announced at the virtual ISBI 2020 challenge workshop (see PowerPoint presentation [178 MB] and MP4 video [583 MB]), and posted on the challenge website.
Running Challenge Benchmarks
Since February 21st, 2017, the Cell Tracking Benchmark has been open for online submissions, which are evaluated on a monthly basis. Since April 1st, 2019, the Cell Segmentation Benchmark has been open for online submissions, which are evaluated with the same frequency. The leaderboards containing the top-three performing methods in both the Cell Tracking Benchmark and the Cell Segmentation Benchmark are continually updated. Since January 8th, 2020, silver reference segmentation annotations have been available for nine datasets to further facilitate the tuning of competing algorithms, providing denser sets of cell segmentation masks than the previously existing gold reference segmentation annotations.
Challenge Related Studies
The gathered pool of annotated datasets, along with the collected segmentation and tracking methods, constitutes a unique resource for further studies and analyses. In addition to our own research, we are open to collaborative projects led by other groups, with the involvement of selected Cell Tracking Challenge organizers who can perform evaluations using the non-public gold reference annotations of the challenge datasets. If you are interested, please discuss your intended project with one of the Steering Committee members, who will then discuss the feasibility of your proposal and the available human and/or technical resources on our side with the rest of the Steering Committee.
Relevant References (publications by the organizers are highlighted)
[1] C. Zimmer, B. Zhang, A. Dufour, A. Thebaud, S. Berlemont, V. Meas-Yedid, and J.-C. Olivo-Marin, “On the digital trail of mobile cells,” IEEE Signal Processing Magazine, vol. 23, no. 3, pp. 54–62, 2006.
[2] R. Ananthakrishnan and A. Ehrlicher, “The forces behind cell movement,” International Journal of Biological Sciences, vol. 3, no. 5, pp. 303–317, 2007.
[3] R. Evans, I. Patzak, L. Svensson, K. D. Filippo, K. Jones, A. McDowal, and N. Hogg, “Integrins in immunity,” Journal of Cell Science, vol. 122, no. 2, pp. 215–225, 2009.
[4] D. J. Montell, “Morphogenetic cell movements: Diversity from modular mechanical properties,” Science, vol. 322, no. 5907, pp. 1502–1505, 2008.
[5] J. Condeelis and J. W. Pollard, “Macrophages: Obligate partners for tumor cell migration, invasion, and metastasis,” Cell, vol. 124, no. 2, pp. 263–266, 2006.
[6] R. Fernandez-Gonzalez, A. Munoz-Barrutia, M. H. Barcellos-Hoff, and C. Ortiz-de-Solorzano, “Quantitative in vivo microscopy: the return from the ‘omics’,” Current Opinion in Biotechnology, vol. 17, no. 5, pp. 501–510, 2006.
[7] C. Vonesch, F. Aguet, J.-L. Vonesch, and M. Unser, “The colored revolution of bioimaging,” IEEE Signal Processing Magazine, vol. 23, no. 3, pp. 20–31, 2006.
[8] P. Sarder and A. Nehorai, “Deconvolution methods for 3-D fluorescence microscopy images,” IEEE Signal Processing Magazine, vol. 23, no. 3, pp. 32–45, 2006.
[9] B. Lerner, W.F. Clocksin, S. Dhanjal, S. Hultén, C.M. Bishop, “Automatic signal classification in fluorescence in situ hybridization images,” Cytometry 43, 87-93, 2001.
[10] X. Chen, X. Zhou, S.T.C Wong, “Automated segmentation, classification, and tracking of cancer cell nuclei in time-lapse microscopy,” IEEE Trans. Biomed. Eng. 53, 762-766, 2006.
[11] K.M. Henry, L. Pase, C.F. Ramos-Lopez, G.J. Lieschke, S.A. Renshaw, C.C. Reyes-Aldasoro, “PhagoSight: an open-source MATLAB package for the analysis of fluorescent neutrophil and macrophage migration in a zebrafish model,” PloS ONE 8, e72636, 2013.
[12] C. Wählby, I.M. Sintorn, F. Erlandsson, G. Borgefors, E. Bengtsson, “Combining intensity, edge and shape information for 2D and 3D segmentation of cell nuclei in tissue sections,” J. Microsc-Oxford 215, 67-76, 2004.
[13] M. Cicconet, D. Geiger, K. Gunsalus, “Wavelet-based circular hough-transform and its application in embryo development analysis,” VISAPP 2013, Proceedings of the International Conference on Computer Vision Theory and Applications, 669-674, 2013.
[14] E. Türetken, X. Wang, C.J. Becker, C. Haubold, P. Fua, “Network flow integer programming to track elliptical cells in time-lapse sequences,” IEEE Trans. Med. Imag. 36, 942-951, 2016.
[15] N. Malpica, C. Ortiz-de-Solorzano, J.J. Vaquero, A. Santos, I. Vallcorba, J.M. Garcia-Sagredo, F. Pozo, “Applying watershed algorithms to the segmentation of clustered nuclei,” Cytometry Part A. 28, 289-297, 1997.
[16] C. Ortiz-de-Solorzano, E. García-Rodríguez, A. Jones, D. Pinkel, J.W. Gray, D. Sudar, S.J. Lockett, “Segmentation of confocal microscopy images of cell nuclei in thick tissue sections,” J. Microsc-Oxford, 193, 212-226, 1999.
[17] A. Cliffe, D.P. Doupé, H. Sung, I.K.H. Lim, K.H. Ong, L. Cheng, W. Yu, “Quantitative 3D analysis of complex single border cell behaviors in coordinated collective cell migration,” Nat. Commun. 8:14905, 2017.
[18] O. Ronneberger, P. Fischer, T. Brox, “U-Net: Convolutional networks for biomedical image segmentation,” In Proc. MICCAI 2015, LNCS 9351, 234-241, 2015.
[19] M. Schiegg, P. Hanslovsky, C. Haubold, U. Koethe, L. Hufnagel, F.A. Hamprecht, “Graphical model for joint segmentation and tracking of multiple dividing cells,” Bioinformatics 31, 948-56, 2015.
[20] C. Castilla, M. Maška, D.V. Sorokin, E. Meijering, C. Ortiz-de-Solorzano, “Three-Dimensional Quantification of Filopodia in Motile Cancer Cells,” IEEE Transactions on Medical Imaging, Early Access, 2018.
[21] C. Payer, D. Stern, T. Neff, H. Bischof, M. Urschler, “Instance Segmentation and Tracking with Cosine Embeddings and Recurrent Hourglass Networks,” In Proc. MICCAI 2018, Lecture Notes in Computer Science, vol. 11071, Springer, Cham, 2018.
[22] C. Ortiz-de-Solorzano, R. Malladi, S.A. Lelièvre, and S.J. Lockett. Segmentation of nuclei and cells using membrane related protein markers. Journal of Microscopy-Oxford.; 201(3):404-15, 2001.
[23] A. Sarti, C. Ortiz-de-Solórzano, S.J. Lockett, and R. Malladi. A geometric model for 3-D confocal image analysis. IEEE Transactions on Biomedical Engineering; 47(12):1600-9, 2000.
[24] C. Zimmer, E. Labruyere, V. Meas-Yedid, N. Guillen, J.C Olivo-Marin, “Segmentation and tracking of migrating cells in videomicroscopy with parametric active contours: a tool for cell-based drug testing,” IEEE Trans. Med. Imag. 21, 1212-1221, 2002.
[25] A. Dufour, R. Thibeaux, E. Labruyere, N. Guillen, J-C Olivo-Marin, “3D active meshes: fast discrete deformable models for cell tracking in 3D time-lapse microscopy,” IEEE Trans. Image Process. 20, 1925–37, 2011.
[26] M. Maška, O. Daněk, S. Garasa, A. Rouzaut, A. Muñoz-Barrutia, C. Ortiz-de-Solorzano, “Segmentation and shape tracking of whole fluorescent cells based on the Chan-Vese model,” IEEE Transactions on Medical Imaging 32(6):995-1006, 2013.
[27] O. Dzyubachyk, W. A. van Cappellen, J. Essers, W. J. Niessen, and E. Meijering, “Advanced level-set-based cell tracking in time-lapse fluorescence microscopy,” IEEE Transactions on Medical Imaging, vol. 29, no. 3, pp. 852–867, 2010.
[28] A. Dufour, V. Shinin, S. Tajbakhsh, N. Guillen-Aghion, J.C. Olivo-Marin, C. Zimmer, “Segmenting and tracking fluorescent cells in dynamic 3D microscopy with coupled active surfaces,” IEEE Trans. Image Process. 14, 1396–1410, 2005.
[29] R. Bensch and O. Ronneberger, “Cell segmentation and tracking in phase contrast images using graph cut with asymmetric boundary costs,” In Proc. 2015 IEEE Int. Symp. Biomed. Imaging (ISBI), 1120-1123, 2015.
[30] E. Meijering. “Cell Segmentation: 50 Years Down the Road,” IEEE Signal Processing Magazine 29(5): 140-145, 2012.
[31] C. Ortiz-de-Solorzano, A. Muñoz-Barrutia, E. Meijering, M. Kozubek, “Towards a Morphodynamic Model of the Cell: Signal processing for cell modeling,” IEEE Signal Processing Magazine 32(1): 20-29, 2015.
[32] O. Al-Kofahi, R. J. Radke, S. K. Goderie, Q. Shen, S. Temple, and B. Roysam, “Automated cell lineage construction: A rapid method to analyze clonal development established with murine neural progenitor cells,” Cell Cycle, vol. 5, no. 3, pp. 327–335, 2006.
[33] N. Harder, F. Mora-Bermudez, W. J. Godinez, J. Ellenberg, R. Eils, and K. Rohr, “Automated analysis of the mitotic phases of human cells in 3D fluorescence microscopy,” in Medical Image Computing and Computer-Assisted Intervention, 2006, pp. 840–848.
[34] F. Li, X. Zhou, J. Ma, and S. T. C. Wong, “Multiple nuclei tracking using integer programming for quantitative cancer cell cycle analysis,” IEEE Transactions on Medical Imaging, vol. 29, no. 1, pp. 96–105, 2010.
[35] D. Padfield, J. Rittscher, and B. Roysam, “Coupled minimum-cost flow cell tracking for high-throughput quantitative analysis,” Medical Image Analysis, vol. 15, no. 1, pp. 650–668, 2011.
[36] K.E.G. Magnusson and J. Jaldén, “A batch algorithm using iterative application of the Viterbi algorithm to track cells and construct cell lineages,” in Proceedings of the 9th IEEE International Symposium on Biomedical Imaging, 2012, pp. 382–385.
[37] O. Debeir, P. Van Ham, R. Kiss, and C. Decaestecker, “Tracking of migrating cells under phase-contrast video microscopy with combined mean-shift processes,” IEEE Transactions on Medical Imaging, vol. 24, no. 6, pp. 697–711, 2005.
[38] N. Ray, S. T. Acton, and K. Ley, “Tracking leukocytes in vivo with shape and size constrained active contours,” IEEE Transactions on Medical Imaging, vol. 21, no. 10, pp. 1222–1235, 2002.
[39] D. Mukherjee, N. Ray, and S. Acton, “Level set analysis for leukocyte detection and tracking,” IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 562–572, 2004.
[40] D. Padfield, J. Rittscher, N. Thomas, and B. Roysam, “Spatio-temporal cell cycle phase analysis using level sets and fast marching methods,” Medical Image Analysis, vol. 13, no. 1, pp. 143–155, 2009.
[41] M. Kass, A. Witkin, and D. Terzopoulos, “Snakes: Active contour models,” International Journal of Computer Vision, vol. 1, no. 4, pp. 321–331, 1987.
[42] M. Jacob, T. Blu, and M. Unser, “Efficient energies and algorithms for parametric snakes,” IEEE Transactions on Image Processing, vol. 13, no. 9, pp. 1231–1244, 2004.
[43] C. Zimmer and J.-C. Olivo-Marin, “Coupled parametric active contours,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 11, pp. 1838–1842, 2005.
[44] R. Delgado-Gonzalo, N. Chenouard, and M. Unser, “Fast parametric snakes for 3D microscopy,” in Proceedings of the 9th IEEE International Symposium on Biomedical Imaging, 2012, pp. 852–855.
[45] V. Caselles, F. Catté, T. Coll, and F. Dibos, “A geometric model for active contours in image processing,” Numerische Mathematik, vol. 66, no. 1, pp. 1–31, 1993.
[46] V. Caselles, R. Kimmel, and G. Sapiro, “Geodesic active contours,” International Journal of Computer Vision, vol. 22, no. 1, pp. 61–79, 1997.
[47] T. F. Chan and L. A. Vese, “Active contours without edges,” IEEE Transactions on Image Processing, vol. 10, no. 2, pp. 266–277, 2001.
[48] T. He, H. Mao, J. Guo, Z. Yi, “Cell tracking using deep learning neural networks with multi-task learning,” Image and Vision Computing 2016; 60:142-153.
[49] F. Xing, Y. Xie, H. Su, F. Liu, L. Yang, “Deep Learning in Microscopy Image Analysis: A Survey.” IEEE Transactions on Neural Networks and Learning Systems 2017; 29:4550-4568.
[50] M. Maška, V. Ulman, D. Svoboda, P. Matula, P. Matula, C. Ederra, A. Urbiola, T. España, S. Venkatesan, D.M.W. Balak, P. Karas, T. Bolcková, M. Štreitová, C. Carthel, S. Coraluppi, N. Harder, K. Rohr, K.E.G. Magnusson, J. Jaldén, H.M. Blau, O. Dzyubachyk, P. Křížek, G. M. Hagen, D. Pastor-Escuredo, D. Jimenez-Carretero, M. J. Ledesma-Carbayo, A. Muñoz-Barrutia, E. Meijering, M. Kozubek, C. Ortiz-de-Solorzano. A benchmark for comparison of cell tracking algorithms. Bioinformatics 2014; 30(11):1609-1617.
[51] D. Svoboda, V. Ulman, “MitoGen: A Framework for Generating 3D Synthetic Time-Lapse Sequences of Cell Populations in Fluorescence Microscopy.” IEEE Transactions on Medical Imaging; 36(1):310-321, 2017.
[52] Pa. Matula, M. Maška, D.V. Sorokin, Pv. Matula, C. Ortiz-de-Solorzano, M. Kozubek, “Cell Tracking Accuracy Measurement Based on Comparison of Acyclic Oriented Graphs,” PLoS One 10(12):e0144959, 2015.
[53] F. Amat, W. Lemon, D.P. Mossing, K. McDole, Y. Wan, K. Branson, E.W. Myers, P.J. Keller. Fast, accurate reconstruction of cell lineages from large-scale fluorescence microscopy data. Nature Methods 2014 11(9):951-958.
[54] F. Jug, T. Pietzsch, S. Preibisch, P.Tomancak. Bioimage Informatics in the context of Drosophila research. Methods, 2014; 68(1) 60-73.
[55] V. Ulman, M. Maška, K.E.G. Magnusson, O. Ronneberger, C. Haubold, N. Harder, Pa. Matula, Pe. Matula, D. Svoboda, M. Radojevic, I. Smal, K. Rohr, J. Jaldén, H.M. Blau, O. Dzyubachyk, B. Lelieveldt, P. Xiao, Y. Li, S.-Y. Cho, A.C. Dufour, J.C. Olivo-Marin, C.C. Reyes-Aldasoro, J.A. Solis-Lemus, R. Bensch, T. Brox, J. Stegmaier, R. Mikut, S. Wolf, F.A. Hamprecht, T. Esteves, P. Quelhas, Ö. Demirel, L. Malmström, F. Jug, P. Tomancak, E. Meijering, A. Muñoz-Barrutia, M. Kozubek, C. Ortiz-de-Solorzano, “An objective comparison of cell-tracking algorithms,” Nature Methods 14(12):1141-1152, 2017.
Acknowledgment
We gratefully acknowledge the support of the NVIDIA Corporation and their donation of the Quadro P6000 GPU used for the evaluation of challenge results. We also thank the IT4Innovations National Supercomputing Center for hosting the computations of silver reference segmentation annotations on their cluster.