The datasets consist of 2D and 3D time-lapse video sequences of fluorescently counterstained nuclei or cells moving on top of, or immersed in, a substrate, along with 2D phase contrast and differential interference contrast (DIC) microscopy videos of cells moving on a flat substrate. The videos cover a wide range of cell types and quality levels (spatial and temporal resolution, noise, etc.). In addition, we provide 2D and 3D videos of synthetic fluorescently stained nuclei moving in a realistic way, with increasing cell density and noise levels.
All 2D+t and 3D+t datasets are available for download to registered participants. Please check the “Conditions of use of the images” at the end of this page carefully before downloading any of these datasets. For technical support, please contact Martin Maška (email@example.com).
The ground truth, consisting of manually annotated videos (segmentation) and acyclic graphs (detection and tracking), was generated following the annotation guidelines described in the following document:
and both the original and the ground truth files were named and created following the conventions described in:
Conditions of use of the images:
2) Registering for the challenge does not oblige you in any way to submit results. Your registration is considered an expression of interest to participate and allows you to download the datasets and submit results.
3) Ideally, we encourage all participants to submit results for all datasets, to obtain a complete picture of the strengths and weaknesses of each algorithm under different scenarios. However, given the varying nature of the datasets (nuclei or cells, different cell types, 2D or 3D, noise level, and general image quality) and microscopy modalities (fluorescence, phase contrast, DIC), you may also submit results for only certain datasets, or submit more than one algorithm, each targeting one or several specific datasets.
4) All participating teams wishing to be included in the challenge report and in any future publication will be required to provide a working version of the algorithm used to produce the submitted results, either as an executable or as compilable source code. The challenge committee reserves the right to perform random checks that verify submitted results by rerunning the algorithms on the challenge datasets. The provided software will not be released publicly without the participants' consent; it will be used only for verification purposes.
5) To encourage the participation of groups that might be discouraged by the public display of potentially poor results, the rankings will display only the names of the top three ranked participants for each dataset. Lower-ranked algorithms will not be listed, but their participants will be informed of the absolute performance of their algorithms. In any case, this absolute performance will be integrated into the rankings only after the participants give the challenge organizers explicit permission to do so.