Submission of Results

Please upload your results to the challenge FTP server (ftps:// on port 21 with explicit TLS/SSL encryption enabled). The results and software executables must comply with the guidelines described in the following two documents:

Submission of results and executables.pdf

Naming and file content conventions.pdf
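For reference, an explicit-FTPS upload of the kind described above can be sketched with Python's standard `ftplib` module. The hostname, credentials, and archive name below are placeholders, not the actual challenge server details:

```python
from ftplib import FTP_TLS

def upload_results(host, user, password, archive_path):
    """Upload a results archive via explicit FTPS (AUTH TLS on port 21)."""
    ftps = FTP_TLS(host)        # plain connection on port 21
    ftps.login(user, password)  # login upgrades to TLS via AUTH before sending credentials
    ftps.prot_p()               # encrypt the data channel as well
    with open(archive_path, "rb") as fh:
        ftps.storbinary(f"STOR {archive_path}", fh)
    ftps.quit()

# upload_results("ftp.example.org", "user", "secret", "results.zip")
```

Note that `prot_p()` is required; without it only the control channel is encrypted and the file contents travel in plain text.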

Ideally, we encourage all participants to submit results for all datasets, to get a complete picture of the strengths and weaknesses of all algorithms under different scenarios. However, given the varying nature of the datasets (nuclei or cells, different cell types, 2D or 3D, noise level and general image quality) and microscopy modalities (fluorescence, phase contrast, DIC, brightfield), you may also submit results for only certain datasets, or submit more than one algorithm, each targeting one or several specific datasets.

All participants are required to provide a working version of the algorithm used to produce the submitted results, either as an executable or as compilable source code, allowing the challenge organizers to validate all submitted results by rerunning the algorithms on the challenge datasets. The provided software will not be released publicly without the participants' consent; it will be used for validation purposes only. More details about the visibility of individual submissions can be found here.

Please note that all results submitted to the Cell Tracking Benchmark are automatically evaluated within the Cell Segmentation Benchmark as well, except for the Fluo-N3DL-DRO, Fluo-N3DL-TRIC, and Fluo-N3DL-TRIF datasets, for which only a subset of cells is evaluated and the two benchmarks therefore differ in how they treat extra detected and segmented cells. For these datasets, participants are encouraged to additionally submit complete segmentation results, which will automatically be filtered by our evaluation software and used for the Cell Segmentation Benchmark.

The evaluation of submitted results is performed on multi-core workstations with at least 32 GB of RAM, running either Microsoft Windows or a GNU-derived Linux operating system, each equipped with a single NVIDIA Quadro P6000 GPU card with 24 GB of RAM. The challenge organizers reserve the right to exclude results whose reproduction takes more than a few days on the mentioned workstations.

The submissions for both the Cell Tracking Benchmark and the Cell Segmentation Benchmark are evaluated monthly, by processing all results received by the end of each month.