FINAL CALL FOR PAPER OFFERINGS for the LREC Shared Task on Reproducibility

Did you author a recent paper in the area of Natural Language Processing or Computational Linguistics? Are you interested in giving your paper further visibility, and curious how well other researchers are able to reproduce your results?

Please consider offering your paper for the upcoming Shared Task on Reproducibility!

BACKGROUND

Scientific knowledge is grounded in falsifiable predictions, and thus its credibility and raison d'être rely on the possibility of repeating experiments and obtaining results similar to those originally reported. In many young scientific areas, including ours, the reproduction of results needs to be much more widely acknowledged and promoted.

LREC SESSION ON REPRODUCIBILITY

For this reason, a special reproducibility session will be included in the LREC 2020 regular program (side by side with sessions on other topics) for papers on reproducibility, and a specific community-wide exercise will be launched to elicit and encourage work on reproduction: a Shared Task on Reproducibility.

REPRODUCIBILITY AND REPLICATION

We adhere to the terminology adopted in the introductory note of the special section on replicability and reproducibility of the Language Resources and Evaluation journal (March 2017, Volume 51, Issue 1, pp 1-5), which follows Stodden (2014): "Replication, the practice of independently implementing scientific experiments to validate specific findings, is the cornerstone of discovering scientific truth. Related to replication is reproducibility, which is the calculation of quantitative scientific results by independent scientists using the original datasets and methods." For ease of reference, and when there is no need to be more explicit, we will use the terms "reproduction" and "reproducibility".

COOPERATIVE SHARED TASK

The shared task is of a new type. It is partly similar to the usual competitive shared tasks, in the sense that all participants share a common goal; but it is partly different from previous shared tasks, in the sense that the focus is on seeking support and confirmation of previous results, rather than on surpassing those results with superior ones. Thus, instead of a competitive shared task, with participants striving for a top system that scores as far above a rough-and-ready baseline as possible, this will be a "cooperative shared task", striving for a system that reproduces as closely as possible the results of an original research experiment.

Accepted papers reporting the findings of shared task participants will be published in the regular LREC Proceedings.

CALL FOR PAPER OFFERINGS

For this cooperative shared task, the organizing committee will select a small set of research papers. Since this is the first time such a shared task is being organized, it seems advisable to enlist only original papers whose authors explicitly accept that their paper becomes the subject of this exercise.

We are therefore asking authors to offer their paper as a candidate target paper for the Shared Task on Reproducibility. This is a great opportunity for your paper to receive additional attention and recognition from the research community. We are seeking offerings by authors of papers that meet the following requirements:

SELECTION OF PAPERS

The selection committee will select a small number of target papers, based on the criteria listed above, taking into account balance so that there is sufficient variety with respect to research field, computational and algorithmic techniques, target language(s), etc.

DETAILS

If you would like your paper to be considered for the Shared Task on Reproducibility, please submit a short note (200-400 words) stating

Please email your note as a PDF attachment to the following address:

ReproSharedTaskSelection@gmail.com

Deadline for paper offerings:

January 15, 2019

STEERING COMMITTEE

SELECTION COMMITTEE

The latest version of this call is available at www.let.rug.nl/vannoord/Repro/.