Reproducibility of Results

Participants will be asked to submit a detailed description of how their forecasts were made, along with a source or execution file for reproducing the forecasts of 100 randomly selected series.

Given the critical importance of objectivity and reproducibility, this description and file will be mandatory for participating in the Competition.

To qualify for the Full Reproducibility Prize, participants (with the exception of companies providing forecasting services and those claiming proprietary software) must put the code used for generating their forecasts on GitHub no later than 10 days after the end of the competition (i.e., by the 10th of June, 2018), together with instructions on how to exactly reproduce the M4 submitted forecasts. Individuals and companies will then be able to use this code and the accompanying instructions, crediting the person or group that developed them, to improve their own organizational forecasts.
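To illustrate what such a reproducibility deliverable might look like, here is a minimal, purely hypothetical sketch of a reproduction entry point: a deterministic script that regenerates forecasts for a set of series and writes them to a submission file. The file layout, the forecast horizon, the `naive_forecast` method, and all names below are illustrative assumptions, not part of the actual M4 rules or any participant's method.

```python
import csv


def naive_forecast(series, horizon):
    """Repeat the last observed value `horizon` times (fully deterministic)."""
    return [series[-1]] * horizon


def reproduce(series_by_id, horizon, out_path):
    """Regenerate forecasts for every series and write a submission CSV.

    Iterating over the series ids in sorted order makes the output file
    byte-for-byte reproducible across runs.
    """
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id"] + [f"F{i}" for i in range(1, horizon + 1)])
        for sid in sorted(series_by_id):
            writer.writerow([sid] + naive_forecast(series_by_id[sid], horizon))


if __name__ == "__main__":
    # Tiny synthetic example in place of the real competition data.
    demo = {"Y1": [10.0, 12.0, 11.0], "Y2": [5.0, 7.0]}
    reproduce(demo, horizon=6, out_path="submission.csv")
```

A participant's README would then need only a single documented command (e.g., `python reproduce.py`) so that the organisers can regenerate the submitted forecasts exactly.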

Companies providing forecasting services and those claiming proprietary software will instead have to provide the organisers with a detailed description of how their forecasts were made, together with a source or execution file for reproducing their forecasts of 100 randomly selected series. If the source program must be kept confidential, an execution file can be submitted or, alternatively, a source program with a termination date after which it can no longer be run.

The code for reproducing the results of the 4Theta method, submitted by the Forecasting & Strategy Unit, will be put on GitHub before the 27th of December, 2017. This method will not be considered for any of the Prizes.

The GitHub repository, including the code for reproducing the results of the benchmarks and the participating methods, as well as for evaluating their results, can be found here.

September 28, 2017