Estimate Runtime

A convenience script to estimate the runtime overhead of a new optimization method compared to SGD.

By default, this script runs both SGD and the new optimizer 5 times for 3 epochs each on the multi-layer perceptron on MNIST, measuring the wall-clock time of each run. It then outputs the mean runtime overhead of the new optimizer across these runs.
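The underlying measurement is a simple mean-over-runs comparison. The sketch below illustrates the idea only; the function names (`mean_runtime`, `overhead`) are illustrative and not the script's actual internals:

```python
import time

def mean_runtime(train_fn, num_runs=5):
    """Average wall-clock time of `train_fn` over `num_runs` runs."""
    times = []
    for _ in range(num_runs):
        start = time.perf_counter()
        train_fn()  # one full training run (e.g. 3 epochs)
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)

def overhead(new_optimizer_fn, sgd_fn, num_runs=5):
    """Relative runtime overhead of the new optimizer versus SGD.

    Returns e.g. 0.25 if the new optimizer is on average 25% slower.
    """
    return mean_runtime(new_optimizer_fn, num_runs) / mean_runtime(sgd_fn, num_runs) - 1.0
```

Averaging over several runs smooths out timing noise from the OS scheduler and data loading, which is why the script repeats each training run multiple times by default.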

Optionally, the setup can be changed by varying the test problem, the number of epochs, the number of runs, etc., if this allows for a fairer evaluation.


Run a new run script and compare its runtime to SGD.

usage: [-h] [--test_problem TEST_PROBLEM] [--data_dir DATA_DIR]
       [--bs BS] [--lr LR] [-N NUM_EPOCHS] [--num_runs NUM_RUNS]
       [--saveto SAVETO] [--optimizer_args OPTIMIZER_ARGS]
       run_script

Positional Arguments

run_script

Path to the new run script.

Named Arguments

--test_problem

Name of the test problem on which to run both scripts.

Default: "mnist_mlp"

--data_dir

Path to the base data directory. If not set, DeepOBS uses its default.

Default: "data_deepobs"

--bs, --batch_size

The batch size (positive integer).

Default: 128

--lr, --learning_rate

The learning rate used for both SGD and the new optimizer.

Default: 1e-05

-N, --num_epochs

Total number of training epochs per run.

Default: 3

--num_runs

Total number of runs for each optimizer.

Default: 5

--saveto

Folder for saving a txt file with the summary.

--optimizer_args

Additional arguments for the new optimizer.