Plot Results

A convenience script to extract useful information from the results created by the runners.

This script can produce one or all of the following outputs:

  • Get best run: Returns the best hyperparameter setting for each optimizer on each test problem.
  • Plot learning rate sensitivity: Creates a plot for each test problem showing the relative performance of each optimizer across learning rates, giving a sense of how difficult the tuning process was.
  • Plot performance: Creates a plot for the small and the large benchmark set, showing (if available) all four performance metrics (losses and accuracies for both the test and the training data set) for each optimizer.
  • Plot table: Creates the overall performance table for the small and the large benchmark set, including metrics for the performance, speed, and tunability of each optimizer on each test problem.

By default, this script also plots the baseline results for SGD, Momentum, and Adam; this can be turned off with the --ignore_baselines flag.

Usage:

Plotting tool for DeepOBS.

usage: deepobs_plot_results.py [-h] [--get_best_run] [--plot_lr_sensitivity]
                               [--plot_performance] [--plot_table] [--full]
                               [--ignore_baselines]
                               path

Positional Arguments

path Path to the results folder

Named Arguments

--get_best_run

Return the best hyperparameter setting per optimizer and test problem.

Default: False

--plot_lr_sensitivity

Plot the learning rate 'sensitivity' plot.

Default: False

--plot_performance

Plot the performance plot, compared against the baselines.

Default: False

--plot_table

Plot overall performance table including speed and hyperparameters.

Default: False

--full

Run a full analysis and plot all figures.

Default: False

--ignore_baselines

Ignore the baselines and plot only the results from the given folder.

Default: False
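When calling the script from another Python program (e.g. via subprocess), the documented flags can be assembled into an argument list programmatically. The helper below is a hypothetical sketch (it is not part of DeepOBS); it only mirrors the arguments listed above:

```python
# Hypothetical helper (not part of DeepOBS): builds the argument list for
# deepobs_plot_results.py from the flags documented above.
def build_plot_command(path, get_best_run=False, plot_lr_sensitivity=False,
                       plot_performance=False, plot_table=False, full=False,
                       ignore_baselines=False):
    cmd = ["deepobs_plot_results.py"]
    flags = [
        ("--get_best_run", get_best_run),
        ("--plot_lr_sensitivity", plot_lr_sensitivity),
        ("--plot_performance", plot_performance),
        ("--full", full),
        ("--plot_table", plot_table),
        ("--ignore_baselines", ignore_baselines),
    ]
    for name, enabled in flags:
        if enabled:
            cmd.append(name)
    cmd.append(path)  # positional argument: path to the results folder
    return cmd


# Run a full analysis without the baselines, e.g. via
# subprocess.run(build_plot_command("results", full=True, ignore_baselines=True))
print(build_plot_command("results", full=True, ignore_baselines=True))
```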