experiments.Monitor

Update results table and training plots for custom training experiments

    Description

    When running a custom training experiment in Experiment Manager, use an experiments.Monitor object to track the progress of the training, update information fields in the results table, record values of the metrics used by the training, and produce training plots. For more information on custom training experiments, see Configure Custom Training Experiment.

    Creation

    When you run a custom training experiment, Experiment Manager creates an experiments.Monitor object for each trial of your experiment. Access the object as the second input argument of the training function.

    Alternatively, to debug a custom training experiment, call the experiments.Monitor function to create an object that you can use to run your training function in the MATLAB® Command Window. For more information, see Debug Experiments for Deep Learning.

    Description

    monitor = experiments.Monitor creates an experiments.Monitor object that you can use to diagnose problems in your training function.
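    For reference, here is a minimal sketch of a training function that receives the hyperparameter structure and the monitor as its two input arguments. The function name, output, and body are placeholders, not part of the experiments.Monitor interface.

    function net = Experiment1_training1(params,monitor)
    % params  - structure of hyperparameter values for the current trial
    % monitor - experiments.Monitor object supplied by Experiment Manager
    monitor.Status = "Starting";
    net = [];   % placeholder for the trained network
    end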

    Properties

    Status

    Training status for a trial, specified as a string or character vector.

    Example: monitor.Status = "Loading Data";

    Data Types: char | string

    Progress

    Training progress for a trial, specified as a numeric scalar between 0 and 100.

    Example: monitor.Progress = 17;

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64 | fi

    Info

    Information column names, specified as a string, character vector, string array, or cell array of character vectors. Valid names begin with a letter and can contain letters, digits, and underscores. These names appear as column headers in the experiment results table. The values in the information columns do not appear in the training plot.

    You can set this value only once in your training function.

    Example: monitor.Info = ["GradientDecayFactor","SquaredGradientDecayFactor"];

    Data Types: char | string

    Metrics

    Metric column names, specified as a string, character vector, string array, or cell array of character vectors. Valid names begin with a letter and can contain letters, digits, and underscores. These names appear as column headers in the experiment results table. Additionally, each metric appears in its own training subplot. To plot more than one metric in a single subplot, use the groupSubPlot function.

    You can set this value only once in your training function.

    Example: monitor.Metrics = ["TrainingLoss","ValidationLoss"];

    Data Types: char | string

    XLabel

    Horizontal axis label in the training plot, specified as a string or character vector.

    Set this value before calling the recordMetrics function.

    Example: monitor.XLabel = "Iteration";

    Data Types: char | string

    Stop

    This property is read-only.

    Flag to stop trial, specified as a numeric or logical 1 (true) or 0 (false). The value of this property changes to true when you click Stop in the Experiment Manager toolstrip or the results table.

    Data Types: logical
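
    For example, a custom training loop can check this property at each iteration so that clicking Stop ends the trial cleanly. This is a sketch; numIterations is a hypothetical variable and the loop body is a placeholder.

    numIterations = 100;   % hypothetical iteration budget
    iteration = 0;
    while ~monitor.Stop && iteration < numIterations
        iteration = iteration + 1;
        % ... run one training iteration and record metrics here ...
    end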

    Object Functions

    groupSubPlot - Group metrics in experiment training plot
    recordMetrics - Record metric values in experiment results table and training plot
    updateInfo - Update information columns in experiment results table

    Examples

    Use an experiments.Monitor object to track the progress of the training, display information and metric values in the experiment results table, and produce training plots for custom training experiments.

    Before starting the training, specify the names of the information and metric columns of the Experiment Manager results table.

    monitor.Info = ["GradientDecayFactor","SquaredGradientDecayFactor"];
    monitor.Metrics = ["TrainingLoss","ValidationLoss"];

    Specify the horizontal axis label for the training plot. Group the training and validation loss in the same subplot.

    monitor.XLabel = "Iteration";
    groupSubPlot(monitor,"Loss",["TrainingLoss","ValidationLoss"]);

    Update the values of the gradient decay factor and the squared gradient decay factor for the trial in the results table.

    updateInfo(monitor, ...
        GradientDecayFactor=gradientDecayFactor, ...
        SquaredGradientDecayFactor=squaredGradientDecayFactor);

    After each iteration of the custom training loop, record the values of the training and validation loss for the trial in the results table and the training plot.

    recordMetrics(monitor,iteration, ...
        TrainingLoss=trainingLoss, ...
        ValidationLoss=validationLoss);

    Update the training progress for the trial based on the fraction of iterations completed.

    monitor.Progress = (iteration/numIterations) * 100;
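
    Putting these steps together, a simplified training loop might look like the following sketch. The loss values are placeholders; in a real training function they come from evaluating your model.

    monitor.Status = "Training";
    numIterations = 100;   % hypothetical iteration budget
    for iteration = 1:numIterations
        trainingLoss = rand;      % placeholder value
        validationLoss = rand;    % placeholder value
        recordMetrics(monitor,iteration, ...
            TrainingLoss=trainingLoss, ...
            ValidationLoss=validationLoss);
        monitor.Progress = (iteration/numIterations) * 100;
        if monitor.Stop
            break
        end
    end
    monitor.Status = "Done";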

    To debug a custom training experiment, open the training function and set breakpoints as described in Set Breakpoints. Then, in the MATLAB Command Window, call the training function.

    Create a structure with a field for each hyperparameter defined in the experiment. Assign a value to each field from the range indicated for the corresponding hyperparameter. For example, if your experiment has two hyperparameters called WeightsInitializer and BiasInitializer, enter the following.

    params = struct(WeightsInitializer="he", ...
        BiasInitializer="narrow-normal");

    Create an experiments.Monitor object.

    monitor = experiments.Monitor;

    Call the training function using the hyperparameter structure and the experiments.Monitor object as the inputs to the function. For example, if your training function is called ImageComparisonExperiment_training1, enter the following.

    net = ImageComparisonExperiment_training1(params,monitor);

    MATLAB pauses at each line of code indicated by a breakpoint. When the function execution stops at the breakpoint, you can view the values of your variables, step through the code line by line, or continue to the next breakpoint. For more information, see Debug Experiments for Deep Learning.

    Tips

    • Both information and metric columns display values in the results table for your experiment. Additionally, the training plot shows a record of the metric values. Use information columns for text and for numerical values that you want to display in the results table but not in the training plot.

    Version History

    Introduced in R2021a