
Deep Learning Custom Training Loops

Customize deep learning training loops and loss functions

If the trainingOptions function does not provide the training options that you need for your task, or custom output layers do not support the loss functions that you need, then you can define a custom training loop. For networks that cannot be created using layer graphs, you can define custom networks as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
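
The core pattern pairs dlfeval and dlgradient to evaluate the loss and gradients, and an update function such as adamupdate to apply them. The following is a minimal sketch, assuming layers is a layer array without an output layer, mbq is a minibatchqueue over the training data, and modelLoss is a user-defined loss function (all hypothetical names; see the model gradients sketch further down).

net = dlnetwork(layers);

numEpochs = 10;
averageGrad = [];
averageSqGrad = [];
iteration = 0;

for epoch = 1:numEpochs
    shuffle(mbq);
    while hasdata(mbq)
        iteration = iteration + 1;
        [X,T] = next(mbq);

        % Evaluate the loss and gradients using automatic differentiation.
        [loss,gradients] = dlfeval(@modelLoss,net,X,T);

        % Update the learnable parameters with the Adam optimizer.
        [net,averageGrad,averageSqGrad] = adamupdate(net,gradients, ...
            averageGrad,averageSqGrad,iteration);
    end
end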

Functions


dlnetwork - Deep learning network for custom training loops
forward - Compute deep learning network output for training
predict - Compute deep learning network output for inference
adamupdate - Update parameters using adaptive moment estimation (Adam)
rmspropupdate - Update parameters using root mean squared propagation (RMSProp)
sgdmupdate - Update parameters using stochastic gradient descent with momentum (SGDM)
dlupdate - Update parameters using custom function
minibatchqueue - Create mini-batches for deep learning
onehotencode - Encode data labels into one-hot vectors
onehotdecode - Decode probability vectors into class labels
padsequences - Pad or truncate sequence data to same length
initialize - Initialize learnable and state parameters of a dlnetwork
dlarray - Deep learning array for custom training loops
dlgradient - Compute gradients for custom training loops using automatic differentiation
dlfeval - Evaluate deep learning model for custom training loops
dims - Dimension labels of dlarray
finddim - Find dimensions with specified label
stripdims - Remove dlarray data format
extractdata - Extract data from dlarray
isdlarray - Determine whether input is dlarray
functionToLayerGraph - Convert deep learning model function to a layer graph
dlconv - Deep learning convolution
dltranspconv - Deep learning transposed convolution
lstm - Long short-term memory
gru - Gated recurrent unit
embed - Embed discrete data
fullyconnect - Sum all weighted input data and apply a bias
dlode45 - Deep learning solution of nonstiff ordinary differential equation (ODE)
relu - Apply rectified linear unit activation
leakyrelu - Apply leaky rectified linear unit activation
batchnorm - Normalize data across all observations for each channel independently
crosschannelnorm - Cross-channel square-normalize using local responses
groupnorm - Normalize data across grouped subsets of channels for each observation independently
instancenorm - Normalize across each channel for each observation independently
layernorm - Normalize data across all channels for each observation independently
avgpool - Pool data to average values over spatial dimensions
maxpool - Pool data to maximum value
maxunpool - Unpool the output of a maximum pooling operation
softmax - Apply softmax activation to channel dimension
sigmoid - Apply sigmoid activation
crossentropy - Cross-entropy loss for classification tasks
l1loss - L1 loss for regression tasks
l2loss - L2 loss for regression tasks
huber - Huber loss for regression tasks
mse - Half mean squared error
ctc - Connectionist temporal classification (CTC) loss for unaligned sequence classification
dlaccelerate - Accelerate deep learning function for custom training loops
AcceleratedFunction - Accelerated deep learning function
clearCache - Clear accelerated deep learning function trace cache
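
As a minimal sketch of how several of the data-handling functions above fit together, the following creates formatted mini-batches from a datastore. The datastore dsTrain and the preprocessing helper are hypothetical; the targets are returned as one-hot vectors and left unformatted.

mbq = minibatchqueue(dsTrain, ...
    'MiniBatchSize',128, ...
    'MiniBatchFcn',@preprocessMiniBatch, ...
    'MiniBatchFormat',{'SSCB',''});    % images: spatial, spatial, channel, batch

function [X,T] = preprocessMiniBatch(dataX,dataT)
    X = cat(4,dataX{:});                   % concatenate images along the batch dimension
    T = onehotencode(cat(2,dataT{:}),1);   % one-hot encode the categorical labels
end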

Topics

Custom Training Loops

Train Deep Learning Model in MATLAB

Learn how to train deep learning models in MATLAB®.

Define Custom Training Loops, Loss Functions, and Networks

Learn how to define and customize deep learning training loops, loss functions, and networks using automatic differentiation.

Train Network Using Custom Training Loop

This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.

Specify Training Options in Custom Training Loop

Learn how to specify common training options in a custom training loop.
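
As a short sketch of specifying options directly in the loop, the following applies a time-based learning rate decay and passes the learning rate and momentum explicitly to sgdmupdate. The initial learning rate, decay, and momentum values are illustrative assumptions.

initialLearnRate = 0.01;
decay = 0.01;
momentum = 0.9;
velocity = [];

% Inside the training loop, after computing the gradients:
learnRate = initialLearnRate/(1 + decay*iteration);
[net,velocity] = sgdmupdate(net,gradients,velocity,learnRate,momentum);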

Define Model Gradients Function for Custom Training Loop

Learn how to define a model gradients function for a custom training loop.
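
A minimal sketch of such a function, assuming net is a dlnetwork and T contains one-hot encoded targets; evaluate it with dlfeval so that dlgradient can trace the computation.

function [loss,gradients,state] = modelLoss(net,X,T)
    [Y,state] = forward(net,X);            % forward pass, returning the network state
    loss = crossentropy(Y,T);              % classification loss
    gradients = dlgradient(loss,net.Learnables);
end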

Update Batch Normalization Statistics in Custom Training Loop

This example shows how to update the network state in a custom training loop.
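
In sketch form, the pattern is to return the state from the model loss function (as in the sketch above) and assign it back to the network after each iteration:

[loss,gradients,state] = dlfeval(@modelLoss,net,X,T);
net.State = state;    % update batch normalization statistics and other state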

Make Predictions Using dlnetwork Object

This example shows how to make predictions using a dlnetwork object by splitting data into mini-batches.
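
A brief sketch of that pattern, assuming a minibatchqueue mbqTest over the test data and a vector of class names (both hypothetical):

reset(mbqTest);
predictions = [];
while hasdata(mbqTest)
    X = next(mbqTest);
    scores = predict(net,X);               % class probabilities, C-by-B
    Y = onehotdecode(extractdata(scores),classNames,1);
    predictions = [predictions, Y];        % collect the row vector of labels
end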

Train Network on Image and Feature Data

This example shows how to train a network that classifies handwritten digits using both image and feature input data.

Train Network with Multiple Outputs

This example shows how to train a deep learning network with multiple outputs that predict both labels and angles of rotation of handwritten digits.

Classify Videos Using Deep Learning with Custom Training Loop

This example shows how to create a network for video classification by combining a pretrained image classification model and a sequence classification network.

Train Image Classification Network Robust to Adversarial Examples

This example shows how to train a neural network that is robust to adversarial examples using fast gradient sign method (FGSM) adversarial training.

Train Neural ODE Network

This example shows how to train an augmented neural ordinary differential equation (ODE) network.

Train Robust Deep Learning Network with Jacobian Regularization

This example shows how to train a neural network that is robust to adversarial examples using a Jacobian regularization scheme [1].

Solve Ordinary Differential Equation Using Neural Network

This example shows how to solve an ordinary differential equation (ODE) using a neural network.

Assemble Multiple-Output Network for Prediction

This example shows how to assemble a multiple-output network for prediction.

Run Custom Training Loops on a GPU and in Parallel

Speed up custom training loops by running on a GPU, in parallel using multiple GPUs, or on a cluster.

Model Functions

Train Network Using Model Function

This example shows how to create and train a deep learning network by using functions rather than a layer graph or a dlnetwork.
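
A minimal sketch of a model defined as a function rather than a dlnetwork, assuming parameters is a structure of dlarray weights (see the initialization topic below) and X is a formatted dlarray. The field names fc1 and fc2 are illustrative.

function Y = model(parameters,X)
    Y = fullyconnect(X,parameters.fc1.Weights,parameters.fc1.Bias);
    Y = relu(Y);
    Y = fullyconnect(Y,parameters.fc2.Weights,parameters.fc2.Bias);
    Y = softmax(Y);                        % probabilities over the channel dimension
end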

Update Batch Normalization Statistics Using Model Function

This example shows how to update the network state in a network defined as a function.

Make Predictions Using Model Function

This example shows how to make predictions using a model function by splitting data into mini-batches.

Initialize Learnable Parameters for Model Function

Learn how to initialize learnable parameters for custom training loops using a model function.
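
A sketch of initializing the parameters for the model function above, using Glorot (Xavier) initialization for the weights and zeros for the biases. The layer sizes and the initializeGlorot helper are illustrative assumptions.

numInputs = 784; numHidden = 128; numClasses = 10;

parameters.fc1.Weights = dlarray(initializeGlorot(numHidden,numInputs));
parameters.fc1.Bias    = dlarray(zeros(numHidden,1,'single'));
parameters.fc2.Weights = dlarray(initializeGlorot(numClasses,numHidden));
parameters.fc2.Bias    = dlarray(zeros(numClasses,1,'single'));

function weights = initializeGlorot(numOut,numIn)
    bound = sqrt(6/(numIn + numOut));                    % Glorot uniform bound
    weights = bound*(2*rand(numOut,numIn,'single') - 1);
end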

Automatic Differentiation

List of Functions with dlarray Support

View the list of functions that support dlarray objects.

Automatic Differentiation Background

Learn how automatic differentiation works.

Use Automatic Differentiation In Deep Learning Toolbox

How to use automatic differentiation in deep learning.
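
A minimal sketch of the mechanism: dlgradient must be called inside a function that dlfeval evaluates, so that the computation is traced on dlarray inputs.

x = dlarray([1 2 3]);
[y,grad] = dlfeval(@sumOfSquares,x);       % grad is 2*x, that is [2 4 6]

function [y,grad] = sumOfSquares(x)
    y = sum(x.^2);
    grad = dlgradient(y,x);
end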

Deep Learning Function Acceleration

Deep Learning Function Acceleration for Custom Training Loops

Accelerate model functions and model gradients functions for custom training loops by caching and reusing traces.
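
A sketch of the basic usage, assuming modelLoss is the loss function from the custom training loop sketches above; call the accelerated function through dlfeval exactly as before.

accfun = dlaccelerate(@modelLoss);
clearCache(accfun)                          % optionally start from an empty trace cache

[loss,gradients,state] = dlfeval(accfun,net,X,T);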

Accelerate Custom Training Loop Functions

This example shows how to accelerate deep learning custom training loop and prediction functions.

Check Accelerated Deep Learning Function Outputs

This example shows how to check that the outputs of accelerated functions match the outputs of the underlying function.
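
In sketch form, one way to check is to evaluate both versions on the same inputs and compare:

loss1 = dlfeval(@modelLoss,net,X,T);
loss2 = dlfeval(accfun,net,X,T);
max(abs(extractdata(loss1) - extractdata(loss2)))   % expect zero or within floating-point tolerance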

Evaluate Performance of Accelerated Deep Learning Function

This example shows how to evaluate the performance gains of using an accelerated function.

Featured Examples