
DAGNetwork

Directed acyclic graph (DAG) network for deep learning

Description

A DAG network is a neural network for deep learning with layers arranged as a directed acyclic graph. Unlike a network whose layers are arranged in a single chain, a DAG network can have a more complex architecture in which layers receive input from multiple layers and pass output to multiple layers.

Creation

There are several ways to create a DAGNetwork object. For example, train a network specified as a layer graph using trainNetwork, as in the example below, or load a pretrained network such as GoogLeNet.

Note

To learn about other pretrained networks, see Pretrained Deep Neural Networks.
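For example (a minimal sketch; googlenet assumes that the Deep Learning Toolbox Model for GoogLeNet Network support package is installed), you can load a pretrained network directly:

net = googlenet;   % pretrained GoogLeNet, returned as a DAGNetwork object
class(net)         % 'DAGNetwork'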

Properties


Layers

This property is read-only.

Network layers, specified as a Layer array.

Connections

This property is read-only.

Layer connections, specified as a table with two columns.

Each table row represents a connection in the layer graph. The first column, Source, specifies the source of each connection. The second column, Destination, specifies the destination of each connection. The connection sources and destinations are either layer names or have the form 'layerName/IOName', where 'IOName' is the name of the layer input or output.

Data Types: table
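For example (a sketch, assuming a trained DAGNetwork net such as the one created in the example below), you can inspect the connection table directly:

net.Connections(1:3,:)   % first three rows, with Source and Destination columns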

InputNames

This property is read-only.

Names of the input layers, specified as a cell array of character vectors.

Data Types: cell

OutputNames

This property is read-only.

Names of the output layers, specified as a cell array of character vectors.

Data Types: cell
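As a quick illustration (a sketch, assuming a trained DAGNetwork net), query both name properties directly:

net.InputNames    % for the network in the example below: {'imageinput'}
net.OutputNames   % for the network in the example below: {'classoutput'}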

Object Functions

activations    Compute deep learning network layer activations
classify       Classify data using trained deep learning neural network
predict        Predict responses using trained deep learning neural network
plot           Plot neural network architecture
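A hedged sketch of typical usage, assuming a trained DAGNetwork net, input data X sized to match the network input layer, and a layer named 'relu_1' (as in the example below):

scores = predict(net,X);              % predicted class scores
labels = classify(net,X);             % categorical class labels
feat = activations(net,X,'relu_1');   % activations of a named layer
plot(net)                             % plot the network architecture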

Examples


Create a simple directed acyclic graph (DAG) network for deep learning. Train the network to classify images of digits. The simple network in this example consists of:

  • A main branch with layers connected sequentially.

  • A shortcut connection containing a single 1-by-1 convolutional layer. Shortcut connections enable the parameter gradients to flow more easily from the output layer to the earlier layers of the network.

Create the main branch of the network as a layer array. The addition layer sums multiple inputs element-wise. Specify the number of inputs for the addition layer to sum. To easily add connections later, specify names for the first ReLU layer and the addition layer.

layers = [
    imageInputLayer([28 28 1])
    
    convolution2dLayer(5,16,'Padding','same')
    batchNormalizationLayer
    reluLayer('Name','relu_1')
    
    convolution2dLayer(3,32,'Padding','same','Stride',2)
    batchNormalizationLayer
    reluLayer
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    
    additionLayer(2,'Name','add')
    
    averagePooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

Create a layer graph from the layer array. layerGraph connects all the layers in layers sequentially. Plot the layer graph.

lgraph = layerGraph(layers);
figure
plot(lgraph)


Create the 1-by-1 convolutional layer and add it to the layer graph. Specify the number of convolutional filters and the stride so that the activation size matches the activation size of the third ReLU layer: the 'relu_1' activations are 28-by-28-by-16, and a 1-by-1 convolution with 32 filters and a stride of 2 produces 14-by-14-by-32 activations, matching the output of the third ReLU layer. This arrangement enables the addition layer to add the outputs of the third ReLU layer and the 1-by-1 convolutional layer. To check that the layer is in the graph, plot the layer graph.

skipConv = convolution2dLayer(1,32,'Stride',2,'Name','skipConv');
lgraph = addLayers(lgraph,skipConv);
figure
plot(lgraph)


Create the shortcut connection from the 'relu_1' layer to the 'add' layer. Because you specified two as the number of inputs to the addition layer when you created it, the layer has two inputs named 'in1' and 'in2'. The third ReLU layer is already connected to the 'in1' input. Connect the 'relu_1' layer to the 'skipConv' layer and the 'skipConv' layer to the 'in2' input of the 'add' layer. The addition layer now sums the outputs of the third ReLU layer and the 'skipConv' layer. To check that the layers are connected correctly, plot the layer graph.

lgraph = connectLayers(lgraph,'relu_1','skipConv');
lgraph = connectLayers(lgraph,'skipConv','add/in2');
figure
plot(lgraph);


Load the training and validation data, which consists of 28-by-28 grayscale images of digits.

[XTrain,YTrain] = digitTrain4DArrayData;
[XValidation,YValidation] = digitTest4DArrayData;
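
As an optional check (a sketch), confirm that the data dimensions match the 28-by-28-by-1 image input layer:

size(XTrain)   % 4-D array: 28-by-28-by-1-by-numImages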

Specify training options and train the network. trainNetwork validates the network using the validation data every ValidationFrequency iterations.

options = trainingOptions('sgdm', ...
    'MaxEpochs',8, ...
    'Shuffle','every-epoch', ...
    'ValidationData',{XValidation,YValidation}, ...
    'ValidationFrequency',30, ...
    'Verbose',false, ...
    'Plots','training-progress');
net = trainNetwork(XTrain,YTrain,lgraph,options);

Figure: training progress, with loss and accuracy (%) plotted against iteration.

Display the properties of the trained network. The network is a DAGNetwork object.

net
net = 
  DAGNetwork with properties:

         Layers: [16x1 nnet.cnn.layer.Layer]
    Connections: [16x2 table]
     InputNames: {'imageinput'}
    OutputNames: {'classoutput'}

Classify the validation images and calculate the accuracy. The network classifies more than 99% of the validation images correctly.

YPredicted = classify(net,XValidation);
accuracy = mean(YPredicted == YValidation)
accuracy = 0.9934
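
To look beyond a single accuracy value (a sketch; confusionchart assumes R2018b or later), summarize the per-class results in a confusion matrix chart:

figure
confusionchart(YValidation,YPredicted)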


Version History

Introduced in R2017b