MATLAB Answers

Define custom fully connected layer

KOOKMIN UNIV on 18 Oct 2018
Hi. I am trying to define a custom fully connected layer after some convolution layers.
classdef fully < nnet.layer.Layer
    properties
        % (Optional) Layer properties go here
    end

    properties (Learnable)
        % Layer learnable parameters
        W;
        b;
    end

    methods
        function layer = fully(input_shape, output_shape, name)
            % Constructor - must have the same name as the layer class
            layer.Name = name;
            layer.Description = 'fully self';
            layer.b = rand([1 output_shape]);
            layer.W = rand([input_shape, output_shape]);
        end

        function Z = predict(layer, X)
            % Forward input data through the layer at prediction time
            % Inputs:
            %   layer - Layer to forward propagate through
            %   X     - Input data
            % Output:
            %   Z     - Output of the layer forward function

            % Debug output to inspect the shapes of W and X
            disp('W')
            size(layer.W)
            disp('X')
            size(X)

            Z = X*layer.W + layer.b;
        end

        % function [Z, memory] = forward(layer, X)
        %     % (Optional) Forward input data through the layer at training
        %     % time and output the result and a memory value that can be
        %     % used for backward propagation
        %     Z = X*layer.W + layer.b;
        %     memory.W = layer.W;
        %     memory.b = layer.b;
        % end

        function [dLdX, dLdW, dLdB] = backward(layer, X, Z, dLdZ, memory)
            % Backward propagate the derivative of the loss function
            % through the layer
            % Inputs:
            %   layer  - Layer to backward propagate through
            %   X      - Input data
            %   Z      - Output of the layer forward function
            %   dLdZ   - Gradient propagated from the deeper layer
            %   memory - Memory value from forward propagation
            % Outputs:
            %   dLdX       - Derivative of the loss with respect to the input
            %   dLdW, dLdB - Derivatives of the loss with respect to the
            %                learnable parameters W and b

            % Gradients for Z = X*W + b, assuming X is observations-by-features
            dLdX = dLdZ * layer.W';
            dLdW = X' * dLdZ;
            dLdB = sum(dLdZ, 1);
        end
    end
end
My question is: how can this layer determine the output shape of the previous layer (the convolution layer just before it) so that I can initialize the parameter W with the correct size? Thank you so much.
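One workaround, sketched below under the assumption of a single convolution layer whose filter size, stride, and padding are known, is to compute the flattened output size of the convolution by hand before constructing the layer and pass it in as input_shape (all of the sizes used here are only an illustration, not my real network):

% Illustrative network in front of the custom layer (sizes are examples only)
imageSize  = [28 28 1];   % input image: height, width, channels
filterSize = [5 5];       % convolution filter size
numFilters = 16;          % number of convolution filters
stride     = [1 1];
padding    = [0 0];       % no padding

% Spatial output size of the convolution layer:
% out = floor((in + 2*pad - filter)/stride) + 1
convOutH = floor((imageSize(1) + 2*padding(1) - filterSize(1))/stride(1)) + 1;
convOutW = floor((imageSize(2) + 2*padding(2) - filterSize(2))/stride(2)) + 1;

% Flattened feature count seen by the fully connected layer
input_shape  = convOutH * convOutW * numFilters;
output_shape = 10;        % e.g. number of classes

layers = [
    imageInputLayer(imageSize)
    convolution2dLayer(filterSize, numFilters, 'Stride', stride)
    reluLayer
    fully(input_shape, output_shape, 'fc_custom')
    softmaxLayer
    classificationLayer];

This only shows how I compute the size by hand; I would still prefer a way for the layer itself to pick up the shape from the previous layer automatically.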


Answers (0)
