Layers

Morph provides built-in neural network layers through the NN module.


Available Layers

Layer                       Description               Creation
NN.Linear(in, out)          Fully-connected layer     NN.Linear(784, 128)
NN.Conv2D(in, out, kernel)  2D convolution            NN.Conv2D(3, 16, 3)
NN.BatchNorm(features)      Batch normalization       NN.BatchNorm(128)
NN.Dropout(rate)            Dropout regularization    NN.Dropout(0.5)

Layer Usage

layer is NN.Linear(784, 128);
output is layer.Forward(input);
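Under the hood, a fully-connected layer's Forward computes y = xW + b. A minimal pure-Python sketch of that computation (the dimensions here are kept tiny for readability; Morph's actual internals are not specified in this page):

```python
def linear_forward(x, W, b):
    # x: vector of in_features values
    # W: in_features x out_features weight matrix
    # b: vector of out_features biases
    # Returns y where y[j] = sum_i x[i] * W[i][j] + b[j]
    return [sum(x[i] * W[i][j] for i in range(len(x))) + b[j]
            for j in range(len(b))]

x = [1.0, 2.0, 3.0]
W = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
b = [0.5, -0.5]
print(linear_forward(x, W, b))  # → [4.5, 4.5]
```

An NN.Linear(784, 128) layer is the same computation with a 784 x 128 weight matrix and 128 biases.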

Activation Functions

Function        Usage
NN.ReLU(x)      Rectified Linear Unit
NN.Sigmoid(x)   Sigmoid activation
NN.Tanh(x)      Hyperbolic tangent
NN.Softmax(x)   Softmax normalization
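These functions have standard element-wise definitions (softmax normalizes over the whole vector). Reference sketches in pure Python, assuming the conventional formulas rather than anything Morph-specific:

```python
import math

def relu(xs):
    # max(0, x) element-wise
    return [max(0.0, x) for x in xs]

def sigmoid(xs):
    # 1 / (1 + e^-x), squashes each value into (0, 1)
    return [1.0 / (1.0 + math.exp(-x)) for x in xs]

def tanh(xs):
    # hyperbolic tangent, squashes each value into (-1, 1)
    return [math.tanh(x) for x in xs]

def softmax(xs):
    # e^x / sum(e^x); subtracting the max is a standard numerical-stability trick
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(relu([-1.0, 2.0]))              # → [0.0, 2.0]
print(sum(softmax([1.0, 2.0, 3.0])))  # softmax outputs always sum to 1.0
```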

Building a Network

MyNetwork class {
    conv1 is NN.Conv2D(1, 32, 3);
    bn1 is NN.BatchNorm(32);
    fc1 is NN.Linear(32 * 26 * 26, 10);
    drop1 is NN.Dropout(0.25);

    Forward method(x as Tensor<float>) as Tensor<float> {
        x is NN.ReLU(bn1.Forward(conv1.Forward(x)));
        x is drop1.Forward(x);
        return NN.Softmax(fc1.Forward(x));
    }
}
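The fc1 size in the example follows from standard convolution arithmetic: with a 3 x 3 kernel, stride 1, and no padding, each spatial dimension shrinks by 2, so a 28 x 28 input (an assumption here, e.g. MNIST) yields 26 x 26 feature maps, and 32 channels flatten to 32 * 26 * 26 inputs for the linear layer. A quick check:

```python
def conv_output_size(in_size, kernel, stride=1, padding=0):
    # Standard "valid" convolution output-size formula
    return (in_size + 2 * padding - kernel) // stride + 1

side = conv_output_size(28, 3)   # 28 x 28 input, 3 x 3 kernel
channels = 32
print(side)                      # → 26
print(channels * side * side)    # → 21632, i.e. NN.Linear(32 * 26 * 26, 10)
```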

Next Steps

  • Training — Training loop and optimization