Core Functions
Model Creation
- `chain` - Create a neural network model
- `parameters` - Access model parameters
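A minimal sketch of model creation, assuming `chain` takes an input layer followed by subsequent layers and that `parameters` returns the model's trainable parameters (the keyword names and exact signatures are assumptions, not confirmed by this page):

```julia
using SimpleNNs

# Sketch: a small fully connected model (activation keyword is an assumption)
model = chain(
    Static(2),                    # input layer: 2 features
    Dense(16, activation=relu),   # hidden layer
    Dense(1, activation=sigmoid), # output layer
)

ps = parameters(model)  # access the model's trainable parameters
```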
Layer Types
- `Static` - Input layer specifying dimensions
- `Dense` - Fully connected layer
- `Conv` - Convolutional layer
- `MaxPool` - Max pooling layer
- `Flatten` - Flatten multi-dimensional data
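The layer types above compose into convolutional models as well. A sketch, assuming `Static` accepts a dimension tuple and that the `Conv`/`MaxPool` constructor arguments shown here (kernel size, channel count, pool window) are in this order; treat them as assumptions:

```julia
using SimpleNNs

# Sketch of a small convolutional classifier; argument order is an assumption
model = chain(
    Static((28, 28, 1)),               # input: 28x28 single-channel images
    Conv((3, 3), 8, activation=relu),  # 3x3 kernel, 8 output channels
    MaxPool((2, 2)),                   # halve each spatial dimension
    Flatten(),                         # collapse feature maps to a vector
    Dense(10),                         # 10-class output logits
)
```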
Forward Pass
- `preallocate` - Create forward pass cache
- `set_inputs!` - Set input data
- `forward!` - Execute forward pass
- `get_outputs` - Extract model outputs
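The four functions above form one workflow: allocate a cache once, then reuse it across batches. A sketch, assuming `model` was built with `chain` and that `preallocate` takes a batch size; the argument orders are assumptions:

```julia
# Sketch of the forward-pass workflow; call forms are assumptions
batch_size = 32
inputs = rand(Float32, 2, batch_size)   # placeholder data matching a Static(2) input

cache = preallocate(model, batch_size)  # reusable forward-pass buffers
set_inputs!(cache, inputs)              # load this batch into the cache
forward!(cache, model)                  # run the network in place
outputs = get_outputs(cache)            # read the final layer's activations
```

Preallocating the cache avoids per-batch allocation, which is the apparent motivation for splitting the pass into these four steps.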
Backward Pass
- `preallocate_grads` - Create gradient cache
- `backprop!` - Execute backward pass
- `gradients` - Access computed gradients
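The backward pass mirrors the forward pass: allocate a gradient cache once, then backpropagate through the activations stored by `forward!`. A sketch with assumed argument orders:

```julia
# Sketch of the backward-pass workflow; argument order is an assumption
grads_cache = preallocate_grads(model, batch_size)
backprop!(grads_cache, model, cache)   # uses activations recorded by forward!
gs = gradients(grads_cache)            # gradients corresponding to parameters(model)
```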
Loss Functions
- `MSELoss` - Mean squared error loss
- `LogitCrossEntropyLoss` - Cross-entropy loss for classification
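The function index also lists `add_loss`, `get_loss`, `has_loss`, and `remove_loss`, which suggests a loss is attached to the model before backpropagation. A sketch of that idea; the signature of `add_loss` and whether it mutates or returns a new model are assumptions:

```julia
# Sketch: attach a classification loss so the backward pass can be seeded
# (add_loss appears in the function index; its exact signature is an assumption)
model_with_loss = add_loss(model, LogitCrossEntropyLoss())
has_loss(model_with_loss)   # check whether a loss is attached
```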
Activation Functions
- `relu` - Rectified linear unit activation
- `sigmoid` - Logistic sigmoid activation
- `tanh_fast` - Fast hyperbolic tangent activation
GPU Support
- `gpu` - Move models/data to the GPU
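A sketch of GPU usage, assuming `gpu` returns a device-resident copy and that caches must be allocated from the moved model; both points are assumptions:

```julia
# Sketch: move the model to the GPU, then allocate caches from the GPU model
gpu_model = gpu(model)
gpu_cache = preallocate(gpu_model, batch_size)
```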
Function Index
- `SimpleNNs.AdamOptimiser`
- `SimpleNNs.Conv`
- `SimpleNNs.Dense`
- `SimpleNNs.Flatten`
- `SimpleNNs.GlorotNormal`
- `SimpleNNs.GlorotUniform`
- `SimpleNNs.HeNormal`
- `SimpleNNs.HeUniform`
- `SimpleNNs.Initialiser`
- `SimpleNNs.LeCunNormal`
- `SimpleNNs.LogitCrossEntropyLoss`
- `SimpleNNs.MSELoss`
- `SimpleNNs.MaxPool`
- `SimpleNNs.RMSPropOptimiser`
- `SimpleNNs.SGDOptimiser`
- `SimpleNNs.Static`
- `SimpleNNs.Zeros`
- `Base.deepcopy`
- `SimpleNNs.activation_gradient_fn`
- `SimpleNNs.add_loss`
- `SimpleNNs.backprop!`
- `SimpleNNs.chain`
- `SimpleNNs.forward!`
- `SimpleNNs.get_loss`
- `SimpleNNs.get_outputs`
- `SimpleNNs.get_predictions`
- `SimpleNNs.gpu`
- `SimpleNNs.gradients`
- `SimpleNNs.has_loss`
- `SimpleNNs.initialise!`
- `SimpleNNs.parameters`
- `SimpleNNs.preallocate`
- `SimpleNNs.preallocate_grads`
- `SimpleNNs.pullback!`
- `SimpleNNs.relu`
- `SimpleNNs.remove_loss`
- `SimpleNNs.reset!`
- `SimpleNNs.set_inputs!`
- `SimpleNNs.sigmoid`
- `SimpleNNs.tanh_fast`
- `SimpleNNs.update!`