Procedures

Each procedure is listed as name (module, type), followed by its description.

accuracy (mod_network, Function)
  Given input x and output y, evaluates the position of the maximum value of the output and returns the number of matches relative to the size of the dataset.
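
The accuracy metric described above compares the position of the maximum value in the predicted and expected outputs. Its logic can be sketched in Python (an illustrative sketch, not the library's Fortran API; the function name and argument layout are assumptions):

```python
def accuracy(outputs, targets):
    """Fraction of samples whose predicted class (position of the maximum
    output value) matches the target class (position of the maximum in y)."""
    def argmax(v):
        return max(range(len(v)), key=lambda i: v[i])
    matches = sum(1 for out, tgt in zip(outputs, targets)
                  if argmax(out) == argmax(tgt))
    return matches / len(outputs)
```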

activation_function (mod_activation, Interface)

array1d (mod_layer, Interface)

array1d_constructor (mod_layer, Function)
  Overloads the default type constructor.

array2d (mod_layer, Interface)

array2d_constructor (mod_layer, Function)
  Overloads the default type constructor.

backprop (mod_network, Subroutine)
  Applies backward propagation through the network and returns the weight and bias gradients.

constructor (mod_layer, Function)
  Layer class constructor. this_size is the number of neurons in the layer. next_size is the number of neurons in the next layer, used to allocate the weights.

db_co_sum (mod_layer, Subroutine)
  Performs a collective sum of bias tendencies.

db_init (mod_layer, Subroutine)
  Initialises the biases structure.

digits (mod_mnist, Function)
  Returns an array of 10 reals, with zeros everywhere and a one at the position corresponding to the input number, for example:
    digits(0) = [1., 0., 0., 0., 0., 0., 0., 0., 0., 0.]
    digits(1) = [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.]
    digits(6) = [0., 0., 0., 0., 0., 0., 1., 0., 0., 0.]
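
The one-hot encoding above is straightforward to sketch in Python (illustrative; the library routine itself is Fortran):

```python
def digits(n):
    """One-hot encode a digit 0-9 as an array of 10 reals:
    zeros everywhere and a one at position n."""
    out = [0.0] * 10
    out[n] = 1.0
    return out
```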

dw_co_sum (mod_layer, Subroutine)
  Performs a collective sum of weight tendencies.

dw_init (mod_layer, Subroutine)
  Initialises the weights structure.

fwdprop (mod_network, Subroutine)
  Performs the forward propagation and stores the arguments to the activation functions, and the activations themselves, for use in backprop.

gaussian (mod_activation, Function)
  Gaussian activation function.

gaussian_prime (mod_activation, Function)
  First derivative of the Gaussian activation function.

init (mod_network, Subroutine)
  Allocates and initializes the layers with the given dimensions dims.

label_digits (mod_mnist, Function)
  Converts an array of MNIST labels into a form that can be input to the network_type instance.

layer_type (mod_layer, Interface)

load (mod_network, Subroutine)
  Loads the network from a file.

load_mnist (mod_mnist, Subroutine)
  Loads the MNIST dataset into arrays.

loss (mod_network, Function)
  Given input x and expected output y, returns the loss of the network.
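
The description does not state which loss is used; a quadratic (sum-of-squares) cost is a common choice for networks of this kind, sketched here in Python as an assumption, not a statement of the library's actual loss:

```python
def loss(output, y):
    """Quadratic cost between the network output and the expected output y.
    (Assumed form; the library's actual loss function may differ.)"""
    return 0.5 * sum((o - t) ** 2 for o, t in zip(output, y))
```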

net_constructor (mod_network, Function)
  Network class constructor. The size of the input array dims indicates the total number of layers (input + hidden + output), and the value of each element corresponds to the size of that layer.

network_type (mod_network, Interface)

output_batch (mod_network, Function)
  Uses forward propagation to compute the output of the network. This specific procedure is for a batch of 1-d input data.

output_single (mod_network, Function)
  Uses forward propagation to compute the output of the network. This specific procedure is for a single sample of 1-d input data.

print_image (mod_mnist, Subroutine)
  Prints a single image and its label to screen.

randn (mod_random, Interface)

randn1d (mod_random, Function)
  Generates n random numbers with a normal distribution.

randn2d (mod_random, Function)
  Generates m x n random numbers with a normal distribution.
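
One common way to generate normally distributed numbers from a uniform generator is the Box-Muller transform; the sketch below illustrates the idea in Python and does not claim to match the library's implementation:

```python
import math
import random

def randn2d(m, n):
    """m x n samples from a standard normal distribution,
    generated with the Box-Muller transform."""
    def sample():
        u1, u2 = random.random(), random.random()
        # 1 - u1 is in (0, 1], so the log is always defined
        return math.sqrt(-2.0 * math.log(1.0 - u1)) * math.cos(2.0 * math.pi * u2)
    return [[sample() for _ in range(n)] for _ in range(m)]
```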

read_binary_file (mod_io, Interface)

read_binary_file_1d (mod_io, Subroutine)

read_binary_file_2d (mod_io, Subroutine)

relu (mod_activation, Function)
  Rectified Linear Unit (ReLU) activation function.

relu_prime (mod_activation, Function)
  First derivative of the Rectified Linear Unit (ReLU) activation function.

save (mod_network, Subroutine)
  Saves the network to a file.

set_activation (mod_layer, Subroutine)
  Sets the activation function. The input string must match one of the provided activation functions; otherwise it defaults to sigmoid.
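
The name-to-function dispatch with a sigmoid fallback can be sketched in Python (the set of recognized names here is illustrative):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative name table; the library's actual set of names may differ.
ACTIVATIONS = {
    'sigmoid': sigmoid,
    'tanh': math.tanh,
    'relu': lambda x: max(0.0, x),
}

def set_activation(name):
    """Return the activation function matching name;
    unrecognized names fall back to sigmoid."""
    return ACTIVATIONS.get(name, sigmoid)
```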

set_activation_equal (mod_network, Subroutine)
  A thin wrapper around layer % set_activation(). This method can be used to set an activation function for all layers at once.

set_activation_layers (mod_network, Subroutine)
  A thin wrapper around layer % set_activation(). This method can be used to set a different activation function for each layer separately.

sigmoid (mod_activation, Function)
  Sigmoid activation function.

sigmoid_prime (mod_activation, Function)
  First derivative of the sigmoid activation function.

step (mod_activation, Function)
  Step activation function.

step_prime (mod_activation, Function)
  First derivative of the step activation function.

sync (mod_network, Subroutine)
  Broadcasts network weights and biases from the specified image to all others.

tanh_prime (mod_activation, Function)
  First derivative of the tanh activation function.

tanhf (mod_activation, Function)
  Hyperbolic tangent activation function. Same as the intrinsic tanh, but must be defined here so that a procedure pointer can be used with it.

tile_indices (mod_parallel, Function)
  Given the global array size, returns the start and end indices of the parallel 1-d tile that corresponds to this image. Tiles are of equal size; if there is any remainder, it is distributed to the tiles at the end.
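
The tiling rule (equal tiles, remainder pushed to the tiles at the end) can be sketched in Python with 1-based image numbering, as in coarray Fortran; the function name and argument order are assumptions:

```python
def tile_indices(dims, image, num_images):
    """Start and end (1-based, inclusive) indices of the 1-d tile that
    belongs to `image`. Every tile gets dims // num_images elements;
    the remainder is distributed one each to the tiles at the end."""
    base = dims // num_images
    rem = dims % num_images
    # Number of earlier images that received an extra element
    extra_before = max(0, (image - 1) - (num_images - rem))
    start = (image - 1) * base + extra_before + 1
    size = base + (1 if image > num_images - rem else 0)
    return start, start + size - 1
```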

train_batch (mod_network, Subroutine)
  Trains the network using input data x, output data y, and learning rate eta. The learning rate is normalized by the size of the data batch.

train_epochs (mod_network, Subroutine)
  Trains for num_epochs epochs with mini-batches of size batch_size.

train_single (mod_network, Subroutine)
  Trains the network using a single set of input data x and output data y, and learning rate eta.

update (mod_network, Subroutine)
  Updates the network weights and biases with gradients dw and db, scaled by learning rate eta.
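
The update step amounts to a plain gradient-descent step; a minimal Python sketch for a single layer's weight matrix and bias vector (illustrative, not the library's Fortran API):

```python
def update(weights, biases, dw, db, eta):
    """Gradient-descent update: subtract the gradients dw and db,
    scaled by the learning rate eta."""
    new_weights = [[w - eta * g for w, g in zip(w_row, g_row)]
                   for w_row, g_row in zip(weights, dw)]
    new_biases = [b - eta * g for b, g in zip(biases, db)]
    return new_weights, new_biases
```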
