This library provides a simple neural network implementation for the MNIST handwritten digits dataset, built on the simpleneuralnets library.
It currently does not install on Microsoft Windows due to compilation issues involving the random number generation library and wincrypt.h.
Download the binary from the releases tab, or fetch it directly following the instructions below (which might be out of date):
```sh
wget https://github.com/marcolussetti/simplemnistneuralnetwork/releases/download/v0.1.2/simplemnistneuralnet
chmod +x simplemnistneuralnet
./simplemnistneuralnet --help
```
Alternatively, to build from source, install Nim from https://github.com/dom96/choosenim#installation.
Install the libraries:
```sh
nimble install https://github.com/marcolussetti/mnist_tools
nimble install https://github.com/marcolussetti/simpleneuralnets
nimble install https://github.com/marcolussetti/simplemnistneuralnetwork
```
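If the simplemnistneuralnet command is not found afterwards, note that nimble typically places installed binaries in ~/.nimble/bin; a minimal sketch, assuming a default nimble setup:

```sh
# Assumes nimble's default binary directory (adjust if you use a custom nimbleDir)
export PATH="$HOME/.nimble/bin:$PATH"
simplemnistneuralnet --help
```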
Run the network:

```sh
simplemnistneuralnet --help
```
Help on the available options is provided by the --help parameter:
```
$ ./simplemnistneuralnet --help
Usage:
  main [optional-params]
Options(opt-arg sep :|=|spc):
  -h, --help                               write this help to stdout
  -w, --web_server      bool     false     Enable the web server component, see also port.
  -p=, --port=          int      8080      Set the port to run the web server on. If port 80 is
                                           used, remember to use sudo.
  -t=, --threshold=     float    0.0       The minimum output required to consider the number valid
                                           and not an 'Unknown' or exceptional case.
  -r=, --random_batch=  int      0         Randomly searches for ideal hyperparameters, ignores all
                                           further parameters. 0 disables it, otherwise set to the
                                           number of searches to perform.
  -l=, --learning_rate= float    0.5       Learning rate or alpha for the backpropagation.
  -e=, --epochs=        int      20        The number of epochs to train the network for.
  -a=, --activation=    string   "tanh"    The activation function, either 'tanh' for hyperbolic
                                           tangent or 'sigmoid' for logistic sigmoid.
  --hidden_layers=      ,SV[int] 10        The composition of the hidden layers. For instance, for
                                           3 hidden layers with 20, 10, and 5 neurons each, write
                                           '20,10,5'
```
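With no flags, the network presumably trains with the defaults listed above; a minimal sketch, assuming the defaults shown by --help:

```sh
# Presumably equivalent, per the defaults above, to:
# ./simplemnistneuralnet --learning-rate 0.5 --epochs 20 --hidden-layers=10 --activation tanh --threshold 0.0
./simplemnistneuralnet
```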
Some example invocations:

```sh
# Train with a single hidden layer of 20 neurons
./simplemnistneuralnet --learning-rate 0.5 --epochs 10 --hidden-layers=20

# Also start the web server component on port 8080
./simplemnistneuralnet -w --port 8080 --learning-rate 0.5 --epochs 10 --hidden-layers=20

# Two hidden layers (70 and 85 neurons) with the tanh activation
./simplemnistneuralnet --learning-rate 0.5 --epochs 53 --hidden-layers=70,85 --activation tanh

# Perform 5 random searches for ideal hyperparameters
./simplemnistneuralnet --random-batch 5

# Require a minimum output of 0.5 before a prediction is considered valid
./simplemnistneuralnet --learning-rate 0.5 --epochs 10 --hidden-layers=20 --threshold 0.5

# Web server, two hidden layers, tanh activation, and a 0.5 threshold
./simplemnistneuralnet -w --port 8080 --learning-rate 0.5 --epochs 55 --hidden-layers=70,85 --activation tanh --threshold 0.5
```
Remember that if you run the web server on port 80, you might need to prefix the command with sudo.
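For instance, a sketch of serving on port 80, using the same documented flags as above, just on a privileged port:

```sh
# Port 80 is privileged on most systems, hence sudo
sudo ./simplemnistneuralnet -w --port 80 --learning-rate 0.5 --epochs 10 --hidden-layers=20
```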
This project was used as the basis for a small presentation discussing the impact of hyperparameters and methods for choosing them.