This is a program for a general feedforward neural network, intended for educational purposes. It is simple and short, so a reader can quickly get into the details of how a neural network can be implemented. NeuralNet2.ipynb contains the code for the neural network, the remaining .ipynb files are example applications, and the .pkl files hold the associated data. This code accompanies a set of tutorials on neural networks, including a walkthrough of NeuralNet2.ipynb, available at https://learningmachinelearning.org/
NeuralNet2.ipynb is a vectorized implementation of a general feedforward neural network in Python.
- Four activation functions: sigmoid, softmax, ReLU, and leaky ReLU
- Three parameter initialization schemes
- Three cost functions: quadratic, cross-entropy, and log-likelihood
- Training by stochastic gradient descent with a configurable mini-batch size
- Optional L2 weight regularization
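To give a sense of how these pieces fit together, here is a minimal sketch (not the repository's code; all names are illustrative) of a vectorized two-layer network with sigmoid activations, the quadratic cost, mini-batch SGD, and L2 weight decay:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quadratic_cost(A, Y):
    # 0.5 * mean squared error, i.e. the quadratic cost
    return 0.5 * np.mean(np.sum((A - Y) ** 2, axis=1))

def forward(X, W1, b1, W2, b2):
    A1 = sigmoid(X @ W1 + b1)   # hidden activations
    A2 = sigmoid(A1 @ W2 + b2)  # output activations
    return A1, A2

def train(X, Y, hidden=8, epochs=2000, batch=4, lr=2.0, lam=1e-4):
    n, d = X.shape
    k = Y.shape[1]
    # Gaussian init scaled by 1/sqrt(fan-in), one common initialization scheme
    W1 = rng.normal(0.0, 1.0 / np.sqrt(d), (d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 1.0 / np.sqrt(hidden), (hidden, k))
    b2 = np.zeros(k)
    costs = []
    for _ in range(epochs):
        idx = rng.permutation(n)
        for s in range(0, n, batch):         # mini-batch SGD
            Xb, Yb = X[idx[s:s + batch]], Y[idx[s:s + batch]]
            A1, A2 = forward(Xb, W1, b1, W2, b2)
            # backprop for sigmoid + quadratic cost: delta = (A - Y) * sigma'(z)
            d2 = (A2 - Yb) * A2 * (1.0 - A2)
            d1 = (d2 @ W2.T) * A1 * (1.0 - A1)
            m = len(Xb)
            # the (1 - lr*lam) factor is the L2 weight-decay shrinkage
            W2 = (1.0 - lr * lam) * W2 - lr * (A1.T @ d2) / m
            b2 = b2 - lr * d2.mean(axis=0)
            W1 = (1.0 - lr * lam) * W1 - lr * (Xb.T @ d1) / m
            b1 = b1 - lr * d1.mean(axis=0)
        costs.append(quadratic_cost(forward(X, W1, b1, W2, b2)[1], Y))
    return (W1, b1, W2, b2), costs

# XOR: a tiny problem that a network with no hidden layer cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
params, costs = train(X, Y)
pred = forward(X, *params)[1]
```

The notebook generalizes this to arbitrary layer counts and the other activation and cost functions listed above.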
Two example applications are provided; they demonstrate the effect of different parameter settings and network architectures on performance.
- Example1: Two-class classification: separating a ball from a donut
- Example2: Handwritten digit classification on MNIST
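The ball-and-donut data in Example1 can be approximated with a snippet like the following (an illustrative sketch, not the contents of the repository's .pkl files): one class fills a central disc, the other a surrounding ring, so no straight line can separate them and a hidden layer is needed.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_ball_and_donut(n_per_class=200):
    # Class 0: the "ball" - points uniform over a disc of radius 1
    r_ball = np.sqrt(rng.uniform(0.0, 1.0, n_per_class))  # sqrt gives uniform area density
    t_ball = rng.uniform(0.0, 2.0 * np.pi, n_per_class)
    ball = np.column_stack([r_ball * np.cos(t_ball), r_ball * np.sin(t_ball)])
    # Class 1: the "donut" - points in an annulus between radii 2 and 3
    r_donut = rng.uniform(2.0, 3.0, n_per_class)
    t_donut = rng.uniform(0.0, 2.0 * np.pi, n_per_class)
    donut = np.column_stack([r_donut * np.cos(t_donut), r_donut * np.sin(t_donut)])
    X = np.vstack([ball, donut])
    y = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)])
    return X, y

X, y = make_ball_and_donut()
```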