A neural network class supporting dense (fully connected) networks with ReLU activation functions, mean squared error loss, and backpropagation, implemented from scratch using only NumPy. Created as a test of my understanding. This is not intended to be a particularly fast, efficient, or extensible implementation.
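
The sketch below is not the repository's code; it is a minimal illustration, under the same constraint (NumPy only), of a dense network with ReLU hidden activations, a linear output, MSE loss, and plain gradient-descent backpropagation. The class name `DenseNet` and its `fit`/`predict` methods are hypothetical placeholders, not this project's API.

```python
import numpy as np

class DenseNet:
    """Illustrative dense network: ReLU hidden layers, linear output, MSE loss."""

    def __init__(self, layer_sizes, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        # He initialisation for each weight matrix, zero biases.
        self.weights = [rng.normal(0.0, np.sqrt(2.0 / m), size=(m, n))
                        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.biases = [np.zeros(n) for n in layer_sizes[1:]]
        self.lr = lr

    def predict(self, x):
        # Forward pass: ReLU on hidden layers, identity on the output layer.
        a = x
        for w, b in zip(self.weights[:-1], self.biases[:-1]):
            a = np.maximum(0.0, a @ w + b)
        return a @ self.weights[-1] + self.biases[-1]

    def fit(self, x, y, epochs=100):
        for _ in range(epochs):
            # Forward pass, caching pre-activations (zs) and activations.
            activations, zs = [x], []
            a = x
            for i, (w, b) in enumerate(zip(self.weights, self.biases)):
                z = a @ w + b
                zs.append(z)
                a = z if i == len(self.weights) - 1 else np.maximum(0.0, z)
                activations.append(a)

            # Backward pass: with MSE loss the output-layer delta is the
            # residual scaled by 2/N.
            n = x.shape[0]
            delta = 2.0 * (activations[-1] - y) / n
            for i in reversed(range(len(self.weights))):
                grad_w = activations[i].T @ delta
                grad_b = delta.sum(axis=0)
                if i > 0:
                    # Propagate the error through the previous layer's ReLU.
                    delta = (delta @ self.weights[i].T) * (zs[i - 1] > 0)
                self.weights[i] -= self.lr * grad_w
                self.biases[i] -= self.lr * grad_b


# Example usage: learn y = x1 + x2 from random data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.normal(size=(256, 2))
    y = x.sum(axis=1, keepdims=True)
    net = DenseNet([2, 16, 1], lr=0.05)
    net.fit(x, y, epochs=500)
    print("MSE:", float(np.mean((net.predict(x) - y) ** 2)))
```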