Implementing the back-propagation algorithm (BPA) and gradient descent from scratch in Python
This repository contains the code for my Medium article titled Perceptron as a Function Approximator. Here, I use a single perceptron and perform gradient descent on it. Please refer to the article for further explanation.
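As a rough illustration of the idea, the sketch below trains a single perceptron by gradient descent to approximate a 1-D function. The function name, activation, and hyperparameters are illustrative assumptions, not the exact code used in the article.

```python
import numpy as np

# Hypothetical sketch: one perceptron (weight w, bias b) fit with gradient
# descent on mean squared error; tanh is assumed as the activation.
def train_perceptron(x, y, lr=0.05, epochs=2000):
    w, b = np.random.randn(), 0.0
    for _ in range(epochs):
        y_hat = np.tanh(w * x + b)          # perceptron output
        error = y_hat - y
        grad = error * (1 - y_hat ** 2)     # dMSE/d(pre-activation), up to a constant
        w -= lr * np.mean(grad * x)         # gradient step on the weight
        b -= lr * np.mean(grad)             # gradient step on the bias
    return w, b
```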
It also contains code implementing back-propagation through a multi-layered perceptron with a single hidden layer. Everything is implemented from scratch, and the function takes the number of perceptrons in the hidden layer as an input; a minimal sketch of this setup follows below.
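The sketch below shows one way such a one-hidden-layer network and its backward pass can be written from scratch; the name `back_propagation`, the tanh activation, and the hyperparameters are assumptions for illustration and need not match the repository's `Back_Propagation` function exactly.

```python
import numpy as np

# Hypothetical sketch: MLP with one hidden layer of `n_hidden` perceptrons,
# trained with backpropagation and gradient descent on mean squared error.
def back_propagation(x, y, n_hidden, lr=0.05, epochs=5000):
    x = x.reshape(-1, 1)
    y = y.reshape(-1, 1)
    W1 = np.random.randn(1, n_hidden) * 0.5   # input -> hidden weights
    b1 = np.zeros((1, n_hidden))
    W2 = np.random.randn(n_hidden, 1) * 0.5   # hidden -> output weights
    b2 = np.zeros((1, 1))
    for _ in range(epochs):
        # forward pass
        h = np.tanh(x @ W1 + b1)              # hidden activations
        y_hat = h @ W2 + b2                   # linear output layer
        # backward pass
        d_out = (y_hat - y) / len(x)          # dMSE/d(output), up to a constant
        dW2 = h.T @ d_out
        db2 = d_out.sum(axis=0, keepdims=True)
        d_hid = (d_out @ W2.T) * (1 - h ** 2) # backprop through tanh
        dW1 = x.T @ d_hid
        db1 = d_hid.sum(axis=0, keepdims=True)
        # gradient descent updates
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2
```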
I have taken different mathematical functions and performed regression through the Back_Propagation function to check its functionality, and also to see how accuracy varies as the number of perceptrons in the hidden layer increases.
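An illustrative way to run such an experiment, assuming the `back_propagation` sketch above (the target functions, widths, and metric are placeholders, not the repository's actual experiment):

```python
# Fit a few target functions with different hidden-layer widths and compare MSE.
x = np.linspace(-1, 1, 200)
for target_fn in (np.sin, np.abs, np.square):
    y = target_fn(3 * x)
    for n_hidden in (2, 5, 20):
        W1, b1, W2, b2 = back_propagation(x, y, n_hidden)
        y_hat = np.tanh(x.reshape(-1, 1) @ W1 + b1) @ W2 + b2
        mse = np.mean((y_hat.ravel() - y) ** 2)
        print(f"{target_fn.__name__:>6} | hidden = {n_hidden:3d} | MSE = {mse:.5f}")
```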