back_prop_gradient_descent

Implementations of the back-propagation algorithm (BPA) and gradient descent from scratch in Python.

'gradient_descent_single_perceptron.ipynb'

It contains the code for my Medium article titled Perceptron as a Function Approximator. Here, I have used a single perceptron and performed gradient descent on it. Please refer to the article for further explanation.
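As a minimal sketch of the idea (the notebook has the full version): a single perceptron's weight and bias are updated by gradient descent on a mean-squared-error loss. The sigmoid activation, learning rate, and toy target function here are assumptions for illustration, not necessarily what the notebook uses.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: approximate y = x^2 on [0, 1] with one perceptron (assumed example)
x = np.linspace(0, 1, 50)
y = x ** 2

w, b = 0.1, 0.0   # initial weight and bias
lr = 0.5          # learning rate (assumed value)

for epoch in range(5000):
    y_hat = sigmoid(w * x + b)
    err = y_hat - y
    # Gradient of mean squared error through the sigmoid
    dz = err * y_hat * (1 - y_hat)
    w -= lr * np.mean(dz * x)
    b -= lr * np.mean(dz)

print("final MSE:", np.mean((sigmoid(w * x + b) - y) ** 2))
```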

'back_prop.ipynb'

It contains code implementing back-propagation through a multi-layer perceptron with a single hidden layer. I have implemented everything from scratch, and the function also takes the number of perceptrons in the hidden layer as an input; a sketch of the idea follows below.
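In outline, the forward and backward passes look something like this sketch. It assumes sigmoid hidden units, a linear output unit, and a mean-squared-error loss; the function name back_propagation and its signature here are illustrative and may differ from the notebook's Back_Propagation function.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def back_propagation(x, y, n_hidden, lr=0.1, epochs=10000, seed=0):
    """Train a 1-input MLP with one hidden layer of n_hidden perceptrons."""
    rng = np.random.default_rng(seed)
    x = x.reshape(-1, 1)                     # (n_samples, 1)
    y = y.reshape(-1, 1)
    W1 = rng.normal(scale=0.5, size=(1, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = np.zeros(1)

    for _ in range(epochs):
        # Forward pass
        h = sigmoid(x @ W1 + b1)             # hidden activations
        y_hat = h @ W2 + b2                  # linear output

        # Backward pass: gradients of mean squared error
        d_out = 2 * (y_hat - y) / len(x)     # dL/dy_hat
        dW2 = h.T @ d_out
        db2 = d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * h * (1 - h)   # back through the sigmoid
        dW1 = x.T @ d_h
        db1 = d_h.sum(axis=0)

        # Gradient-descent update
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    return W1, b1, W2, b2

def predict(x, params):
    W1, b1, W2, b2 = params
    return sigmoid(x.reshape(-1, 1) @ W1 + b1) @ W2 + b2
```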

I have taken different mathematical functions and performed regression on them through the Back_Propagation function, both to check its functionality and to see how the accuracy varies as the number of perceptrons in the hidden layer increases.
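An accuracy check along those lines could look like the following, reusing the back_propagation and predict sketches above; the target function, hidden-layer sizes, and hyperparameters are placeholders, so the exact errors will differ from the notebook's.

```python
import numpy as np

# Fit y = sin(x) and compare the error as the hidden layer grows
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x)

for n_hidden in (1, 2, 5, 10):
    params = back_propagation(x, y, n_hidden)
    mse = np.mean((predict(x, params) - y.reshape(-1, 1)) ** 2)
    print(f"{n_hidden:2d} hidden perceptrons -> MSE {mse:.4f}")
```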
