Math Of Neural Networks

A deep dive into the mathematics of deep learning with a focus on Image Processing.

Article 1: The Rosenblatt Perceptron

The first article discusses the basic math involved in the Rosenblatt perceptron. This is relatively simple math: mostly basic vector operations. The article explains what a vector is, walks through the main calculations used in the Rosenblatt perceptron, and finishes by discussing the main drawbacks of this type of perceptron, thus laying the groundwork for the next article on the ADALINE perceptron.
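
As a taste of the kind of vector math the first article covers, here is a minimal sketch (not code from the article) of the Rosenblatt decision rule: take the dot product of a weight vector with the input, add a bias, and apply a step function. The weights and bias below are made-up example values.

```python
import numpy as np

def perceptron_predict(w, b, x):
    """Return 1 if w . x + b > 0, else 0 (Heaviside step activation)."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Hypothetical weights and bias, just to show the calculation.
w = np.array([0.5, -0.3])
b = 0.1
print(perceptron_predict(w, b, np.array([1.0, 2.0])))  # -> 0
```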

Article 2: The ADALINE Perceptron

Discusses how the limitations of the Rosenblatt perceptron were resolved, and finishes with the limitations that still remain when using a single perceptron.
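
For illustration only, and assuming the standard delta rule rather than anything specific from the article: ADALINE learns on the raw weighted sum with gradient descent on a squared error, instead of updating only on hard misclassifications as the Rosenblatt rule does.

```python
import numpy as np

def adaline_step(w, b, x, target, lr=0.01):
    """One delta-rule update: nudge w and b along the error gradient."""
    error = target - (np.dot(w, x) + b)  # error on the linear output
    w = w + lr * error * x
    b = b + lr * error
    return w, b
```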

Article 3: Neural Networks

Discusses how the limitations of a single perceptron are resolved, and what new limitations emerge when using multiple perceptrons.
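
A minimal sketch of the "multiple perceptrons" idea, under my own assumptions rather than the article's notation: stack two layers of weights with a nonlinearity in between to get a one-hidden-layer network.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer network."""
    hidden = sigmoid(W1 @ x + b1)      # hidden layer activations
    return sigmoid(W2 @ hidden + b2)   # network output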

Article 4: Convolutional Neural Networks

Builds on the previous articles to discuss what can still be improved with respect to image processing.
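
As a rough illustration of the operation that gives convolutional networks their name (a generic sketch, not the article's code): slide a small kernel over an image and take weighted sums of local patches.

```python
import numpy as np

def convolve2d(image, kernel):
    """'Valid' 2D cross-correlation of a grayscale image with a small kernel."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```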

Article 5: ...

We'll see by then; right now I don't have any concrete plans for a fifth article.
