# Neural Networks

> If you can't build it, you don't understand it.

The goal of this repository is to build something like Google's TensorFlow Playground from scratch using NumPy. The focus is on the algorithms rather than on the UI.
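
To make the scope concrete, here is a minimal sketch of the kind of from-scratch code the repository is working toward: a tiny two-layer network with a ReLU activation, an L2 loss, and hand-derived gradients. All names here are illustrative, not this repository's actual API.

```python
import numpy as np

# Illustrative only: a 2-layer net (linear -> ReLU -> linear) with L2 loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))       # 64 samples, 2 features
y = rng.normal(size=(64, 1))       # regression targets

W1 = rng.normal(size=(2, 8)) * 0.1
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)) * 0.1
b2 = np.zeros(1)

# Forward pass
z1 = X @ W1 + b1
a1 = np.maximum(z1, 0)             # ReLU
y_hat = a1 @ W2 + b2
loss = np.mean((y_hat - y) ** 2)   # L2 (mean squared error) loss

# Backward pass: apply the chain rule by hand
d_y_hat = 2 * (y_hat - y) / len(X)
dW2 = a1.T @ d_y_hat
db2 = d_y_hat.sum(axis=0)
d_a1 = d_y_hat @ W2.T
d_z1 = d_a1 * (z1 > 0)             # derivative of ReLU
dW1 = X.T @ d_z1
db1 = d_z1.sum(axis=0)
```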


## TODO

- [x] Forward pass
- [x] Backward pass
- [ ] Activation functions
  - [x] ReLU
  - [ ] Sigmoid
  - [ ] Softmax
  - [ ] Tanh
- [ ] Loss/cost functions
  - [x] L1 loss
  - [x] L2 loss
  - [ ] Softmax loss
  - [ ] Cross-entropy loss
- [ ] Optimizers (see the SGD-with-momentum sketch after this list)
  - [ ] Stochastic gradient descent
  - [ ] Batch gradient descent
  - [ ] Minibatch gradient descent
  - [ ] Momentum
  - [ ] RMSProp
  - [ ] Adam
- [ ] Visualization
  - [ ] x: time, y: loss
  - [ ] 2D visualization of model, data & cost
  - [ ] 3D visualization of model, data & cost
- [ ] Convolutional neural networks
  - [ ] Convolutional layers
  - [ ] Pooling layers
- [ ] Regularization
  - [ ] L1 regularization
  - [ ] L2 regularization
- [ ] Weight initialization (see the Xavier sketch after this list)
  - [ ] Ones
  - [ ] Zeros
  - [ ] Random normal initialization
  - [ ] Xavier initialization
- [ ] Dropout
- [ ] Normalization (see the batch-norm sketch after this list)
  - [ ] Batch norm
  - [ ] Layer norm
  - [ ] Instance norm
  - [ ] Group norm
- [ ] Attention layer
- [ ] Add documentation
- [ ] Add tests
- [ ] Simplify the code by reimplementing the library with JAX instead of NumPy
- [ ] Implement traditional machine learning algorithms (regression, KNN, SVM, etc.)
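
For the optimizer items, the update rules are small enough to sketch directly. Below is a hedged example of minibatch SGD with classical momentum in NumPy; the function and parameter names are placeholders, not this repository's interface.

```python
import numpy as np

def sgd_momentum_step(params, grads, velocities, lr=0.01, beta=0.9):
    """One minibatch SGD step with classical momentum.

    params, grads, velocities are parallel lists of NumPy arrays.
    Update rule: v <- beta * v + g;  p <- p - lr * v
    """
    for p, g, v in zip(params, grads, velocities):
        v *= beta          # decay the running velocity in place
        v += g             # accumulate the current gradient
        p -= lr * v        # take a step against the velocity
```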
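
For the weight-initialization items, Xavier (Glorot) initialization scales the weight variance by the layer's fan-in and fan-out so activations keep a stable scale across layers. A minimal sketch, assuming the uniform Glorot variant:

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Glorot/Xavier uniform init: U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out))."""
    if rng is None:
        rng = np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))
```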
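
And for the normalization items, batch norm at training time normalizes each feature over the batch, then applies a learned scale and shift. A forward-only sketch (the gamma/beta parameters and epsilon value are illustrative):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Training-time batch norm over a (batch, features) array."""
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize each feature
    return gamma * x_hat + beta              # learned scale and shift
```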
