The main goal of this project is to build, from scratch and with as few external libraries and resources as possible, neural networks that are lightweight, powerful, and fully understandable.
For now, the project aims to provide an implementation of a multilayer perceptron (MLP) trained either with a genetic algorithm or with gradient descent.
The main project is written in C++ using only the standard library. v0.1 avoided dynamic containers entirely and was implemented with static programming and variadic templates. To simplify the code and make it more useful for larger projects, v0.2 abandons this approach and provides lighter implementations based on dynamic containers.
For now, the core of the project is being written. The genetic algorithm is implemented and fully operational; gradient descent is planned.