As of right now, this crate is far from a usable state, and there is still a long way to go before it becomes usable (if ever).
`deep_thought` implements feedforward neural networks in Rust. Unlike the vast majority of NN libraries, such as PyTorch or TensorFlow, it does not use backpropagation to calculate gradients. Instead, it uses dual numbers, which allow derivatives to be computed during the forward pass.
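To make the idea concrete, here is a minimal, self-contained sketch of forward-mode differentiation with dual numbers. This is not `deep_thought`'s actual API; the `Dual` type and its operators below are purely illustrative:

```rust
use std::ops::{Add, Mul};

// A dual number a + b*eps, where eps^2 = 0. The `eps` component
// carries the derivative alongside the value.
#[derive(Clone, Copy)]
struct Dual {
    real: f64, // the value f(x)
    eps: f64,  // the derivative f'(x)
}

impl Add for Dual {
    type Output = Dual;
    fn add(self, rhs: Dual) -> Dual {
        Dual { real: self.real + rhs.real, eps: self.eps + rhs.eps }
    }
}

impl Mul for Dual {
    type Output = Dual;
    fn mul(self, rhs: Dual) -> Dual {
        // The eps component follows the product rule automatically.
        Dual {
            real: self.real * rhs.real,
            eps: self.real * rhs.eps + self.eps * rhs.real,
        }
    }
}

fn main() {
    // Seed eps = 1 for the variable we differentiate with respect to.
    let x = Dual { real: 3.0, eps: 1.0 };
    // f(x) = x*x + x  =>  f(3) = 12, f'(3) = 2*3 + 1 = 7
    let f = x * x + x;
    println!("f(3) = {}, f'(3) = {}", f.real, f.eps);
}
```

Because the derivative rides along in the `eps` component, the gradient is available the moment the forward pass finishes, with no separate backward pass.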
For more information about dual numbers, please take a look at these excellent blog posts by @Atrix256:
- Dual Numbers & Automatic Differentiation
- Multivariable Dual Numbers & Automatic Differentiation
- Neural Network Gradients: Backpropagation, Dual Numbers, Finite Differences
If you want to learn more about the math behind neural networks, take a look at these links:
`deep_thought` makes use of the `negative_impls`, `auto_traits`, and `array_zip` features, which are not yet available on the stable release channel.
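Since these are unstable feature gates, the crate only builds on a nightly toolchain (e.g. via `rustup override set nightly`). For illustration, this is roughly what such gates look like at a crate root; the exact list lives in the crate's source:

```rust
// Illustrative sketch: unstable feature gates like these are
// what tie a crate to the nightly channel.
#![feature(negative_impls)]
#![feature(auto_traits)]
#![feature(array_zip)]
```

Downstream code shouldn't need these attributes itself; it only needs to be compiled with a nightly compiler.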