# Neural Network Framework

A simple neural network framework for educational purposes.

- Features
- Class diagram
- Installation
- Usage example (Red wine quality prediction)
- Component Explanation
This library was a student project for learning and implementing the math behind deep learning. It provides everything needed to build and train a fully connected neural network on arbitrary datasets. The library is written in a modular way, so that it can easily be extended with new components.
## Features

- Fully connected layers
- Dropout layers
- Multiple Loss and Activation functions
- Optimizers (SGD, Momentum)
- PCA
- Evaluation metrics (Accuracy, Precision, Recall, F1, confusion matrix)
- Callback functions for validation score
- ...
## Installation

As a project:

```shell
pip install -r requirements.txt
```

You can also install it as a package:

```shell
pip install git+https://github.com/strasserpatrick/neural_network_framework.git
```

## Usage example (Red wine quality prediction)

```shell
python examples/winequality-binary-classification.py
```
## Component Explanation

Helper class that performs training and evaluation given a model, a dataset, and an optimizer. Additionally, it can perform sanity checks on the model. Model hyperparameters such as the learning rate and the number of epochs are set here.
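The library's actual trainer interface isn't shown here, so the following is only a sketch of the idea: a hypothetical `Trainer` that couples a model, a dataset, and an optimizer, with hyperparameters fixed at construction time. All class and method names (`ToyModel`, `loss_and_grads`, `fit`) are illustrative assumptions, not this library's API.

```python
class ToyModel:
    """Hypothetical 1-D linear model y = w * x, just to exercise the loop."""
    def __init__(self):
        self.params = {"w": 0.0}

    def loss_and_grads(self, x, y):
        err = self.params["w"] * x - y
        return err * err, {"w": 2 * err * x}  # squared error and its gradient

class SGD:
    """Plain stochastic gradient descent: w <- w - lr * grad."""
    def __init__(self, lr=0.1):
        self.lr = lr

    def step(self, params, grads):
        for k in params:
            params[k] -= self.lr * grads[k]

class Trainer:
    """Ties model, dataset and optimizer together; hyperparameters
    such as the number of epochs live here."""
    def __init__(self, model, optimizer, epochs=10):
        self.model, self.optimizer, self.epochs = model, optimizer, epochs

    def fit(self, dataset):
        history = []
        for _ in range(self.epochs):
            epoch_loss = 0.0
            for x, y in dataset:
                loss, grads = self.model.loss_and_grads(x, y)
                self.optimizer.step(self.model.params, grads)
                epoch_loss += loss
            history.append(epoch_loss / len(dataset))  # mean loss per epoch
        return history
```

Training the toy model on the pairs `[(1.0, 2.0), (2.0, 4.0)]` drives `w` toward 2, which is the kind of end-to-end check such a helper enables.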
Abstraction for wrapping arbitrary data formats into a common dataset class for uniform handling. NumpyDataset is a concrete implementation for numpy arrays; CSVDataset does the same for pandas dataframes.
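The pattern could look like the sketch below: an abstract base class fixes the interface, and concrete subclasses adapt a storage format to it. Only the names `NumpyDataset` and `CSVDataset` come from the text above; the interface and the toy `ListDataset` are assumptions for illustration.

```python
from abc import ABC, abstractmethod

class Dataset(ABC):
    """Common interface so heterogeneous data formats can be handled uniformly."""
    @abstractmethod
    def __len__(self): ...

    @abstractmethod
    def __getitem__(self, idx): ...  # -> (features, label)

class ListDataset(Dataset):
    """Toy concrete dataset over Python lists; NumpyDataset / CSVDataset
    would wrap numpy arrays / pandas dataframes in the same way."""
    def __init__(self, xs, ys):
        assert len(xs) == len(ys)
        self.xs, self.ys = xs, ys

    def __len__(self):
        return len(self.xs)

    def __getitem__(self, idx):
        return self.xs[idx], self.ys[idx]
```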
Wrapper around a dataset that handles indexing, shuffling, and other data-loading related tasks.
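A minimal sketch of such a wrapper (the batching behavior and parameter names are assumptions, not the library's API): it draws indices, optionally shuffles them, and yields mini-batches from the underlying dataset.

```python
import random

class DataLoader:
    """Yields (optionally shuffled) mini-batches by indexing into a dataset."""
    def __init__(self, dataset, batch_size=2, shuffle=True, seed=None):
        self.dataset, self.batch_size, self.shuffle = dataset, batch_size, shuffle
        self.rng = random.Random(seed)

    def __iter__(self):
        order = list(range(len(self.dataset)))
        if self.shuffle:
            self.rng.shuffle(order)  # new sample order every epoch
        for i in range(0, len(order), self.batch_size):
            batch = [self.dataset[j] for j in order[i:i + self.batch_size]]
            xs, ys = zip(*batch)  # split (x, y) pairs into two batch lists
            yield list(xs), list(ys)
```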
Collection of different optimizers for the backpropagation algorithm. Currently, only SGD and SGD with momentum are implemented. Again, an abstraction is used to make it easy to extend the library with new optimizers.
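The momentum variant keeps a per-parameter velocity and applies the classical update v ← μ·v − lr·g, w ← w + v. The sketch below illustrates that update rule; the class and method names are hypothetical, not this library's API.

```python
class MomentumSGD:
    """SGD with classical momentum: v <- mu * v - lr * g, then w <- w + v."""
    def __init__(self, lr=0.01, momentum=0.9):
        self.lr, self.momentum = lr, momentum
        self.velocity = {}  # one velocity per parameter name

    def step(self, params, grads):
        for k in params:
            v = self.momentum * self.velocity.get(k, 0.0) - self.lr * grads[k]
            self.velocity[k] = v
            params[k] += v
```

With momentum = 0 this reduces to plain SGD, which is one reason the two are often implemented behind a single optimizer abstraction.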
A collection of activation and loss functions, such as Sigmoid, ReLU, Softmax, MSE, and CrossEntropy.
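For backpropagation, each such function needs both a forward value and a derivative. A scalar sketch of a few of them (the function names here are generic, not necessarily the library's):

```python
import math

def sigmoid(x):
    """Logistic function, squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    """Derivative expressed through the activation itself: s * (1 - s)."""
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    """Rectified linear unit: passes positives, zeroes negatives."""
    return max(0.0, x)

def mse(pred, target):
    """Mean squared error over a pair of equal-length sequences."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)
```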
A collection of evaluation metrics for the model, covering the commonly used ones: Accuracy, Precision, Recall, F1 score, and the Confusion Matrix.
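All of these metrics derive from the entries of the confusion matrix. A self-contained binary-classification sketch (the function name and return format are illustrative assumptions):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall and F1 from the 2x2 confusion matrix."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)  # harmonic mean of P and R
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```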
A collection of data transforms, such as one-hot encoding, minority sampling, MinMaxScaler, and Standardizer.
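Two of these transforms are simple enough to sketch in a few lines each (function names and signatures are assumptions for illustration):

```python
def min_max_scale(values):
    """MinMax scaling: map each value to [0, 1] via (x - min) / (max - min)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def one_hot(label, num_classes):
    """Encode an integer class label as a one-hot vector."""
    vec = [0] * num_classes
    vec[label] = 1
    return vec
```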
A class for performing PCA on the dataset. This is used for dimensionality reduction. The number of components can be set as a hyperparameter. Also, the explained variance ratio can be plotted.
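The computation behind such a class can be sketched as PCA via an SVD of the centered data, which also yields the explained-variance ratios mentioned above. This is a generic sketch (function name, signature, and samples-as-rows convention are assumptions), not the library's implementation.

```python
import numpy as np

def pca(X, n_components):
    """Project X (samples as rows) onto its top principal directions and
    report the explained-variance ratio of each kept component."""
    Xc = X - X.mean(axis=0)                       # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    variance = S ** 2 / (len(X) - 1)              # per-component variance
    ratio = variance / variance.sum()
    return Xc @ Vt[:n_components].T, ratio[:n_components]
```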
An abstraction for the layers that make up a model. Concrete implementations are provided for fully connected layers and dropout layers.
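A layer typically exposes a forward pass (caching what backward needs) and a backward pass that returns the gradient with respect to its input. The sketch below shows both layer kinds under assumed names (`Dense`, `Dropout`) and an assumed interface; it is not this library's code.

```python
import numpy as np

class Dense:
    """Fully connected layer: forward caches the input so backward can
    compute weight/bias gradients and the gradient for the layer below."""
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T  # gradient w.r.t. the layer input

class Dropout:
    """Inverted dropout: zero random activations at train time and rescale
    the rest so the expected activation is unchanged at test time."""
    def __init__(self, p=0.5, seed=0):
        self.p = p
        self.rng = np.random.default_rng(seed)

    def forward(self, x, training=True):
        if not training:
            return x
        self.mask = (self.rng.random(x.shape) >= self.p) / (1.0 - self.p)
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask  # gradient flows only through kept units
```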
The Sequential model holds a sequence of layers: the output of each layer is the input of the next. The model can be trained and evaluated on a dataset. Additionally, sanity checks such as gradient checking and overfitting checks can be performed on the model.
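Gradient checking, one of the sanity checks mentioned above, compares an analytic gradient against a central finite-difference estimate. A generic sketch of that check, independent of this library's actual implementation:

```python
import numpy as np

def gradient_check(f, grad_f, x, eps=1e-6):
    """Return the max absolute difference between the analytic gradient
    grad_f(x) and a central finite-difference estimate of f's gradient."""
    numeric = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step.flat[i] = eps
        # central difference: (f(x + eps*e_i) - f(x - eps*e_i)) / (2*eps)
        numeric.flat[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return np.max(np.abs(numeric - grad_f(x)))
```

A small result here (typically below ~1e-5) indicates the backward pass matches the forward computation.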