
MNIST-From-Scratch


A from-scratch implementation of a feedforward neural network (FFNN) that classifies handwritten digits from the MNIST dataset, using no machine learning libraries.

Graph: Test Accuracy vs Epoch for Different Learning Rates

Overview

This project was created to understand the mathematics behind neural networks, from the underlying theory through to a working implementation.

Key Features

  • Implementation of an FFNN and backpropagation from first principles
  • Experimentation with a range of learning rates
  • Visualization of test accuracy over epochs for each learning rate
  • Written purely in Python with NumPy and Matplotlib
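
A from-first-principles FFNN boils down to a forward pass, a backward pass via the chain rule, and a gradient-descent weight update. The sketch below illustrates that structure for a 784–30–10 network; the hidden-layer size, sigmoid activations, and squared-error loss here are illustrative assumptions, not necessarily this repository's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Network: 784 inputs (28x28 pixels) -> 30 hidden units -> 10 digit classes
W1 = rng.normal(0, 0.1, (30, 784)); b1 = np.zeros((30, 1))
W2 = rng.normal(0, 0.1, (10, 30));  b2 = np.zeros((10, 1))

def forward(x):
    a1 = sigmoid(W1 @ x + b1)   # hidden-layer activations
    a2 = sigmoid(W2 @ a1 + b2)  # output-layer activations (one per digit)
    return a1, a2

def backward(x, y, a1, a2):
    # Gradients of the loss 0.5 * ||a2 - y||^2, derived via the chain rule
    delta2 = (a2 - y) * a2 * (1 - a2)         # error signal at the output layer
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)  # error propagated back to the hidden layer
    return delta1 @ x.T, delta1, delta2 @ a1.T, delta2

def loss(a2, y):
    return 0.5 * np.sum((a2 - y) ** 2)

# Gradient descent on a single fake "digit" (random pixels, label 3)
x = rng.random((784, 1))
y = np.zeros((10, 1)); y[3] = 1.0
lr = 3.0  # learning rate (the hyperparameter the experiments vary)

a1, a2 = forward(x)
initial_loss = loss(a2, y)
for _ in range(100):
    a1, a2 = forward(x)
    dW1, db1, dW2, db2 = backward(x, y, a1, a2)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
_, a2 = forward(x)
final_loss = loss(a2, y)
```

On a real run the same loop iterates over mini-batches of MNIST images instead of one synthetic example.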

Results

  • Achieved ~90% test accuracy after training for 50 epochs
  • Demonstrates how sensitive convergence is to the choice of learning rate
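
That sensitivity is easy to reproduce on a toy problem. The sketch below (not the repository's code) runs plain gradient descent on f(w) = w², whose gradient is 2w: a small rate converges slowly, a moderate rate converges almost immediately, and too large a rate overshoots and diverges.

```python
# Toy illustration of learning-rate sensitivity in gradient descent.
def descend(lr, steps=25, w=1.0):
    """Run `steps` gradient-descent updates on f(w) = w^2 and return the final |w|."""
    for _ in range(steps):
        w -= lr * 2 * w  # update rule: w <- w - lr * f'(w)
    return abs(w)

slow = descend(0.01)     # too small: still far from the minimum at w = 0
fast = descend(0.5)      # well-chosen: reaches the minimum almost immediately
diverged = descend(1.1)  # too large: every step overshoots, so |w| grows
```

Each update multiplies w by (1 - 2·lr), so the iterates shrink only when that factor has magnitude below 1, i.e. for lr < 1 on this problem.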

Report

A detailed write-up of the mathematical derivations, experimental methodology, and results is available here:
Alan Smith – Investigating the Relationship Between Learning Rate and Gradient Descent Convergence (PDF)

