BabyTorch is a lightweight, educational deep learning framework inspired by PyTorch. It offers an API similar to PyTorch's, backed by a minimal implementation that is easy to understand and modify, making it an excellent tool for learning and experimenting with deep learning concepts.
To install BabyTorch, follow these steps:

- Clone the BabyTorch repository: `git clone https://github.com/amjadmajid/BabyTorch.git`
- Navigate to the BabyTorch directory: `cd BabyTorch`
- For a regular installation, run: `python setup.py install`
- For developing BabyTorch itself, install it in editable mode: `pip install -e . --user`
BabyTorch includes the following modules, mirroring the structure of PyTorch:

- `datasets`: Data loading utilities and predefined datasets like MNIST.
- `engine`: Core engine for operations and tensor manipulations.
- `nn`: Neural network layers and loss functions.
- `optim`: Optimization algorithms and learning rate schedulers.
- `visualization`: Tools for visualizing data and model graphs.
Below are some examples of how to use BabyTorch, which also serve as basic tests:

- Classification Examples
- Regression Examples
- Tensor Tests
- Visualization Tests
  - Graph visualization: `tests/visualization_tests/grapher_test.py`
These tests provide practical examples of implementing and using various components of BabyTorch.
We welcome contributions to BabyTorch, which is still in an early stage of development. If you're interested in improving BabyTorch or adding new features, please check `TODO.md` for upcoming tasks or propose your own ideas.
The BabyTorch framework is designed with simplicity and educational value in mind. Here’s an overview of its main components:
- Datasets (`datasets`)
  - Handles data loading and preprocessing.
  - Includes implementations for standard datasets like MNIST.
  - Provides a foundation for custom dataset integration.
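To make the dataset pattern concrete, here is a generic, self-contained sketch of an in-memory dataset with simple batching. The class and function names are hypothetical illustrations of the pattern, not BabyTorch's actual API.

```python
# Illustrative sketch of the dataset pattern (hypothetical names,
# not BabyTorch's actual classes).

class InMemoryDataset:
    """Holds (input, label) pairs and exposes indexed access."""
    def __init__(self, inputs, labels):
        assert len(inputs) == len(labels)
        self.inputs, self.labels = inputs, labels

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        return self.inputs[idx], self.labels[idx]

def batches(dataset, batch_size):
    """Yield the dataset in fixed-size batches (last batch may be smaller)."""
    for start in range(0, len(dataset), batch_size):
        end = min(start + batch_size, len(dataset))
        yield [dataset[i] for i in range(start, end)]

data = InMemoryDataset([1, 2, 3, 4, 5], [1, 0, 1, 0, 1])
print([len(b) for b in batches(data, 2)])  # → [2, 2, 1]
```

A custom dataset only needs to provide `__len__` and `__getitem__`; the batching helper then works for any such object.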
- Engine (`engine`)
  - The backbone of the framework, handling core operations.
  - It contains two files: `operations.py` and `tensor.py`. `operations.py` implements the underlying computation engine; for each operation, both the forward and backward passes are implemented. `tensor.py` implements the tensor data structure and its operations. For any operation, the corresponding function in `operations.py` is called, but the result is stored in a tensor. In this way, computation and data are kept separate.
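The forward/backward separation described above can be sketched in miniature. This is not BabyTorch's actual code; it is a hypothetical scalar-valued illustration where free functions do the computation and a small wrapper class stores the data and gradient.

```python
# Hypothetical sketch of separating operations (forward/backward functions)
# from the data container, in the spirit of operations.py and tensor.py.

def add_forward(a, b):
    return a + b

def add_backward(grad_out):
    # d(a+b)/da = 1 and d(a+b)/db = 1, so the gradient passes through.
    return grad_out, grad_out

def mul_forward(a, b):
    return a * b

def mul_backward(grad_out, a, b):
    # d(a*b)/da = b and d(a*b)/db = a.
    return grad_out * b, grad_out * a

class Scalar:
    """Wraps a value and records how it was produced, so gradients can
    flow back through the recorded operations."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = None

    def __add__(self, other):
        out = Scalar(add_forward(self.data, other.data), (self, other))
        out._backward_fn = lambda g: add_backward(g)
        return out

    def __mul__(self, other):
        out = Scalar(mul_forward(self.data, other.data), (self, other))
        out._backward_fn = lambda g: mul_backward(g, self.data, other.data)
        return out

    def backward(self, grad=1.0):
        self.grad += grad
        if self._backward_fn:
            for parent, g in zip(self._parents, self._backward_fn(grad)):
                parent.backward(g)

x, y = Scalar(2.0), Scalar(3.0)
z = x * y + x                    # z = 2*3 + 2 = 8
z.backward()
print(z.data, x.grad, y.grad)    # dz/dx = y + 1 = 4, dz/dy = x = 2
```

Because gradients are additive, propagating each contribution separately through shared inputs (here, `x`) still accumulates the correct totals.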
- Neural Networks (`nn`)
  - Provides building blocks for neural networks.
  - Includes layers, activation functions, and loss functions.
  - Allows easy stacking and customization of layers to build various models.
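The layer-stacking idea can be illustrated with a minimal, self-contained sketch in plain Python. The `Linear`, `ReLU`, and `Sequential` names here follow PyTorch convention but are hypothetical stand-ins, not BabyTorch's actual classes.

```python
# Generic sketch of stacking layers (hypothetical names, plain Python lists).
import random

class Linear:
    """y = Wx + b for 1-D lists of floats."""
    def __init__(self, in_features, out_features):
        self.weight = [[random.uniform(-0.1, 0.1) for _ in range(in_features)]
                       for _ in range(out_features)]
        self.bias = [0.0] * out_features

    def __call__(self, x):
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(self.weight, self.bias)]

class ReLU:
    """Element-wise max(0, x) activation."""
    def __call__(self, x):
        return [max(0.0, v) for v in x]

class Sequential:
    """Applies layers in order, mirroring PyTorch-style model building."""
    def __init__(self, *layers):
        self.layers = layers

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

model = Sequential(Linear(3, 4), ReLU(), Linear(4, 2))
out = model([1.0, -2.0, 0.5])
print(len(out))  # → 2
```

Swapping layers in and out of `Sequential` is all it takes to try a different architecture, which is the main convenience the `nn` module aims for.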
- Optimizers (`optim`)
  - Implements optimization algorithms.
  - Contains optimizers like SGD.
  - Features learning rate schedulers for controlling the learning rate during training.
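As a concrete illustration of the optimizer-plus-scheduler pattern, here is a self-contained sketch of SGD with a step-decay schedule, minimizing a one-parameter quadratic. The class names and the parameter representation are hypothetical, not BabyTorch's actual API.

```python
# Hypothetical sketch of SGD and a step-decay LR scheduler.

class SGD:
    """Vanilla gradient descent: value -= lr * grad for each parameter."""
    def __init__(self, params, lr):
        self.params = params   # list of dicts: {"value": float, "grad": float}
        self.lr = lr

    def step(self):
        for p in self.params:
            p["value"] -= self.lr * p["grad"]

class StepLR:
    """Multiplies the learning rate by `gamma` every `step_size` epochs."""
    def __init__(self, optimizer, step_size, gamma):
        self.optimizer = optimizer
        self.step_size, self.gamma = step_size, gamma
        self.epoch = 0

    def step(self):
        self.epoch += 1
        if self.epoch % self.step_size == 0:
            self.optimizer.lr *= self.gamma

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
param = {"value": 0.0, "grad": 0.0}
opt = SGD([param], lr=0.1)
sched = StepLR(opt, step_size=20, gamma=0.5)
for _ in range(60):
    param["grad"] = 2.0 * (param["value"] - 3.0)
    opt.step()
    sched.step()
print(param["value"])  # converges toward the minimum at w = 3
```

The decaying learning rate takes large steps early and smaller, more careful steps near the minimum, which is the usual motivation for schedulers.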
- Visualization (`visualization`)
  - Tools for visualizing data, model architectures, and training progress.
  - Helps in understanding model behavior and performance.
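The core of graph visualization is walking the computation graph and rendering each node. BabyTorch's actual grapher may differ (see `tests/visualization_tests/grapher_test.py`); the sketch below is a hypothetical text-only renderer over nested tuples standing in for graph nodes.

```python
# Hypothetical sketch: render a computation graph as indented text.
# Nodes are (op, child, child, ...) tuples; leaves are plain values.

def render(node, indent=0):
    """Return the graph as a list of indented lines, one per node."""
    pad = "  " * indent
    if isinstance(node, tuple):
        op, *children = node
        lines = [f"{pad}{op}"]
        for child in children:
            lines.extend(render(child, indent + 1))
        return lines
    return [f"{pad}{node}"]

# Graph for (x * y) + b
graph = ("add", ("mul", "x", "y"), "b")
print("\n".join(render(graph)))
```

A graphical backend would follow the same traversal, emitting nodes and edges instead of indented lines.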
- The `tests` directory contains numerous tests and examples demonstrating the use of BabyTorch's components. It serves as both a testing suite and a source of practical examples for users.
- Designed to be easily extensible, allowing users to add new functionalities or modify existing ones.
- Encourages experimentation and exploration in deep learning.
.
├── README.md
├── TODO.md
├── babytorch
│ ├── __init__.py
│ ├── __pycache__
│ ├── datasets
│ ├── engine
│ ├── nn
│ ├── optim
│ └── visualization
├── images
│ └── babyTorchLogo.jpg
├── setup.py
├── tests
│ ├── tensor_operations
│ └── visualization
└── tutorials
├── classification
└── regression
This project is licensed under the MIT License.
Happy Learning! 🚀