Commit 799ad72

Update docs [WIP]
surajpaib committed Nov 27, 2023
1 parent 1af9ed1 commit 799ad72

Showing 37 changed files with 2,939 additions and 78 deletions.

Binary file added docs/assets/images/Lighter_demo.png
Binary file added docs/assets/images/PL_demo.png
63 changes: 0 additions & 63 deletions docs/basics/5-minute-example.md

This file was deleted.

194 changes: 193 additions & 1 deletion docs/basics/config.md
@@ -1 +1,193 @@
🚧 Under construction 🚧
# Configuration System

Lighter is a configuration-centric framework: the config is used to set up the entire machine learning workflow, from model architecture selection, loss function, optimizer, and dataset preparation to running the training/evaluation/inference process.

Our configuration system is heavily based on the MONAI bundle parser, but with a standardized structure. Every configuration is expected to define several mandatory items.

Let us take a simple example config to dig deeper into Lighter's configuration system. You can go through the config and click on the + signs for more information about specific concepts.

<div class="annotate" markdown>

```yaml title="cifar10.yaml"
trainer:
  _target_: pytorch_lightning.Trainer # (1)!
  max_epochs: 100 # (2)!

system:
  _target_: lighter.LighterSystem
  batch_size: 512

  model:
    _target_: torchvision.models.resnet18
    num_classes: 10

  criterion:
    _target_: torch.nn.CrossEntropyLoss

  optimizer:
    _target_: torch.optim.Adam
    params: "$@system#model.parameters()" # (3)!
    lr: 0.001

  datasets:
    train:
      _target_: torchvision.datasets.CIFAR10
      download: True
      root: .datasets
      train: True
      transform:
        _target_: torchvision.transforms.Compose
        transforms: # (4)!
          - _target_: torchvision.transforms.ToTensor
          - _target_: torchvision.transforms.Normalize
            mean: [0.5, 0.5, 0.5]
            std: [0.5, 0.5, 0.5]
```
</div>
1. `_target_` is a special reserved keyword that initializes a Python object from the provided text. In this case, a `Trainer` object from the `pytorch_lightning` library is initialized.
2. `max_epochs` is an argument of the `Trainer` class which is passed through this format. Any argument of the class can be passed similarly.
3. `$@` is a combination of `$`, which evaluates a Python expression, and `@`, which references a Python object. In this case we first reference the model with `@system#model`, which is the `torchvision.models.resnet18` defined earlier, and then access its parameters with `$@system#model.parameters()`.
4. YAML allows passing a list in the format below, where each `_target_` specifies a transform that is added to the list of transforms in `Compose`. `torchvision.datasets.CIFAR10` accepts these through its `transform` argument and applies them to each item.

## Configuration Concepts
As seen in the [Quickstart](./quickstart.md), Lighter has two main components:

### Trainer Setup
```yaml
trainer:
  _target_: pytorch_lightning.Trainer # (1)!
  max_epochs: 100
```
The trainer object (`pytorch_lightning.Trainer`) is initialized through the `_target_` key. For more info on `_target_` and special keys, click [here](#special-syntax-and-keywords)

The `max_epochs` is an argument provided to the `pytorch_lightning.Trainer` object during its instantiation. All arguments that are accepted during instantiation can be provided similarly.
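To build intuition for what `_target_` does, here is a minimal, stdlib-only sketch of dotted-path instantiation (an illustration of the idea only; Lighter's actual parsing is handled by the MONAI bundle parser):

```python
import importlib

def instantiate(config: dict):
    # Split "pkg.module.Class" into a module path and an attribute name,
    # import the module, and call the attribute with the remaining
    # config keys as keyword arguments.
    cfg = dict(config)
    module_path, _, attr_name = cfg.pop("_target_").rpartition(".")
    target = getattr(importlib.import_module(module_path), attr_name)
    return target(**cfg)

# Stand-in example with a stdlib class instead of pytorch_lightning.Trainer:
delta = instantiate({"_target_": "datetime.timedelta", "days": 2})
print(delta.days)  # 2
```

The same mechanism applies to every `_target_` in the config: the dotted path names the class, and its sibling keys become constructor arguments.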

### LighterSystem Configuration
While Lighter borrows the Trainer from PyTorch Lightning, LighterSystem is a custom component unique to Lighter that draws on several PL concepts, such as LightningModule, to provide a simple way to capture all the integral elements of a deep learning system.

Concepts encapsulated by LighterSystem include:

#### Model definition
The `torchvision` library is installed by default with Lighter, so you can choose from its models here. MONAI is also packaged with Lighter, so if you want to use one of its ResNets instead, all you need to modify in your config is:

=== "Torchvision ResNet18"

    ```yaml
    LighterSystem:
      ...
      model:
        _target_: torchvision.models.resnet18
        num_classes: 10
      ...
    ```

=== "MONAI ResNet50"

    ```yaml
    LighterSystem:
      ...
      model:
        _target_: monai.networks.nets.resnet50
        num_classes: 10
        spatial_dims: 2
      ...
    ```

=== "MONAI 3DResNet50"

    ```yaml
    LighterSystem:
      ...
      model:
        _target_: monai.networks.nets.resnet50
        num_classes: 10
        spatial_dims: 3
      ...
    ```

<br/>
#### Criterion/Loss

Similar to overriding models, you can easily switch between loss functions provided by libraries such as `torch` and `monai`. This flexibility lets you experiment with different approaches to optimizing your model's performance without changing any code. Below are some examples of how you can modify the criterion section of your configuration file to use different loss functions.

=== "CrossEntropyLoss"

    ```yaml
    LighterSystem:
      ...
      criterion:
        _target_: torch.nn.CrossEntropyLoss
      ...
    ```

=== "MONAI's Dice Loss"

    ```yaml
    LighterSystem:
      ...
      criterion:
        _target_: monai.losses.DiceLoss
      ...
    ```
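However it is configured, the criterion ends up being called on model outputs and targets inside the training step. A minimal sketch with `CrossEntropyLoss` (tensor shapes are illustrative):

```python
import torch

criterion = torch.nn.CrossEntropyLoss()
logits = torch.randn(4, 10)            # batch of 4 samples, 10 classes
targets = torch.tensor([0, 3, 9, 1])   # ground-truth class indices
loss = criterion(logits, targets)
print(loss.item())  # a non-negative scalar
```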

<br/>
#### Optimizer

Same as above, you can experiment with different optimizers and their parameters. The model's parameters are passed directly to the optimizer via the `params` argument.
```yaml hl_lines="5"
LighterSystem:
  ...
  optimizer:
    _target_: torch.optim.Adam
    params: "$@system#model.parameters()"
    lr: 0.001
  ...
```

You can also define a scheduler for the optimizer, as below:
```yaml hl_lines="10"
LighterSystem:
  ...
  optimizer:
    _target_: torch.optim.Adam
    params: "$@system#model.parameters()"
    lr: 0.001
  scheduler:
    _target_: torch.optim.lr_scheduler.CosineAnnealingLR
    optimizer: "@system#optimizer"
    eta_min: 1.0e-06
    T_max: "%trainer#max_epochs"
  ...
```
Here, the optimizer is passed to the scheduler through the `optimizer` argument. `%trainer#max_epochs` is also passed to the scheduler; it fetches `max_epochs` from the trainer section of the config.
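As a sketch, the optimizer/scheduler block above corresponds to the following plain PyTorch (with a small stand-in model in place of `resnet18`):

```python
import torch

model = torch.nn.Linear(8, 10)  # stand-in for the configured model
# `params: "$@system#model.parameters()"` resolves to this call:
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
# `%trainer#max_epochs` textually substitutes 100 into T_max:
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    optimizer, T_max=100, eta_min=1.0e-06
)

optimizer.step()   # one optimization step...
scheduler.step()   # ...then the scheduler anneals the learning rate
print(scheduler.get_last_lr())  # slightly below the initial 0.001
```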

<br/>
#### Datasets

```yaml
LighterSystem:
  ...
  datasets:
    train:
      _target_: torchvision.datasets.CIFAR10
      download: True
      root: .datasets
      train: True
      transform:
        _target_: torchvision.transforms.Compose
        transforms:
          - _target_: torchvision.transforms.ToTensor
          - _target_: torchvision.transforms.Normalize
            mean: [0.5, 0.5, 0.5]
            std: [0.5, 0.5, 0.5]
  ...
```

### Special Syntax and Keywords
- `_target_`: Indicates the Python class to instantiate. If a function is provided, a partial function is created. Any configuration key set with `_target_` maps to a Python object.
- **@**: References another configuration value. Using this syntax, keys mapped to Python objects can be accessed. For instance, for an `optimizer` instantiated to `torch.optim.Adam` using `_target_`, its learning rate can be accessed as `@system#optimizer#lr`, where `lr` is the key defined under `optimizer`.
- **$**: Used for evaluating Python expressions.
- **%**: Macro for textual replacement in the configuration.
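To make the reference syntax concrete, here is a hypothetical, stdlib-only sketch of how a `section#key#subkey` path could be resolved against a parsed config (illustrative only; the names and the actual resolution logic belong to the MONAI bundle parser, not this snippet):

```python
parsed_config = {
    "trainer": {"max_epochs": 100},
    "system": {"optimizer": {"lr": 0.001}},
}

def resolve(reference: str, config=parsed_config):
    # Strip the leading @/%/$ sigil, then walk the config dict one
    # '#'-separated segment at a time.
    node = config
    for segment in reference.lstrip("@%$").split("#"):
        node = node[segment]
    return node

print(resolve("%trainer#max_epochs"))   # 100
print(resolve("@system#optimizer#lr"))  # 0.001
```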
File renamed without changes.
124 changes: 124 additions & 0 deletions docs/basics/quickstart.md
@@ -0,0 +1,124 @@
# Quickstart
Get up and running in under 5 mins!

## Installation

```bash
pip install project-lighter
```

For the bleeding-edge version, run:
```bash
pip install project-lighter --pre
```


## Building a config
Key to the Lighter ecosystem is a YAML file that serves as the central point of control for your experiments. It allows you to define, manage, and modify all aspects of your experiment without diving deep into the code.

A Lighter config contains two main components:

- Trainer
- LighterSystem

### Trainer
Trainer contains all the information about running a training/evaluation/inference process and is a crucial component of training automation in PyTorch Lightning. Please refer to the [PyTorch Lightning Trainer documentation](https://lightning.ai/docs/pytorch/stable/common/trainer.html) for more information.

Simply put, you can set several things here, such as the number of epochs, the number of GPUs, the number of nodes, etc. Any parameter accepted by the `pytorch_lightning.Trainer` class can also go here. Please refer to the [API](https://lightning.ai/docs/pytorch/stable/common/trainer.html#trainer-class-api) for more information.

Defining this in our config looks something like this:
```yaml
trainer:
  _target_: pytorch_lightning.Trainer
  max_epochs: 100
```
For more information, see [here](./config.md).
### LighterSystem
LighterSystem encapsulates all parts of a deep learning setup, such as the model, optimizer, criterion, datasets, metrics, etc. This is where the "science" of building deep learning models is developed. LighterSystem is highly flexible and contains logic suitable for any task: classification, segmentation, object detection, self-supervised learning, etc. Think of it as writing your usual code, but with a predefined structure for where to define each component (such as model, criterion, etc.).

This provides powerful extensibility, as training experiments for, say, classification and self-supervised learning can follow a similar template. An example LighterSystem for training a supervised classification model on the CIFAR10 dataset is shown below:
```yaml
system:
  _target_: lighter.LighterSystem
  batch_size: 512

  model:
    _target_: torchvision.models.resnet18
    num_classes: 10

  criterion:
    _target_: torch.nn.CrossEntropyLoss

  optimizer:
    _target_: torch.optim.Adam
    params: "$@system#model.parameters()"
    lr: 0.001

  datasets:
    train:
      _target_: torchvision.datasets.CIFAR10
      download: True
      root: .datasets
      train: True
      transform:
        _target_: torchvision.transforms.Compose
        transforms:
          - _target_: torchvision.transforms.ToTensor
          - _target_: torchvision.transforms.Normalize
            mean: [0.5, 0.5, 0.5]
            std: [0.5, 0.5, 0.5]
```
For more information about each of the LighterSystem components and how to override them, see [here](./config.md).
## Running this experiment with Lighter
We simply combine the Trainer and LighterSystem into a single YAML and run the command in the terminal, as shown:
=== "cifar10.yaml"

    ```yaml
    trainer:
      _target_: pytorch_lightning.Trainer
      max_epochs: 100

    system:
      _target_: lighter.LighterSystem
      batch_size: 512

      model:
        _target_: torchvision.models.resnet18
        num_classes: 10

      criterion:
        _target_: torch.nn.CrossEntropyLoss

      optimizer:
        _target_: torch.optim.Adam
        params: "$@system#model.parameters()"
        lr: 0.001

      datasets:
        train:
          _target_: torchvision.datasets.CIFAR10
          download: True
          root: .datasets
          train: True
          transform:
            _target_: torchvision.transforms.Compose
            transforms:
              - _target_: torchvision.transforms.ToTensor
              - _target_: torchvision.transforms.Normalize
                mean: [0.5, 0.5, 0.5]
                std: [0.5, 0.5, 0.5]
    ```

=== "Terminal"

    ```bash
    lighter fit --config_file cifar10.yaml
    ```


Congratulations! You have run your first training experiment with Lighter.
1 change: 0 additions & 1 deletion docs/basics/system.md

This file was deleted.

1 change: 0 additions & 1 deletion docs/basics/trainer.md

This file was deleted.
