This repository has been archived by the owner on Oct 9, 2023. It is now read-only.

Adds a template task and docs #306

Merged: 60 commits from feature/template into master on May 19, 2021
Commits (60 total; changes shown from 28 commits)
f5e3c49
Initial commit
ethanwharris May 14, 2021
4cccd79
Updates
ethanwharris May 14, 2021
838e40c
Updates
ethanwharris May 14, 2021
49de5a0
Merge branch 'master' into feature/template
ethanwharris May 17, 2021
7735349
Updates
ethanwharris May 17, 2021
ce2108f
Remove template README
ethanwharris May 17, 2021
04b3b96
Fixes
ethanwharris May 17, 2021
f79f909
Updates
ethanwharris May 17, 2021
9834a47
Add examples
ethanwharris May 17, 2021
53a1ba2
Updates
ethanwharris May 17, 2021
2694f46
Updates
ethanwharris May 17, 2021
28b5eec
Updates
ethanwharris May 17, 2021
c552635
Updates
ethanwharris May 17, 2021
65f9bdd
Add tests
ethanwharris May 17, 2021
cc3001a
Updates
ethanwharris May 17, 2021
b4102f0
Merge branch 'master' into feature/template
ethanwharris May 17, 2021
4ae69fa
Fixes
ethanwharris May 17, 2021
3bcf221
A fix
ethanwharris May 17, 2021
eb7c3e4
Fixes
ethanwharris May 17, 2021
839c99a
More tests
ethanwharris May 17, 2021
e2df1ee
Merge branch 'master' into feature/template
ethanwharris May 17, 2021
afe8142
Updates
ethanwharris May 17, 2021
bee8bdd
Fix
ethanwharris May 17, 2021
382c2cb
Merge branch 'master' into feature/template
ethanwharris May 18, 2021
3a24117
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 18, 2021
acd302e
Update docs/source/reference/template.rst
ethanwharris May 19, 2021
907c927
Respond to comments
ethanwharris May 19, 2021
0af0d28
updates
ethanwharris May 19, 2021
9a2be0e
Update docs/source/template/data.rst
ethanwharris May 19, 2021
9740580
Update docs/source/template/data.rst
ethanwharris May 19, 2021
b6d57a2
Update docs/source/template/data.rst
ethanwharris May 19, 2021
084eb6e
Merge branch 'master' into feature/template
ethanwharris May 19, 2021
e390d32
Update docs/source/template/model.rst
ethanwharris May 19, 2021
166dd4d
Updates
ethanwharris May 19, 2021
2f52577
Merge branch 'feature/template' of https://github.com/PyTorchLightnin…
ethanwharris May 19, 2021
0c0780c
Updates
ethanwharris May 19, 2021
3fdba4a
Fixes
ethanwharris May 19, 2021
fa6ba79
Updates
ethanwharris May 19, 2021
7b201b3
Updates
ethanwharris May 19, 2021
96df2c2
Updates
ethanwharris May 19, 2021
9a9cfd4
Fixes
ethanwharris May 19, 2021
fe2cff7
Fixes
ethanwharris May 19, 2021
8167884
Fix
ethanwharris May 19, 2021
ad976f4
Add backbones
ethanwharris May 19, 2021
fecb316
Add backbones
ethanwharris May 19, 2021
b4d952c
Updates
ethanwharris May 19, 2021
c7b7806
Updates
ethanwharris May 19, 2021
23f2f20
Updates
ethanwharris May 19, 2021
ba83757
Fixes
ethanwharris May 19, 2021
1c43ec9
Add links
ethanwharris May 19, 2021
5aa7cf5
Fixes
ethanwharris May 19, 2021
4d35762
Simplify
ethanwharris May 19, 2021
36a6538
Update CHANGELOG.md
ethanwharris May 19, 2021
17085fb
Merge branch 'master' into feature/template
mergify[bot] May 19, 2021
4550e04
Update docs/source/template/optional.rst
ethanwharris May 19, 2021
5ebc71e
Update docs/source/template/optional.rst
ethanwharris May 19, 2021
71de79c
Update docs/source/template/task.rst
ethanwharris May 19, 2021
0850333
Updates
ethanwharris May 19, 2021
4fd0344
Updates
ethanwharris May 19, 2021
c21e816
Updates
ethanwharris May 19, 2021
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -23,6 +23,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added Semantic Segmentation task ([#239](https://github.com/PyTorchLightning/lightning-flash/pull/239) [#287](https://github.com/PyTorchLightning/lightning-flash/pull/287) [#290](https://github.com/PyTorchLightning/lightning-flash/pull/290))
- Added Object detection prediction example ([#283](https://github.com/PyTorchLightning/lightning-flash/pull/283))
- Added Style Transfer task and accompanying finetuning and prediction examples ([#262](https://github.com/PyTorchLightning/lightning-flash/pull/262))
- Added a Template task and tutorials showing how to contribute a task to flash ([#306](https://github.com/PyTorchLightning/lightning-flash/pull/306))

### Changed

3 changes: 2 additions & 1 deletion docs/source/index.rst
@@ -57,7 +57,8 @@ Lightning Flash

template/intro
template/data
template/backbones
template/task
template/optional
template/examples
template/tests
6 changes: 3 additions & 3 deletions docs/source/reference/template.rst
@@ -23,7 +23,7 @@ Here, you should add a short intro to your predict example, and then use ``literal

.. note:: We skip the first 14 lines as they are just the copyright notice.

Our predict example uses a model pre-trained on the Iris data.

.. literalinclude:: ../../../flash_examples/predict/template.py
:language: python
@@ -37,9 +37,9 @@ For more advanced inference options, see :ref:`predictions`.
Training
********

In this section, we briefly describe the data, and then ``literalinclude`` our finetuning example.

Now we'll train on Fisher's classic Iris data.
It contains 150 records with four features (sepal length, sepal width, petal length, and petal width) in three classes (species of Iris: setosa, virginica and versicolor).
Contributor: Include link to images to make your description better.

Collaborator (author): This is just tabular data, so I'm not sure what images we would show here.


Now all we need is to train our task!
45 changes: 45 additions & 0 deletions docs/source/template/backbones.rst
@@ -0,0 +1,45 @@
.. _contributing_backbones:

*************
The Backbones
*************

Now that you've got a way of loading data, you should implement some backbones to use with your :class:`~flash.core.model.Task`.
Create a :any:`FlashRegistry <registry>` to use with your :class:`~flash.core.model.Task` in `backbones.py <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/template/classification/backbones.py>`_.

The registry allows you to register backbones for your task that can be selected by the user.
The backbones can come from anywhere as long as you can register a function that loads the backbone.
Furthermore, the user can add their own models to the existing backbones, without having to write their own :class:`~flash.core.model.Task`!

You can create a registry like this:

.. code-block:: python

TEMPLATE_BACKBONES = FlashRegistry("backbones")
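
To make the register / get pattern concrete, here is a minimal, self-contained sketch. ``SimpleRegistry`` is a stand-in invented for this illustration (the real :any:`FlashRegistry <registry>` API may differ in detail); the point is only the pattern of registering named loader functions and retrieving them by name:

```python
from typing import Callable, Dict

# NOTE: SimpleRegistry is a minimal stand-in written for this illustration.
# The real FlashRegistry (flash/core/registry.py) may differ in its exact API.
class SimpleRegistry:
    def __init__(self, name: str) -> None:
        self.name = name
        self._functions: Dict[str, Callable] = {}

    def __call__(self, name: str) -> Callable:
        # Used as a decorator: @registry(name="mlp-128")
        def register(fn: Callable) -> Callable:
            self._functions[name] = fn
            return fn

        return register

    def get(self, name: str) -> Callable:
        # Look up a registered backbone loader by name.
        return self._functions[name]


TEMPLATE_BACKBONES = SimpleRegistry("backbones")


@TEMPLATE_BACKBONES(name="mlp-128")
def load_mlp_128(num_features: int):
    # A real loader would build an nn.Module here; we return a placeholder
    # plus the output size, mirroring the contract described below.
    return f"mlp({num_features} -> 128)", 128


backbone, out_features = TEMPLATE_BACKBONES.get("mlp-128")(num_features=4)
```

A user could register their own loader in exactly the same way, which is what makes the backbones extensible.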

Let's add a simple MLP backbone to our registry.
We'll create the backbone and return it along with the output size (so that we can create the model head in our :class:`~flash.core.model.Task`).
Here's the code:

.. literalinclude:: ../../../flash/template/classification/backbones.py
:language: python
:pyobject: load_mlp_128

Here's another example with a slightly more complex model:

.. literalinclude:: ../../../flash/template/classification/backbones.py
:language: python
:pyobject: load_mlp_128_256
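
As a hedged sketch of what such a loader can look like (illustrative only, not the actual contents of ``backbones.py``), the contract is to build the module and return it together with its output size:

```python
from typing import Tuple

import torch.nn as nn

# Illustrative sketch only -- the real implementations live in
# flash/template/classification/backbones.py.
def load_mlp_128_sketch(num_features: int) -> Tuple[nn.Module, int]:
    """Create an MLP backbone and return it along with its output size."""
    backbone = nn.Sequential(
        nn.Linear(num_features, 128),
        nn.ReLU(inplace=True),
        nn.BatchNorm1d(128),
    )
    return backbone, 128
```

The returned output size is what lets the ``Task`` build a matching head without knowing anything about the backbone's internals.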

More Examples
_____________

Here's a more advanced example, which adds ``SimCLR`` to the ``IMAGE_CLASSIFIER_BACKBONES``, from `flash/image/backbones.py <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/image/backbones.py>`_:

.. literalinclude:: ../../../flash/image/backbones.py
:language: python
:pyobject: load_simclr_imagenet

------

Once you've got some data and some backbones, :ref:`implement your task! <contributing_task>`
202 changes: 121 additions & 81 deletions docs/source/template/data.rst

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions docs/source/template/docs.rst
@@ -5,15 +5,15 @@ The Docs
*********

The final step is to add some docs.
For each :class:`~flash.core.model.Task` in Flash, we have a docs page in `docs/source/reference <https://github.com/PyTorchLightning/lightning-flash/blob/master/docs/source/reference>`_.
You should create a ``.rst`` file there with the following:

- a brief description of the task
- the predict example
- the finetuning example
- any relevant API reference

Here are the contents of `docs/source/reference/template.rst <https://github.com/PyTorchLightning/lightning-flash/blob/master/docs/source/reference/template.rst>`_ which breaks down each of these steps:

.. literalinclude:: ../reference/template.rst
:language: rest
10 changes: 5 additions & 5 deletions docs/source/template/examples.rst
@@ -5,7 +5,7 @@ The Examples
************

Now that you've implemented your task, it's time to add some examples showing how cool it is!
We usually provide one finetuning example in `flash_examples/finetuning <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash_examples/finetuning/>`_ and one predict / inference example in `flash_examples/predict <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash_examples/predict/>`_.
You can base these off of our ``template.py`` examples.
Let's take a closer look.

@@ -14,7 +14,7 @@ finetuning

The finetuning example should:

#. download the data (we'll add the example to our CI later on, so choose a dataset small enough that it runs in reasonable time)
#. load the data into a :class:`~flash.core.data.data_module.DataModule`
#. create an instance of the :class:`~flash.core.model.Task`
#. create a :class:`~flash.core.trainer.Trainer`
@@ -23,7 +23,7 @@ The finetuning example should:
#. generate predictions for a few examples *(optional)*

For our template example we don't have a pretrained backbone, so we can just call :meth:`~flash.core.trainer.Trainer.fit` rather than :meth:`~flash.core.trainer.Trainer.finetune`.
Here's the full example (`flash_examples/finetuning/template.py <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash_examples/finetuning/template.py>`_):

.. literalinclude:: ../../../flash_examples/finetuning/template.py
:language: python
@@ -40,13 +40,13 @@ predict

The predict example should:

#. download the data (this should be the data from the finetuning example)
#. load an instance of the :class:`~flash.core.model.Task` from a checkpoint stored on `S3` (speak with one of us about getting your checkpoint hosted)
#. generate predictions for a few examples
#. generate predictions for a whole dataset, folder, etc.

Our template example doesn't have a pretrained backbone, so the checkpoint we load comes from the finetuning example.
Here's the full example (`flash_examples/predict/template.py <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash_examples/predict/template.py>`_):

.. literalinclude:: ../../../flash_examples/predict/template.py
:language: python
20 changes: 16 additions & 4 deletions docs/source/template/intro.rst
@@ -17,19 +17,31 @@ Set-up

The Task template is designed to guide you through contributing a task to Flash.
It contains the code, tests, and examples for a task that performs classification with a multi-layer perceptron, intended for use with the classic data sets from scikit-learn.
The Flash tasks are organized in folders by data-type (image, text, video, etc.), with sub-folders for different task types (classification, regression, etc.).

|

Before you begin, copy the files in `flash/template/classification <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/template/classification/>`_ to a new sub-directory under the relevant data-type.
If a data-type folder already exists for your task, then a task type sub-folder should be added containing the template files.
If a data-type folder doesn't exist, then you will need to add that too.
You should also copy the files from `tests/template/classification <https://github.com/PyTorchLightning/lightning-flash/blob/master/tests/template/classification/>`_ to the corresponding data-type, task type folder in ``tests``.
For example, if you were adding an image classification task, you would do:

.. code-block:: bash

mkdir flash/image/classification
cp flash/template/classification/* flash/image/classification/
mkdir tests/image/classification
cp tests/template/classification/* tests/image/classification/

Tutorials
=========

The tutorials in this section will walk you through all of the components you need to implement (or adapt from the template) for your custom task.

- :ref:`contributing_data`: our first tutorial goes over the best practices for implementing everything you need to connect data to your task
- :ref:`contributing_backbones`: the second tutorial shows you how to create an extensible backbone registry for your task
- :ref:`contributing_task`: now that we have the data and the models, in this tutorial we create our custom task
- :ref:`contributing_optional`: this tutorial covers some optional extras you can add if needed for your particular task
- :ref:`contributing_examples`: this tutorial guides you through creating some simple examples showing your task in action
- :ref:`contributing_tests`: in this tutorial, we cover best practices for writing some tests for your new task
40 changes: 0 additions & 40 deletions docs/source/template/model.rst

This file was deleted.

42 changes: 7 additions & 35 deletions docs/source/template/optional.rst
@@ -4,12 +4,11 @@
Optional Extras
***************

Organize your transforms in transforms.py
=========================================

If you have a lot of default transforms, then for better organization you can add a ``transforms.py`` which houses your default transforms to be referenced in your :class:`~flash.core.data.process.Preprocess`.
Here's an example from `image/classification/transforms.py <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/image/classification/transforms.py>`_ which creates some default transforms given the desired image size:

.. literalinclude:: ../../../flash/image/classification/transforms.py
:language: python
@@ -21,40 +20,13 @@ Here's how we create our transforms in the :class:`~flash.image.classification.d
:language: python
:pyobject: ImageClassificationPreprocess.default_transforms
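
The pattern can be sketched as follows; the dictionary keys used here are illustrative stand-ins for the :class:`~flash.core.data.process.Preprocess` hook names (check the real ``transforms.py`` for the actual names and transforms):

```python
from typing import Any, Callable, Dict, Tuple

# Sketch of the transforms.py pattern: a function building the default
# transforms for a given image size. The keys below are illustrative
# placeholders for the Preprocess hook names, not Flash's actual names.
def default_transforms(image_size: Tuple[int, int]) -> Dict[str, Callable[[Any], Any]]:
    return {
        # placeholder "resize": tags the sample with the target size
        "pre_tensor_transform": lambda sample: ("resized", image_size, sample),
        # placeholder identity transform
        "to_tensor_transform": lambda sample: sample,
    }
```

Keeping this in its own module keeps the ``Preprocess`` definition short and makes the defaults easy to reuse and test.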


Add serializers for use with your Task
======================================

Sometimes you want to give the user some control over their prediction format.
:class:`~flash.core.data.process.Postprocess` can do the heavy lifting (anything you always want to apply to the predictions), but one or more custom :class:`~flash.core.data.process.Serializer` implementations can be used to convert the predictions to a desired output format.
You should add your :class:`~flash.core.data.process.Serializer` implementations in a ``serialization.py`` file and set a good default in your :class:`~flash.core.model.Task`.
Some good examples are in `flash/core/classification.py <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/core/classification.py>`_.
Here's the :class:`~flash.core.classification.Classes` :class:`~flash.core.data.process.Serializer`:

.. literalinclude:: ../../../flash/core/classification.py
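
As a minimal sketch of the pattern (the class names mirror Flash's, but these are stand-ins written for illustration, not the real :class:`~flash.core.data.process.Serializer` base class):

```python
from typing import Any, List

# Stand-in sketch of the Serializer pattern; the real base class is
# flash.core.data.process.Serializer and the real Classes implementation
# lives in flash/core/classification.py.
class Serializer:
    def serialize(self, sample: Any) -> Any:
        raise NotImplementedError


class Classes(Serializer):
    """Convert per-class scores into the index of the winning class."""

    def serialize(self, sample: List[float]) -> int:
        return max(range(len(sample)), key=sample.__getitem__)


prediction = Classes().serialize([0.1, 0.7, 0.2])  # class index 1
```

Swapping the serializer is then all a user needs to do to change the prediction format, without touching the ``Postprocess`` or the ``Task``.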
66 changes: 66 additions & 0 deletions docs/source/template/task.rst
@@ -0,0 +1,66 @@
.. _contributing_task:

********
The Task
********

Once you've implemented a Flash :class:`~flash.core.data.data_module.DataModule` and some backbones, you should implement your :class:`~flash.core.model.Task` in `model.py <https://github.com/PyTorchLightning/lightning-flash/blob/master/flash/template/classification/model.py>`_.
The :class:`~flash.core.model.Task` is responsible for: setting up the backbone, performing the forward pass of the model, and calculating the loss and any metrics.
Remember that, under the hood, the Flash :class:`~flash.core.model.Task` is simply a :any:`pytorch_lightning:lightning_module` with some helpful defaults.

Task
^^^^
You can override the base :class:`~flash.core.model.Task` or any of the existing :class:`~flash.core.model.Task` implementations.
For example, in our scikit-learn example, we can just override :class:`~flash.core.classification.ClassificationTask` which provides good defaults for classification.

You should attach your backbones registry as a class attribute like this:

.. code-block:: python

backbones: FlashRegistry = TEMPLATE_BACKBONES

In the :meth:`~flash.core.model.Task.__init__`, you will need to configure defaults for the:

- loss function
- optimizer
- metrics
- backbone / model

You will also need to create the backbone from the registry and create the model head.
Here's the code:

.. literalinclude:: ../../../flash/template/classification/model.py
:language: python
:dedent: 4
:pyobject: TemplateSKLearnClassifier.__init__
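
The shape of that ``__init__`` can be sketched like this; the registry contents, argument names, and defaults are illustrative stand-ins (see ``model.py`` in the template for the real code):

```python
import torch.nn as nn

# Illustrative stand-in registry: maps a backbone name to a loader that
# returns (module, output_size).
TEMPLATE_BACKBONES = {
    "mlp-128": lambda num_features: (nn.Linear(num_features, 128), 128),
}


class TemplateSKLearnClassifierSketch:
    backbones = TEMPLATE_BACKBONES

    def __init__(self, num_features: int, num_classes: int, backbone: str = "mlp-128"):
        # In the real Task, defaults for the loss function, optimizer, and
        # metrics would also be configured here (and passed to super().__init__).
        self.backbone, out_features = self.backbones[backbone](num_features)
        self.head = nn.Linear(out_features, num_classes)


model = TemplateSKLearnClassifierSketch(num_features=4, num_classes=3)
```

Because the loader reports its output size, the head always matches the backbone, whichever one the user selects.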

You should also override the ``{train,val,test,predict}_step`` methods.
The default ``{train,val,test,predict}_step`` implementations in :class:`~flash.core.model.Task` expect a tuple containing the input and target, and should be suitable for most applications.
In our template example, we just extract the input and target from the input mapping and forward them to the ``super`` methods.
Here's the code for the ``training_step``:

.. literalinclude:: ../../../flash/template/classification/model.py
:language: python
:dedent: 4
:pyobject: TemplateSKLearnClassifier.training_step

We use the same code for the ``validation_step`` and ``test_step``.
For ``predict_step`` we don't need the targets, so our code looks like this:

.. literalinclude:: ../../../flash/template/classification/model.py
:language: python
:dedent: 4
:pyobject: TemplateSKLearnClassifier.predict_step
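
The unpack-then-defer-to-``super`` pattern used by both steps can be sketched end to end; here the ``"input"`` / ``"target"`` keys stand in for Flash's ``DefaultDataKeys``, and ``BaseTask`` is a toy stand-in for :class:`~flash.core.model.Task`:

```python
# Toy stand-in for flash.core.model.Task, just enough to show the pattern.
class BaseTask:
    def training_step(self, batch, batch_idx):
        x, y = batch  # the default steps expect an (input, target) tuple
        return {"loss": sum((xi - yi) ** 2 for xi, yi in zip(x, y))}

    def predict_step(self, batch, batch_idx, dataloader_idx=0):
        return [xi * 2 for xi in batch]  # placeholder forward pass


class TemplateClassifierSketch(BaseTask):
    def training_step(self, batch, batch_idx):
        # Extract the input and target from the mapping, then defer to super.
        batch = (batch["input"], batch["target"])
        return super().training_step(batch, batch_idx)

    def predict_step(self, batch, batch_idx, dataloader_idx=0):
        batch = batch["input"]  # no targets are available at predict time
        return super().predict_step(batch, batch_idx, dataloader_idx=dataloader_idx)


task = TemplateClassifierSketch()
out = task.training_step({"input": [1.0, 2.0], "target": [1.0, 0.0]}, batch_idx=0)
```

The subclass only adapts the batch format; all of the loss and metric logic stays in the base class.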

.. note:: You can completely replace the ``{train,val,test,predict}_step`` methods (that is, without a call to ``super``) if you need more custom behaviour for your :class:`~flash.core.model.Task` at a particular stage.

Finally, we use our backbone and head in a custom forward pass:

.. literalinclude:: ../../../flash/template/classification/model.py
:language: python
:dedent: 4
:pyobject: TemplateSKLearnClassifier.forward

------

Now that you've got your task, take a look at some :ref:`optional advanced features you can add <contributing_optional>` or go ahead and :ref:`create some examples showing your task in action! <contributing_examples>`