
Commit

🚀🚀🚀
alexisbiver committed Dec 20, 2024
1 parent 09a8c90 commit 3eacdab
Showing 28 changed files with 428 additions and 156 deletions.
8 changes: 5 additions & 3 deletions README.md
@@ -185,8 +185,11 @@ python examples/train_from_config.py examples/configs/mlp_teleopicub_with_center

The script `start_multiple_trainings.py` is an example of how to generate variations of configuration files and run them using methods from `train_from_config.py`.

Also for evaluation purposes, you can see an example running tests and plots using AutoPredictor and AutoDataset in `load_and_plot_predictors.py`
Also for evaluation purposes, you can see an example running tests and plots using AutoPredictor and AutoDataset in `load_and_plot_predictors.py`, or animations such as in `plot_pred_over_time.py`.

<p align="center">
<img alt="Trajectory prediction animation" src="https://raw.githubusercontent.com/hucebot/prescyent/main/assets/test_datasetMultipleTasks_BottleTable_12_animation.gif" width="80%">
</p>

### Extend the lib with a custom dataset or predictor
Predictors inherit from the BasePredictor class, which defines interfaces and core methods to keep consistency across new implementations.
@@ -195,13 +198,12 @@ If you want to implement a new ML predictor using PyTorch follow the structure o
- in `module.py` you create your torch.nn.Module and forward method as you would usually do; you may want to inherit from BaseTorchModule instead of just torch.nn.Module and decorate your forward method with `@self_auto_batch` and `@BaseTorchModule.deriv_tensor` to benefit from some of the lib's features.
- in `config.py` create your [pydantic BaseModel](https://docs.pydantic.dev/latest/) inheriting from `ModuleConfig` to ensure your predictor's config has all the needed variables, and add any new values you want as variables in your model's architecture.
- finally `predictor.py` simply connects the above two by declaring both classes as class attributes for this specific predictor. Most of the magic happens in the parent classes using pytorch_lightning with your torch module (see the sketch below).
If you want your predictor to be able to be loaded by AutoPredictor, you must add it to the PREDICTOR_MAP and PREDICTOR_LIST in `prescyent.predictor.__init__.py`.
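
A minimal sketch of those three files, as a hedged illustration only: apart from BaseTorchModule, ModuleConfig, and the two decorators quoted above, every import path, base class, and config field here is an assumption, so mirror an existing predictor such as MlpPredictor for the real API.

```python
# module.py: a sketch assuming the decorators and base class described above.
import torch
from prescyent.predictor.lightning.module import BaseTorchModule, self_auto_batch  # assumed import path


class MyModule(BaseTorchModule):
    """Toy one-layer module; the config fields used here are hypothetical."""

    def __init__(self, config):
        super().__init__(config)
        self.linear = torch.nn.Linear(config.hidden_size, config.hidden_size)

    @self_auto_batch
    @BaseTorchModule.deriv_tensor
    def forward(self, input_tensor: torch.Tensor) -> torch.Tensor:
        return self.linear(input_tensor)


# config.py: inherit ModuleConfig so every required variable is present.
from prescyent.predictor.lightning.configs.module_config import ModuleConfig  # assumed import path


class MyConfig(ModuleConfig):
    hidden_size: int = 64  # any new variable of your model's architecture


# predictor.py: declare both classes as class attributes; the parent class
# name is an assumption, copy it from an existing predictor.py in the repo.
from prescyent.predictor.lightning.predictor import LightningPredictor  # assumed import path


class MyPredictor(LightningPredictor):
    PREDICTOR_NAME = "MyPredictor"
    module_class = MyModule
    config_class = MyConfig
```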

In the same way, you can extend the dataset module with a new Dataset inheriting from TrajectoriesDataset with its own DatasetConfig. Again, taking example from one of our implementations such as TeleopIcubDataset, you must:
- in `dataset.py`, inherit from the TrajectoriesDataset class and implement a `prepare_data` method where you must init `self.trajectories` with a `Trajectories` instance built from your data/files.
- in `config.py` create your [pydantic BaseModel](https://docs.pydantic.dev/latest/) inheriting from `TrajectoriesDatasetConfig` to ensure you have all variables for the dataset processes, and add any new values you want as variables in your dataset's architecture.
- optionally use `metadata.py` as we did to store some constants describing your dataset.
All the logic creating the datasamples and dataloaders is handled in the parent class as long as self.trajectories is defined and the config is valid. If you want your dataset to be able to be loaded by AutoDataset, you must add it to the DATASET_MAP and DATASET_LIST in `prescyent.dataset.__init__.py`.
All the logic creating the datasamples and dataloaders is handled in the parent class as long as self.trajectories is defined and the config is valid.
If you simply want to test a Predictor over some data, you can create an instance of CustomDataset. As long as you have turned your lists of episodes into Trajectories, the CustomDataset allows you to split them into training samples using a generic DatasetConfig and use all the functionalities of the library as usual (except that a CustomDataset cannot be loaded using AutoDataset).
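
And a matching sketch for the dataset side, under the same caveat: import paths and the Trajectories constructor below are assumptions, so mirror TeleopIcubDataset for the real structure.

```python
# dataset.py: inherit TrajectoriesDataset and set self.trajectories inside
# prepare_data(); import paths and constructor arguments are assumptions.
from prescyent.dataset import TrajectoriesDataset, Trajectories  # assumed import path

from my_package.config import MyDatasetConfig  # your TrajectoriesDatasetConfig subclass


class MyDataset(TrajectoriesDataset):
    DATASET_NAME = "MyDataset"

    def __init__(self, config: MyDatasetConfig):
        super().__init__(config)

    def prepare_data(self):
        # Turn your own files/episodes into a Trajectories instance here;
        # load_episodes() is a hypothetical stand-in for your parsing code.
        train, test, val = load_episodes(self.config.hdf5_path)
        self.trajectories = Trajectories(train, test, val)
```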

## Ros2
11 changes: 7 additions & 4 deletions benchmark/andydataset/train_mlp_variations.py
@@ -32,8 +32,8 @@
TrajectoryDimensions.FEATURE,
],
# MODEL
"model_config.name": [
"MlpPredictor",
"model_config.predictor_class": [
"prescyent.predictor.lightning.models.sequence.mlp.predictor.MlpPredictor",
],
"model_config.loss_fn": [LossFunctions.MSELOSS],
"model_config.hidden_size": [64, 128],
@@ -112,7 +112,10 @@
"scaler_config"
]
del config_dict["scaler_config"]
if config_dict["model_config"]["name"] in AUTO_REGRESSIVE_MODELS:
if (
"prescyent.predictor.lightning.models.autoreg"
in config_dict["model_config"]["predictor_class"]
):
config_dict["dataset_config"]["learning_type"] = LearningTypes.AUTOREG
with open(config_paths[-1], "w", encoding="utf-8") as config_file:
json.dump(config_dict, config_file, indent=4)
@@ -135,7 +138,7 @@
dataset = AndyDataset(dataset_config)
exp_path = (
f"data/models/{dataset.DATASET_NAME}_ee"
f"/h{dataset_config.history_size}_f{dataset_config.future_size}_{dataset.frequency}hz"
f"/h{dataset_config.history_size}_f{dataset_config.future_size}_{dataset.config.frequency}hz"
f"/i_All_{''.join([feat.__class__.__name__ for feat in dataset_config.in_features])}_o_RightHand_{''.join([feat.__class__.__name__ for feat in dataset_config.in_features])}"
)
# train and test a baseline first to compare with
16 changes: 8 additions & 8 deletions benchmark/andydataset/train_simlpe_variations.py
@@ -33,17 +33,14 @@
TrajectoryDimensions.FEATURE,
],
# MODEL
"model_config.name": [
"siMLPe",
"model_config.predictor_class": [
"prescyent.predictor.lightning.models.sequence.simlpe.predictor.SiMLPePredictor",
],
"model_config.loss_fn": [LossFunctions.MTDLOSS, LossFunctions.MSELOSS],
# "model_config.num_layers": [48], # commented out to leave default model values here instead
# "model_config.spatial_fc_only": [True, False], # commented out to leave default model values here instead
# "model_config.dct": [True, False], # commented out to leave default model values here instead
"model_config.deriv_on_last_frame": [
True,
False
],
"model_config.deriv_on_last_frame": [True, False],
# ...
# TRAINING
"training_config.max_epochs": [200],
@@ -117,7 +114,10 @@
"scaler_config"
]
del config_dict["scaler_config"]
if config_dict["model_config"]["name"] in AUTO_REGRESSIVE_MODELS:
if (
"prescyent.predictor.lightning.models.autoreg"
in config_dict["model_config"]["predictor_class"]
):
config_dict["dataset_config"]["learning_type"] = LearningTypes.AUTOREG
with open(config_paths[-1], "w", encoding="utf-8") as config_file:
json.dump(config_dict, config_file, indent=4)
@@ -140,7 +140,7 @@
dataset = AndyDataset(dataset_config)
exp_path = (
f"data/models/{dataset.DATASET_NAME}_ee"
f"/h{dataset_config.history_size}_f{dataset_config.future_size}_{dataset.frequency}hz"
f"/h{dataset_config.history_size}_f{dataset_config.future_size}_{dataset.config.frequency}hz"
f"/i_All_{''.join([feat.__class__.__name__ for feat in dataset_config.in_features])}_o_RightHand_{''.join([feat.__class__.__name__ for feat in dataset_config.in_features])}"
)
# train and test a baseline first to compare with
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -8,7 +8,7 @@
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information

project = 'PreScyent'
copyright = '2024, Alexis Biver'
copyright = '2024, Inria Nancy, Larsen Team'
author = 'Alexis Biver'

# -- General configuration ---------------------------------------------------
18 changes: 6 additions & 12 deletions docs/enums.rst
@@ -6,67 +6,61 @@ Enums
:maxdepth: 1


**ActivationFunctions**
ActivationFunctions
=======================
.. autoclass:: prescyent.utils.enums.activation_functions.ActivationFunctions
:members:
:exclude-members:
:undoc-members:
:inherited-members:
:show-inheritance:

----


**LearningTypes**
LearningTypes
=================
.. autoclass:: prescyent.utils.enums.learning_types.LearningTypes
:members:
:exclude-members:
:undoc-members:
:inherited-members:
:show-inheritance:

----

**LossFunctions**
LossFunctions
=================
.. autoclass:: prescyent.utils.enums.loss_functions.LossFunctions
:members:
:exclude-members:
:undoc-members:
:inherited-members:
:show-inheritance:

----

**Profilers**
Profilers
=============
.. autoclass:: prescyent.utils.enums.profilers.Profilers
:members:
:exclude-members:
:undoc-members:
:inherited-members:
:show-inheritance:

----

**Scalers**
Scalers
===========
.. autoclass:: prescyent.utils.enums.scalers.Scalers
:members:
:exclude-members:
:undoc-members:
:inherited-members:
:show-inheritance:



**TrajectoryDimensions**
TrajectoryDimensions
========================
.. autoclass:: prescyent.utils.enums.trajectory_dimensions.TrajectoryDimensions
:members:
:exclude-members:
:undoc-members:
:inherited-members:
:show-inheritance:
6 changes: 3 additions & 3 deletions docs/features.rst
@@ -6,15 +6,15 @@ Features
:maxdepth: 1


**Any**
Any
=======
.. autoclass:: prescyent.dataset.features.feature.any.Any
:members: get_distance
:exclude-members:
:undoc-members:
:show-inheritance:

**Coordinate**
Coordinate
==============

.. automodule:: prescyent.dataset.features.feature.coordinate
@@ -55,7 +55,7 @@ Features
------


**Rotation**
Rotation
==============

.. automodule:: prescyent.dataset.features.feature.rotation
3 changes: 1 addition & 2 deletions docs/getting_started.md
@@ -176,11 +176,10 @@ If you want to implement a new ML predictor using PyTorch follow the structure o
- in `module.py` you create your torch.nn.Module and forward method as you would usually do; you may want to inherit from BaseTorchModule instead of just torch.nn.Module and decorate your forward method with `@self_auto_batch` and `@BaseTorchModule.deriv_tensor` to benefit from some of the lib's features.
- in `config.py` create your [pydantic BaseModel](https://docs.pydantic.dev/latest/) inheriting from `ModuleConfig` to ensure your predictor's config has all the needed variables, and add any new values you want as variables in your model's architecture.
- finally `predictor.py` simply connects the above two by declaring both classes as class attributes for this specific predictor. Most of the magic happens in the parent classes using pytorch_lightning with your torch module.
If you want your predictor to be able to be loaded by AutoPredictor, you must add it to the PREDICTOR_MAP and PREDICTOR_LIST in `prescyent.predictor.__init__.py`.

In the same way, you can extend the dataset module with a new Dataset inheriting from TrajectoriesDataset with its own DatasetConfig. Again, taking example from one of our implementations such as TeleopIcubDataset, you must:
- in `dataset.py`, inherit from the TrajectoriesDataset class and implement a `prepare_data` method where you must init `self.trajectories` with a `Trajectories` instance built from your data/files.
- in `config.py` create your [pydantic BaseModel](https://docs.pydantic.dev/latest/) inheriting from `TrajectoriesDatasetConfig` to ensure you have all variables for the dataset processes, and add any new values you want as variables in your dataset's architecture.
- optionally use `metadata.py` as we did to store some constants describing your dataset.
All the logic creating the datasamples and dataloaders is handled in the parent class as long as self.trajectories is defined and the config is valid. If you want your dataset to be able to be loaded by AutoDataset, you must add it to the DATASET_MAP and DATASET_LIST in `prescyent.dataset.__init__.py`.
All the logic creating the datasamples and dataloaders is handled in the parent class as long as self.trajectories is defined and the config is valid.
If you simply want to test a Predictor over some data, you can create an instance of CustomDataset. As long as you have turned your lists of episodes into Trajectories, the CustomDataset allows you to split them into training samples using a generic DatasetConfig and use all the functionalities of the library as usual (except that a CustomDataset cannot be loaded using AutoDataset).
9 changes: 6 additions & 3 deletions docs/predictors.md
@@ -49,11 +49,14 @@ An architecture mapping an input sequence to an output sequence, that originated
[Config](configuration_files.rst#seq2seqconfig)

### MLP
Simple ML Baselines consisting of a configurable Fully Connected MultiLayer Perceptron
Simple ML Baselines consisting of a configurable Fully Connected MultiLayer Perceptron.
It's a simple architecture you can use as an example for sequence-to-sequence training and quick tests.

[Config](configuration_files.rst#mlpconfig)
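
For instance, a hedged usage sketch; the import locations and MlpConfig fields below are assumptions inferred from this repository's example scripts, not a verified API:

```python
# Illustrative only; see the MlpConfig link above for the real fields.
from prescyent.utils.enums.loss_functions import LossFunctions
from prescyent.predictor import MlpConfig, MlpPredictor  # assumed import path

config = MlpConfig(hidden_size=128, loss_fn=LossFunctions.MSELOSS)  # assumed fields
predictor = MlpPredictor(config=config)  # predictors take their config at init
```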

### SARLSTM
Simple ML Baselines consisting of an autoregressive architecture with LSTMs
Simple ML Baselines consisting of an autoregressive architecture with LSTMs.
It's an architecture you can use as an example for autoregressive training.
This model requires x, y pairs that differ from classical sequence-to-sequence training; build your dataset using an autoregressive [LearningType](enums.rst#learningtypes).

[Config](config)
[Config](configuration_files.rst#sarlstmconfig)
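
To build such a dataset, the example scripts in this repository set the dataset config's learning type to the autoregressive value; a short hedged snippet (the attribute spelling follows those scripts' config key):

```python
# Enum import path taken from this repository's enums documentation.
from prescyent.utils.enums.learning_types import LearningTypes

# Assumes an existing dataset config instance named dataset_config.
dataset_config.learning_type = LearningTypes.AUTOREG
```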
2 changes: 1 addition & 1 deletion examples/mlp_icub_train.py
@@ -130,7 +130,7 @@
)
delayed = DelayedPredictor(config=delayed_config)
delayed.test(dataset)
print(f"Your predictor is saved in: {model_dir}")
print(f"Your predictor is saved in: {predictor.log_root_path}")
print(
"You can visualize all logs from this script at xp_dir using tensorboard like this:"
)
24 changes: 14 additions & 10 deletions examples/start_multiple_trainings.py
@@ -42,11 +42,11 @@
TrajectoryDimensions.FEATURE,
],
# MODEL
"model_config.name": [
"MlpPredictor",
"Seq2SeqPredictor",
"siMLPe",
"SARLSTMPredictor", # Warning ! Can't be used in all conditions
"model_config.predictor_class": [
"prescyent.predictor.lightning.models.sequence.mlp.predictor.MlpPredictor",
"prescyent.predictor.lightning.models.sequence.seq2seq.predictor.Seq2SeqPredictor",
"prescyent.predictor.lightning.models.sequence.simlpe.predictor.SiMLPePredictor",
"prescyent.predictor.lightning.models.autoreg.sarlstm.predictor.SARLSTMPredictor", # Warning ! Can't be used in all conditions
],
"model_config.loss_fn": [
LossFunctions.MTDLOSS,
@@ -56,7 +56,7 @@
], # Warning ! Cannot be used in all conditions
# ...
# TRAINING
"training_config.max_epochs": [200],
"training_config.max_epochs": [1],
# "training_config.devices": [1],
# "training_config.accelerator": ["gpu"],
"training_config.early_stopping_patience": [20],
@@ -65,13 +65,14 @@
"dataset_config.frequency": [FREQUENCY],
"dataset_config.history_size": [HISTORY_SIZE],
"dataset_config.future_size": [FUTURE_SIZE],
"dataset_config.name": ["TeleopIcub"],
"dataset_config.dataset_class": [
"prescyent.dataset.datasets.teleop_icub.dataset.TeleopIcubDataset"
],
"dataset_config.hdf5_path": ["data/datasets/AndyData-lab-prescientTeleopICub.hdf5"],
"dataset_config.subsets": [["BottleTable"]],
"dataset_config.batch_size": [256],
}

AUTO_REGRESSIVE_MODELS = ["SARLSTMPredictor"]
DATASET_IS_STATIC = True
MAX_WORKERS = 1

@@ -132,7 +133,10 @@
"scaler_config"
]
del config_dict["scaler_config"]
if config_dict["model_config"]["name"] in AUTO_REGRESSIVE_MODELS:
if (
"prescyent.predictor.lightning.models.autoreg"
in config_dict["model_config"]["predictor_class"]
):
config_dict["dataset_config"]["learning_type"] = LearningTypes.AUTOREG
with open(config_paths[-1], "w", encoding="utf-8") as config_file:
json.dump(config_dict, config_file, indent=4)
@@ -142,7 +146,7 @@
Path(__file__).parent.resolve()
/ "data"
/ "models"
/ "TeleopIcub"
/ "TeleopIcubDataset"
/ f"{FREQUENCY}Hz_{HISTORY_SIZE}in_{FUTURE_SIZE}out"
)
for i, config_path in enumerate(config_paths):
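
As context for the `variations` grid this script builds on, here is a minimal, self-contained sketch of how dotted-key grids like the one above can be expanded into nested config dicts. The helper names are illustrative and not part of prescyent:

```python
# Illustrative helpers, not from the repo: expand {"a.b": [v1, v2], ...}
# grids into one nested config dict per combination.
import itertools
from typing import Any, Dict, List


def set_dotted(config: Dict[str, Any], dotted_key: str, value: Any) -> None:
    """Set a value in a nested dict from a dotted key like 'model_config.loss_fn'."""
    *parents, leaf = dotted_key.split(".")
    node = config
    for key in parents:
        node = node.setdefault(key, {})
    node[leaf] = value


def expand_variations(grid: Dict[str, List[Any]]) -> List[Dict[str, Any]]:
    """Return one nested config dict per combination of the grid's values."""
    keys = list(grid)
    configs = []
    for combo in itertools.product(*(grid[key] for key in keys)):
        config: Dict[str, Any] = {}
        for key, value in zip(keys, combo):
            set_dotted(config, key, value)
        configs.append(config)
    return configs


# Example: 2 hidden sizes x 1 epoch count -> 2 config dicts.
grid = {"model_config.hidden_size": [64, 128], "training_config.max_epochs": [1]}
print(len(expand_variations(grid)))  # -> 2
```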