
Commit

Update docs for map_location (#920)
* update docs for map location

* update return description
Adrian Wälchli authored Feb 23, 2020
1 parent 5778a41 commit c56ee8b
Showing 1 changed file with 13 additions and 5 deletions.
18 changes: 13 additions & 5 deletions pytorch_lightning/core/lightning.py
@@ -1023,10 +1023,15 @@ def load_from_metrics(cls, weights_path, tags_csv, map_location=None):
     drop_prob,0.2
     batch_size,32
-    map_location (dict): A dictionary mapping saved weight GPU devices to new
-        GPU devices (example: {'cuda:1':'cuda:0'})
+    map_location (dict | str | torch.device | function):
+        If your checkpoint saved a GPU model and you now load on CPUs
+        or a different number of GPUs, use this to map to the new setup
+        (example: {'cuda:1':'cuda:0'}).
+        The behaviour is the same as in
+        `torch.load <https://pytorch.org/docs/stable/torch.html#torch.load>`_.
 Return:
-    LightningModule with loaded weights
+    LightningModule with loaded weights and hyperparameters (if available).
 Example
 -------
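
For reference, a minimal usage sketch of the map_location argument documented in the hunk above, via load_from_metrics; the model class and file paths are hypothetical placeholders, not part of this commit:

# MyLightningModule, 'weights.ckpt' and 'meta_tags.csv' are hypothetical placeholders.
# Remap tensors that were saved on cuda:1 onto cuda:0, as in the docstring example.
model = MyLightningModule.load_from_metrics(
    weights_path='weights.ckpt',
    tags_csv='meta_tags.csv',
    map_location={'cuda:1': 'cuda:0'},
)
model.eval()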
@@ -1097,11 +1102,14 @@ def __init__(self, hparams):
 Args:
     checkpoint_path (str): Path to checkpoint.
-    map_location (dic): If your checkpoint saved from a GPU model and you now load on CPUs
+    map_location (dict | str | torch.device | function):
+        If your checkpoint saved a GPU model and you now load on CPUs
+        or a different number of GPUs, use this to map to the new setup.
+        The behaviour is the same as in
+        `torch.load <https://pytorch.org/docs/stable/torch.html#torch.load>`_.
 Return:
-    LightningModule with loaded weights.
+    LightningModule with loaded weights and hyperparameters (if available).
 Example
 -------
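Likewise, a hedged sketch of the other map_location forms the new load_from_checkpoint docstring mentions (str, torch.device, or function, as accepted by torch.load); the model class and checkpoint path are hypothetical placeholders:

import torch

# MyLightningModule and 'example.ckpt' are hypothetical placeholders.

# Load a GPU-trained checkpoint onto a CPU-only machine.
model = MyLightningModule.load_from_checkpoint(
    checkpoint_path='example.ckpt',
    map_location=torch.device('cpu'),
)

# Equivalent string form, passed straight through to torch.load.
model = MyLightningModule.load_from_checkpoint(
    checkpoint_path='example.ckpt',
    map_location='cpu',
)

# Function form: receives (storage, location) and returns the remapped storage.
model = MyLightningModule.load_from_checkpoint(
    checkpoint_path='example.ckpt',
    map_location=lambda storage, loc: storage,
)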
