
Commit

Merge branch 'master' into fix_signal_connector_has_already_handler
awaelchli committed Nov 30, 2021
2 parents f5fa603 + 1d28785 · commit 6bc06ab
Showing 55 changed files with 773 additions and 434 deletions.
10 changes: 8 additions & 2 deletions CHANGELOG.md
@@ -68,10 +68,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Raised an error if the `batch_size` cannot be inferred from the current batch if it contained a string or was a custom batch object ([#10541](https://github.com/PyTorchLightning/pytorch-lightning/pull/10541))


-
- Moved optimizer related logics from `Accelerator` to `TrainingTypePlugin` ([#10596](https://github.com/PyTorchLightning/pytorch-lightning/pull/10596))


-
- Moved `batch_to_device` method from `Accelerator` to `TrainingTypePlugin` ([#10649](https://github.com/PyTorchLightning/pytorch-lightning/pull/10649))


-
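
A quick illustration of the `batch_to_device` move above, as a minimal sketch with hypothetical, simplified class shapes (the real `TrainingTypePlugin` API carries many more hooks):

```python
from typing import Any

import torch


class TrainingTypePlugin:
    def batch_to_device(self, batch: Any, device: torch.device) -> Any:
        # The plugin now owns device transfer, so a strategy (DDP, DeepSpeed, ...)
        # can override how a batch reaches its device.
        return batch.to(device) if hasattr(batch, "to") else batch


class Accelerator:
    def __init__(self, training_type_plugin: TrainingTypePlugin) -> None:
        # After the move, the accelerator holds the plugin and defers to it
        # instead of implementing the transfer itself.
        self.training_type_plugin = training_type_plugin
```
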
@@ -184,6 +184,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

### Fixed

- Fixed support for `--key.help=class` with the `LightningCLI` ([#10767](https://github.com/PyTorchLightning/pytorch-lightning/pull/10767))


- Fixed `_compare_version` for python packages ([#10762](https://github.com/PyTorchLightning/pytorch-lightning/pull/10762))


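The `_compare_version` entry above concerns comparing an installed package's version against a requirement; a minimal sketch of the general pattern, assuming the `packaging` dependency (this is not the library's exact implementation):

```python
import operator
from importlib import import_module

from packaging.version import Version


def compare_version(package: str, op, version: str) -> bool:
    """Return op(installed, required); False if the package is not importable."""
    try:
        pkg = import_module(package)
    except ModuleNotFoundError:
        return False
    # Note: some packages expose no usable __version__ attribute, which is the
    # kind of edge case the linked fix deals with.
    return op(Version(pkg.__version__), Version(version))


# usage: compare_version("torch", operator.ge, "1.8.0")
```
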
@@ -193,6 +196,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Fixed a consolidation error in Lite when attempting to save the state dict of a sharded optimizer ([#10746](https://github.com/PyTorchLightning/pytorch-lightning/pull/10746))


- Fixed the default logging level for batch hooks associated with training from `on_step=False, on_epoch=True` to `on_step=True, on_epoch=False` ([#10756](https://github.com/PyTorchLightning/pytorch-lightning/pull/10756))


- Fixed `SignalConnector._has_already_handler` check for callable type ([#10483](https://github.com/PyTorchLightning/pytorch-lightning/pull/10483))


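The batch-hook logging fix above flips the defaults to per-step logging; written out explicitly inside a `LightningModule`, the equivalent call looks like this (the loss helper is hypothetical):

```python
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def training_step(self, batch, batch_idx):
        loss = self._compute_loss(batch)  # hypothetical helper
        # Corrected default for batch-level training hooks:
        # log every step, do not aggregate over the epoch.
        self.log("train_loss", loss, on_step=True, on_epoch=False)
        return loss
```
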
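The `SignalConnector` entry is the fix this merge targets; the idea, in a minimal sketch assuming handlers are looked up through the standard `signal` module, is to accept any callable handler rather than only plain functions:

```python
import signal


def has_already_handler(signum: int) -> bool:
    # signal.getsignal returns SIG_DFL / SIG_IGN (not callable), None, or the
    # registered handler; user handlers can be any callable (a bound method,
    # a functools.partial, ...), so callable() is the robust check.
    return callable(signal.getsignal(signum))


# usage: has_already_handler(signal.SIGTERM)
```
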
14 changes: 7 additions & 7 deletions dockers/nvidia/Dockerfile
@@ -12,7 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.

- ARG PYTORCH_VERSION=21.07
+ ARG PYTORCH_VERSION=21.11

# https://docs.nvidia.com/deeplearning/frameworks/pytorch-release-notes
FROM nvcr.io/nvidia/pytorch:${PYTORCH_VERSION}-py3
@@ -30,18 +30,18 @@ RUN \
# replace by specific version if asked
if [ ! -z "$LIGHTNING_VERSION" ] ; then \
rm -rf pytorch-lightning ; \
- wget https://github.com/PyTorchLightning/pytorch-lightning/archive/${LIGHTNING_VERSION}.zip --progress=bar:force:noscroll ; \
- unzip ${LIGHTNING_VERSION}.zip ; \
- mv pytorch-lightning-*/* pytorch-lightning ; \
- rm -rf pytorch-lightning-* ; \
- rm *.zip ; \
+ git clone https://github.com/PyTorchLightning/pytorch-lightning.git ; \
+ cd pytorch-lightning ; \
+ git checkout ${LIGHTNING_VERSION} ; \
+ git submodule update --init --recursive ; \
+ cd .. ; \
fi && \
# save the examples
mv pytorch-lightning/_notebooks notebooks && \
mv pytorch-lightning/pl_examples . && \

# Installations
- python .github/prune-packages.py ./pytorch-lightning/requirements/extra.txt "horovod" && \
+ python ./pytorch-lightning/.github/prune-packages.py ./pytorch-lightning/requirements/extra.txt "horovod" && \
pip install "Pillow>=8.2, !=8.3.0" "cryptography>=3.4" "py>=1.10" --no-cache-dir --upgrade-strategy only-if-needed && \
pip install -r ./pytorch-lightning/requirements/extra.txt --no-cache-dir --upgrade-strategy only-if-needed && \
pip install -r ./pytorch-lightning/requirements/examples.txt --no-cache-dir --upgrade-strategy only-if-needed && \
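With the git-based checkout above, a specific Lightning release can be selected at build time, e.g. `docker build --build-arg LIGHTNING_VERSION=1.5.3 -f dockers/nvidia/Dockerfile .` (the version tag here is only illustrative).
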
2 changes: 1 addition & 1 deletion docs/source/common/weights_loading.rst
@@ -168,7 +168,7 @@ To load a model along with its weights, biases and hyperparameters use the following:

.. code-block:: python
-   model = MyLightingModule.load_from_checkpoint(PATH)
+   model = MyLightningModule.load_from_checkpoint(PATH)
print(model.learning_rate)
# prints the learning_rate you used in this checkpoint
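For `model.learning_rate` to be recoverable as shown, the module must persist its constructor arguments; a minimal sketch (the class name matches the docs snippet, the argument is illustrative):

```python
import pytorch_lightning as pl


class MyLightningModule(pl.LightningModule):
    def __init__(self, learning_rate: float = 1e-3):
        super().__init__()
        # Stores init arguments in the checkpoint so load_from_checkpoint
        # can rebuild the module and expose self.learning_rate.
        self.save_hyperparameters()
```
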
4 changes: 2 additions & 2 deletions docs/source/extensions/accelerators.rst
@@ -3,7 +3,7 @@
############
Accelerators
############
- Accelerators connect a Lightning Trainer to arbitrary accelerators (CPUs, GPUs, TPUs, etc). Accelerators
+ Accelerators connect a Lightning Trainer to arbitrary accelerators (CPUs, GPUs, TPUs, IPUs). Accelerators
also manage distributed communication through :ref:`Plugins` (like DP, DDP, HPC cluster) and
can also be configured to run on arbitrary clusters or to link up to arbitrary
computational strategies like 16-bit precision via AMP and Apex.
@@ -26,7 +26,7 @@ One to handle differences from the training routine and one to handle different…
from pytorch_lightning.plugins import NativeMixedPrecisionPlugin, DDPPlugin

accelerator = GPUAccelerator(
- precision_plugin=NativeMixedPrecisionPlugin(16, "cuda"),
+ precision_plugin=NativeMixedPrecisionPlugin(precision=16, device="cuda"),
training_type_plugin=DDPPlugin(),
)
trainer = Trainer(accelerator=accelerator)
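Passing `precision` and `device` as keyword arguments, as the corrected snippet does, makes the construction self-documenting and robust to future reordering of `NativeMixedPrecisionPlugin`'s positional parameters.
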
… 51 of the 55 changed files are not shown here.
