Standard weekly patch release

@awaelchli awaelchli released this 21 Dec 18:33
· 35 commits to release/1.5.x since this release

[1.5.7] - 2021-12-21

Fixed

  • Fixed NeptuneLogger when using DDP (#11030)
  • Fixed a bug to disable logging hyperparameters in logger if there are no hparams (#11105)
  • Avoid the deprecated onnx.export(example_outputs=...) in torch 1.10 (#11116)
  • Fixed an issue when torch-scripting a LightningModule after training with Trainer(sync_batchnorm=True) (#11078)
  • Fixed an AttributeError occurring when using a CombinedLoader (multiple dataloaders) for prediction (#11111)
  • Fixed bug where Trainer(track_grad_norm=..., logger=False) would fail (#11114)
  • Fixed an incorrect warning being produced by the model summary when using bf16 precision on CPU (#11161)
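For context on the hyperparameter-logging fix (#11105): the behavior amounts to skipping the logger call entirely when there are no hparams to record. A minimal illustrative sketch of that guard pattern — the `FakeLogger` and `maybe_log_hyperparams` names below are hypothetical stand-ins, not Lightning's actual internals:

```python
class FakeLogger:
    """Hypothetical stand-in logger that records calls, for illustration only."""

    def __init__(self):
        self.calls = []

    def log_hyperparams(self, hparams):
        self.calls.append(hparams)


def maybe_log_hyperparams(logger, hparams):
    # Skip the logger call when there are no hparams,
    # rather than logging an empty set of hyperparameters.
    if not hparams:
        return
    logger.log_hyperparams(hparams)


logger = FakeLogger()
maybe_log_hyperparams(logger, {})             # no-op: nothing to log
maybe_log_hyperparams(logger, {"lr": 1e-3})   # logged normally
print(len(logger.calls))  # 1
```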

Changed

  • DeepSpeed no longer requires the LightningModule to use ZeRO Stage 3 partitioning (#10655)
  • The ModelCheckpoint callback now saves and restores attributes best_k_models, kth_best_model_path, kth_value, and last_model_path (#10995)
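The ModelCheckpoint change (#10995) means its top-k bookkeeping now round-trips through the checkpoint, so best-model tracking resumes correctly after a restart. A simplified, hypothetical sketch of the `state_dict`/`load_state_dict` pattern involved — not Lightning's actual implementation, just the idea:

```python
class CheckpointState:
    """Minimal stand-in for the bookkeeping a checkpoint callback carries."""

    def __init__(self):
        self.best_k_models = {}       # checkpoint path -> monitored metric value
        self.kth_best_model_path = ""
        self.kth_value = None
        self.last_model_path = ""

    def state_dict(self):
        # Everything returned here is written into the saved checkpoint.
        return {
            "best_k_models": self.best_k_models,
            "kth_best_model_path": self.kth_best_model_path,
            "kth_value": self.kth_value,
            "last_model_path": self.last_model_path,
        }

    def load_state_dict(self, state):
        # Restoring these attributes lets top-k tracking resume where it left off.
        self.best_k_models = state["best_k_models"]
        self.kth_best_model_path = state["kth_best_model_path"]
        self.kth_value = state["kth_value"]
        self.last_model_path = state["last_model_path"]


old = CheckpointState()
old.best_k_models = {"epoch=2.ckpt": 0.41}
old.kth_best_model_path = "epoch=2.ckpt"
old.kth_value = 0.41
old.last_model_path = "last.ckpt"

new = CheckpointState()
new.load_state_dict(old.state_dict())
print(new.kth_value)  # 0.41
```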

Contributors

@awaelchli @borchero @carmocca @guyang3532 @kaushikb11 @ORippler @Raalsky @rohitgr7 @SeanNaren

If we missed anyone because their commit email doesn't match their GitHub account, let us know :]