
Weekly patch release

@Borda released this on 10 Feb 16:57 · c24b4bb

App

Added

  • Added lightning open command (#16482)
  • Added experimental support for interruptible GPU in the cloud (#16399)
  • Added a FileSystem abstraction to simplify manipulating files (#16581)
  • Added Storage Commands (#16606)
    • ls: List files in your Cloud Platform filesystem
    • cd: Change the current directory within your Cloud Platform filesystem (scoped to the terminal session)
    • pwd: Return the current folder in your Cloud Platform filesystem
    • cp: Copy files between your Cloud Platform filesystem and your local filesystem
  • Prevented changing into non-existent folders with cd (#16645)
  • Enabled cp (upload) at project level (#16631)
  • Enabled ls and cp (download) at project level (#16622)
  • Added lightning connect data to register data connections to S3 buckets (#16670)
  • Added support for running with multiprocessing in the cloud (#16624); see the sketch after this list
  • Added an initial plugin server (#16523)
  • Added connect and disconnect for nodes (#16700)
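
The multiprocessing item is the most visible of these additions in app code. Below is a minimal sketch of a Work that fans out to a standard process pool, which per #16624 now also works when the app is dispatched to the cloud; the app layout and names are illustrative assumptions, not taken from the PR.

```python
import multiprocessing as mp

from lightning.app import LightningApp, LightningFlow, LightningWork


def _square(x: int) -> int:
    return x * x


class PoolWork(LightningWork):
    """A Work whose run() fans out to a standard multiprocessing pool."""

    def run(self):
        # Per #16624, spawning processes inside a Work is now supported when
        # the app is dispatched to the cloud, not only when running locally.
        with mp.Pool(processes=4) as pool:
            print("squares:", pool.map(_square, range(8)))


class RootFlow(LightningFlow):
    def __init__(self):
        super().__init__()
        self.pool_work = PoolWork()

    def run(self):
        self.pool_work.run()


app = LightningApp(RootFlow())
```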

Changed

  • Changed the default LightningClient(retry=False) to retry=True (#16382)
  • Added support for an async predict method in PythonServer and removed the torch context (#16453); see the sketch after this list
  • Renamed lightning.app.components.LiteMultiNode to lightning.app.components.FabricMultiNode (#16505)
  • Changed the command lightning connect to lightning connect app for consistency (#16670)
  • Refactored cloud dispatch and updated it to the new API (#16456)
  • Updated app URLs to the latest format (#16568)
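
For the PythonServer change, here is a minimal sketch of what an async predict method can look like. The import paths and the input_type/output_type constructor arguments reflect my reading of the 1.9-era API and should be treated as assumptions; the flow wrapper is only there to make the snippet a complete app.

```python
from pydantic import BaseModel

from lightning.app import LightningApp, LightningFlow
from lightning.app.components.serve import PythonServer


class TextInput(BaseModel):
    text: str


class TextOutput(BaseModel):
    prediction: str


class EchoServer(PythonServer):
    def __init__(self):
        # input_type/output_type are pydantic models describing request/response.
        super().__init__(input_type=TextInput, output_type=TextOutput)

    async def predict(self, request: TextInput) -> TextOutput:
        # predict can now be declared async (#16453), so awaitable work such as
        # calls to other services no longer blocks the server.
        return TextOutput(prediction=request.text.upper())


class ServeFlow(LightningFlow):
    def __init__(self):
        super().__init__()
        self.server = EchoServer()

    def run(self):
        self.server.run()


app = LightningApp(ServeFlow())
```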

Fixed

  • Fixed a deadlock causing apps not to exit properly when running locally (#16623)
  • Fixed the Drive root_folder not being parsed properly (#16454)
  • Fixed malformed path when downloading files using lightning cp (#16626)
  • Fixed app name in URL (#16575)

Fabric

Fixed

  • Fixed error handling for accelerator="mps" and ddp strategy pairing (#16455)
  • Fixed strict availability check for torch_xla requirement (#16476)
  • Fixed an issue where PL would wrap DataLoaders with XLA's MpDeviceLoader more than once (#16571)
  • Fixed the batch_sampler reference for DataLoaders wrapped with XLA's MpDeviceLoader (#16571)
  • Fixed an import error when torch.distributed is not available (#16658)

PyTorch

Fixed

  • Fixed an unintended limitation for calling save_hyperparameters on mixin classes that don't subclass LightningModule/LightningDataModule (#16369); see the sketch after this list
  • Fixed an issue with MLFlowLogger logging the wrong keys with .log_hyperparams() (#16418)
  • Fixed logging more than 100 parameters with MLFlowLogger; long values are now truncated (#16451)
  • Fixed strict availability check for torch_xla requirement (#16476)
  • Fixed an issue where PL would wrap DataLoaders with XLA's MpDeviceLoader more than once (#16571)
  • Fixed the batch_sampler reference for DataLoaders wrapped with XLA's MpDeviceLoader (#16571)
  • Fixed an import error when torch.distributed is not available (#16658)
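
The save_hyperparameters fix is the one most likely to be noticed in user code. Below is a minimal sketch of the now-supported pattern, assuming HyperparametersMixin is imported from lightning.pytorch.core.mixins; the class and attribute names are made up for illustration.

```python
from lightning.pytorch.core.mixins import HyperparametersMixin


class BackboneConfig(HyperparametersMixin):
    """A plain mixin user, not a LightningModule/LightningDataModule subclass."""

    def __init__(self, hidden_dim: int = 128, dropout: float = 0.1):
        super().__init__()
        # Before #16369 this could fail for classes outside LightningModule /
        # LightningDataModule; it now records the init arguments as usual.
        self.save_hyperparameters()


cfg = BackboneConfig(hidden_dim=256)
print(cfg.hparams.hidden_dim)  # 256
print(cfg.hparams.dropout)     # 0.1
```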

Contributors

@akihironitta, @awaelchli, @Borda, @BrianPulfer, @ethanwharris, @hhsecond, @justusschock, @Liyang90, @RuRo, @senarvi, @shenoynikhil, @tchaton

If we forgot to mention someone because their commit email does not match their GitHub account, let us know :]