How to use Lightning-Flash with TPU on Kaggle? #632
I am trying to import flash on Kaggle but constantly run into errors with each of the following versions:

- pytorch_xla-1.7
- pytorch_xla-1.8
- pytorch_xla-1.8.1

I want to know how to use it on Kaggle. Thanks!

PS: I am using the way of installing PyTorch/XLA given below.
Edit: I am installing flash from the master branch.
@prikmm Hi, thanks for opening a discussion! The error may be caused by a mismatched pair of PyTorch and XLA versions. Currently, it's tested only with `torch-xla==1.8`, so please use that version and match the `torch` version with `torch-xla`. You can check the installed versions with `pip list | grep torch`.

Related issue: Lightning-AI/pytorch-lightning#8315
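To make the "matching versions" advice concrete: what has to agree between `torch` and `torch-xla` is the major.minor release (e.g. both `1.8.x`). A minimal sketch of that check — the helper name and the example version strings are hypothetical, not part of Flash or PyTorch/XLA:

```python
# Sketch: check that torch and torch_xla report the same major.minor
# release, which is the pairing the reply above recommends.
# The function name `versions_match` is illustrative, not a real API.

def versions_match(torch_version: str, xla_version: str) -> bool:
    # Strip local build suffixes like "+cpu" or "+cu111", then compare
    # only the first two components ("1.8.1" -> ["1", "8"]).
    torch_mm = torch_version.split("+")[0].split(".")[:2]
    xla_mm = xla_version.split("+")[0].split(".")[:2]
    return torch_mm == xla_mm

print(versions_match("1.8.1", "1.8.1"))      # a matching pair
print(versions_match("1.9.0", "1.8.1"))      # the kind of mismatch that breaks the import
```

In a live Kaggle kernel you would feed this the real strings, e.g. `torch.__version__` and the version `pip list | grep torch` reports for `torch-xla`, and reinstall whichever side is out of step.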