Issues: pytorch/xla
Dataloading hangs on Trillium when using recent wheels
Label: dataloading
#8407 · opened Nov 21, 2024 by miladm
Getting "undefined symbol: _ZN5torch4lazy13MetricFnValueB5cxx11E" with torch-xla nightly wheel for 2.6
#8406 · opened Nov 21, 2024 by jeffhataws
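An undefined C++ symbol like this at import time usually means the installed torch and torch_xla wheels were built against different torch versions. A minimal sanity check, assuming both packages are installed (my sketch, not the reporter's repro):

```python
# Check for a torch / torch_xla wheel mismatch: an undefined C++ symbol such as
# _ZN5torch4lazy13MetricFnValueB5cxx11E at `import torch_xla` usually means the
# two wheels were built against different torch versions.
import torch
print("torch:", torch.__version__)

import torch_xla  # raises the undefined-symbol ImportError when the wheels mismatch
print("torch_xla:", torch_xla.__version__)
```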
Add multiple TPU generations to the TPU CI tests
Labels: CI (CI related change), tpuci, usability (Bugs/features related to improving the usability of PyTorch/XLA)
#8399 · opened Nov 19, 2024 by miladm
Prepare a subsection to educate users on PyTorch workloads on AI-Hypercomputer
Label: documentation
#8389 · opened Nov 18, 2024 by miladm
How to write in-place custom ops compatible with torch.compile using Pallas
#8385 · opened Nov 15, 2024 by soodoshll
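One mechanism relevant to this question is torch.library.custom_op (PyTorch 2.4+), whose mutates_args parameter tells torch.compile that the op writes into its input. The sketch below is hypothetical: the op name and body are placeholders, and the body would be a Pallas kernel call in practice.

```python
# Hypothetical sketch of an in-place (mutating) custom op under torch.compile.
# `mylib::inplace_scale_` is a made-up name; x.mul_ stands in for a real
# Pallas kernel that writes into x.
import torch

@torch.library.custom_op("mylib::inplace_scale_", mutates_args={"x"})
def inplace_scale_(x: torch.Tensor, factor: float) -> None:
    x.mul_(factor)  # placeholder for the in-place Pallas kernel

@torch.compile
def f(x):
    torch.ops.mylib.inplace_scale_(x, 2.0)
    return x
```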
Support splitting a physical axis in HybridMesh
Label: SPMD / Distributed
#8381 · opened Nov 14, 2024 by tengyifei
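For context, HybridMesh today is constructed from a within-slice (ICI) shape and a cross-slice (DCN) shape; the request is to also split a single physical axis. A usage sketch under those assumptions (shapes are illustrative, and building the mesh requires an actual multislice TPU environment):

```python
# Illustrative HybridMesh construction: data parallelism across 2 slices (DCN)
# and model parallelism across 4 chips within each slice (ICI).
import torch_xla.distributed.spmd as xs

mesh = xs.HybridMesh(
    ici_mesh_shape=(1, 4),   # axes within a slice
    dcn_mesh_shape=(2, 1),   # axes across slices
    axis_names=("data", "model"),
)
```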
Precision error when XLA parses irrational numbers leads to results inconsistent with torch CUDA
#8372 · opened Nov 12, 2024 by mars1248
Query regarding using 1 chip (2 cores of TPU v3) for inference
#8359 · opened Nov 6, 2024 by deepakkumar2440
Offer user guide instructions to users to leverage various libtpu versions
Labels: documentation, usability (Bugs/features related to improving the usability of PyTorch/XLA)
#8355 · opened Nov 4, 2024 by miladm
Error when using collective communication via torch_xla.core.xla_model.all_to_all in SPMD mode
#8345 · opened Oct 31, 2024 by DarkenStar
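For reference, the non-SPMD form of this collective looks roughly like the sketch below (my own example, assuming a standard multi-process launch; the issue concerns the SPMD path, where this call errors out):

```python
# Minimal all_to_all sketch: each process splits `value` along dim 0 into
# world-size slices and receives one slice from every peer, concatenated
# back along dim 0.
import torch
import torch_xla.core.xla_model as xm
import torch_xla.runtime as xr
import torch_xla.distributed.xla_multiprocessing as xmp

def _mp_fn(index):
    device = xm.xla_device()
    world = xr.world_size()
    value = torch.full((world, 4), float(index), device=device)
    out = xm.all_to_all(value, split_dimension=0, concat_dimension=0, split_count=world)
    xm.mark_step()  # materialize the lazy computation
    print(index, out.cpu())

if __name__ == "__main__":
    xmp.spawn(_mp_fn, args=())
```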
Bug: using sharding in flash attention with segment ids
#8334 · opened Oct 29, 2024 by dudulightricks