Issues: Lightning-AI/lightning-thunder
Label tracking meta-issue (edit me to get automatically CC'ed...)
#72, opened Mar 25, 2024 by carmocca
Debug mode that associates backward ops to forward ops in the trace
#306, opened Mar 8, 2024 by carmocca
make traces own proxies and bsyms
Labels: enhancement, tracing architecture
#1606, opened Jan 6, 2025 by t-vi
Provide debugging traces and options as an ENV variable or JIT option
Labels: debugging, enhancement
#304, opened Apr 30, 2024 by parthmannan
Investigate the difference in speedup between Llama3-8b and Mistral-7B-v0.1
Labels: enhancement, performance
#1341, opened Oct 22, 2024 by mpatel31415
Support RN50 BatchNorm fusions with cudnn
Labels: cudnn, enhancement
#487, opened May 29, 2024 by vedaanta
Enable Thunder as a torch.compile backend (a largish feature / design)
Labels: enhancement, torch.compile, backend, design
#915, opened Aug 2, 2024 by IvanYashchuk
avoid joint trace in rematerialize forward backward
Labels: rematerialization
#1618, opened Jan 8, 2025 by t-vi
Support NeMo NeVA Model
Labels: enhancement, high priority, nemo, neva, operators
#343, opened May 1, 2024 by athitten
Add a notebook demonstrating usage of Thunder as a Dynamo backend
Labels: dynamo
#963, opened Aug 13, 2024 by IvanYashchuk
fusion_type="dataflow" can lead to invalid trace
Labels: fusion logic
#1858, opened Mar 7, 2025 by kshitij12345
Multiple accesses for non-cached property fail in Thunder JIT
Labels: jit
#729, opened Jul 8, 2024 by IvanYashchuk
dtype inconsistencies when dividing/rounding tensors
Labels: bug
#467, opened May 29, 2024 by k223kim
jit: torch.cuda.stream and other related functionality are silently ignored when jitting
Labels: bug, warnings & errors
#280, opened Apr 25, 2024 by kshitij12345
Difference in backward output compared to eager
Labels: numerical accuracy
#969, opened Aug 14, 2024 by kshitij12345
Different behavior in torch.full: checks if fill_value can be cast to dtype
Labels: operators, thunderfx
#960, opened Aug 12, 2024 by kiya00
Distributed and Bucketing Performance Improvements
Labels: bug, distributed, enhancement, performance
#348, opened May 2, 2024 by parthmannan
value_and_grad returns None gradients with thunder.jit
Labels: bug, jit, transforms
#211, opened Apr 17, 2024 by kshitij12345
Have a method to compare the speed of different parts of training between compilation backends
Labels: enhancement
#444, opened May 22, 2024 by mpatel31415
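Comparing training-phase speed across backends is mostly a harness problem: time each named phase under each backend and tabulate the results. Below is a hedged pure-Python sketch of such a harness; the phase names and the dummy "backends" are illustrative stand-ins, not Thunder's or any compiler's API.

```python
# Hedged sketch: per-phase timing comparison across backends.
import time
from collections import defaultdict

def time_phases(step_fn, phases, repeats=3):
    """Run each named phase `repeats` times; keep the best wall-clock time."""
    results = {}
    for name in phases:
        best = float("inf")
        for _ in range(repeats):
            t0 = time.perf_counter()
            step_fn(name)
            best = min(best, time.perf_counter() - t0)
        results[name] = best
    return results

def compare_backends(backends, phases):
    """backends: mapping of backend name -> step_fn(phase_name).
    Returns {phase: {backend: seconds}} for side-by-side comparison."""
    table = defaultdict(dict)
    for backend_name, step_fn in backends.items():
        for phase, t in time_phases(step_fn, phases).items():
            table[phase][backend_name] = t
    return dict(table)

# Toy usage: dummy work standing in for forward/backward/optimizer steps.
def eager_step(phase):
    sum(range(10_000))

def compiled_step(phase):
    sum(range(1_000))

timings = compare_backends(
    {"eager": eager_step, "compiled": compiled_step},
    phases=["forward", "backward", "optimizer"],
)
```

In practice the phases would be bracketed with CUDA-aware synchronization before each timestamp; the sketch only shows the comparison shape.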
Consider if functions like full_like and expand_as should be symbols (or not)
Labels: operators
#362, opened May 3, 2024 by mruberry
Split saved for backward information based on types
Labels: autograd
#959, opened Aug 12, 2024 by IvanYashchuk
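The core of such a split is partitioning the flat saved-for-backward collection by type, so tensors and non-tensor values (scalars, shapes, flags) can be stored and handled separately. A minimal sketch, assuming a stand-in `FakeTensor` class in place of real tensors; the function name is hypothetical, not Thunder's API.

```python
# Hedged sketch: partition saved-for-backward values by type, keeping order.
class FakeTensor:
    """Stand-in for a real tensor type (e.g. torch.Tensor)."""
    pass

def split_saved_for_backward(saved):
    """Return (tensors, other) with each group preserving original order."""
    tensors = [v for v in saved if isinstance(v, FakeTensor)]
    other = [v for v in saved if not isinstance(v, FakeTensor)]
    return tensors, other

saved = [FakeTensor(), 0.5, FakeTensor(), (1, 2), "mode"]
tensors, other = split_saved_for_backward(saved)
print(len(tensors), other)
# 2 [0.5, (1, 2), 'mode']
```

Keeping the two groups in stable order lets the backward trace reassemble the original flat list by interleaving them according to a recorded type mask.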
Handling inplace through SSA
Labels: enhancement, help wanted
#145, opened Apr 9, 2024 by t-vi
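Handling in-place operations via SSA means giving every assignment to a variable a fresh versioned name (`x` becomes `x0`, `x1`, ...), which turns mutation into a chain of pure definitions. A minimal sketch of that renaming over a toy program representation; the representation is illustrative, not Thunder's trace format.

```python
# Hedged sketch: classic SSA renaming, which makes in-place updates functional.
from collections import defaultdict

def to_ssa(program):
    """program: list of (target, arg_names) tuples, in execution order.
    Returns the same program with versioned names (x -> x0, x1, ...)."""
    version = defaultdict(int)

    def current(v):
        # Latest defined version of v, or the bare name for external inputs.
        return f"{v}{version[v] - 1}" if version[v] else v

    out = []
    for target, args in program:
        renamed_args = [current(a) for a in args]  # read before redefining
        out.append((f"{target}{version[target]}", renamed_args))
        version[target] += 1
    return out

# x = f(a); x += b  (an in-place add); y = g(x)
prog = [("x", ["a"]), ("x", ["x", "b"]), ("y", ["x"])]
print(to_ssa(prog))
# [('x0', ['a']), ('x1', ['x0', 'b']), ('y0', ['x1'])]
```

After renaming, the in-place add is just another pure definition (`x1` from `x0` and `b`), so downstream transforms never observe mutation.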