
avoid compiling compute_loss #511

Closed
wants to merge 1 commit

Conversation

galrotem
Contributor

Summary:

Context

Per the discussion in D48361308, it looks like we can remove the compilation of compute_loss, since it is only used for torch dynamo, which is not officially supported.

This diff

Separates this change from the one above, since some existing unit tests no longer make sense and need to be cleaned up as well.

What’s next

The change above will be rebased on top of this one.

Differential Revision: D48581289
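
The actual diff is not shown on this page, so the following is only a hedged sketch of what "avoid compiling compute_loss" likely amounts to: calling `compute_loss` directly instead of wrapping it with `torch.compile` for dynamo. The class name `AutoUnitSketch` and the loss computation are hypothetical stand-ins, not the real torchtnt `AutoUnit` code.

```python
# Hypothetical sketch -- not the actual torchtnt diff.
#
# Before (assumed): compute_loss was wrapped for torch dynamo, e.g.
#     self._compute_loss = torch.compile(self.compute_loss)
# After: call compute_loss directly, since dynamo support was not
# official and the wrapper only added complexity.

class AutoUnitSketch:
    def compute_loss(self, state, data):
        # Placeholder loss: squared error of a toy "prediction".
        inputs, target = data
        prediction = sum(inputs)            # stand-in for a forward pass
        loss = (prediction - target) ** 2
        return loss, prediction

    def train_step(self, state, data):
        # Direct call -- no torch.compile wrapper around compute_loss.
        loss, outputs = self.compute_loss(state, data)
        return loss, outputs

unit = AutoUnitSketch()
loss, outputs = unit.train_step(None, ([1, 2, 3], 5))
print(loss)  # (6 - 5)**2 == 1
```

With the wrapper gone, subclasses that override `compute_loss` run their plain Python implementation on every step, which also removes the unit tests that asserted the compiled path.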

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D48581289


galrotem added a commit to galrotem/tnt that referenced this pull request Aug 22, 2023
Summary:
Pull Request resolved: pytorch#511

# Context
Per the discussion in D48361308, it looks like we can remove this, since it is only used for torch dynamo, which is not officially supported.

# This diff
This separates the change from the one above, since we also need to clean up some existing unit tests that no longer make sense. The change above will be rebased on top of this one.

Differential Revision: D48581289

fbshipit-source-id: 2808db907b77af373234a12d70c1314217635945
@codecov

codecov bot commented Aug 22, 2023

Codecov Report

Merging #511 (05537e9) into master (a690136) will increase coverage by 0.17%.
The diff coverage is 0.00%.

@@            Coverage Diff             @@
##           master     #511      +/-   ##
==========================================
+ Coverage   86.95%   87.12%   +0.17%     
==========================================
  Files         106      106              
  Lines        8407     8363      -44     
==========================================
- Hits         7310     7286      -24     
+ Misses       1097     1077      -20     
| Files Changed | Coverage | Δ |
| --- | --- | --- |
| tests/framework/test_auto_unit.py | 73.29% <0.00%> | (+1.57%) ⬆️ |
| torchtnt/framework/auto_unit.py | 79.61% <ø> | (-0.07%) ⬇️ |



galrotem added a commit to galrotem/tnt that referenced this pull request Aug 23, 2023
Summary:
Pull Request resolved: pytorch#511

# Context
Per the discussion in D48361308, it looks like we can remove this, since it is only used for torch dynamo, which is not officially supported.

# This diff
This separates the change from the one above, since we also need to clean up some existing unit tests that no longer make sense. The change above will be rebased on top of this one.

Differential Revision: D48581289

fbshipit-source-id: 272c8d9e8a534b6f148b06bfa1797ea81b3bcd8a
Summary:
Pull Request resolved: pytorch#511

# Context
Per the discussion in D48361308, it looks like we can remove this, since it is only used for torch dynamo, which is not officially supported.

# This diff
This separates the change from the one above, since we also need to clean up some existing unit tests that no longer make sense. The change above will be rebased on top of this one.

Reviewed By: JKSenthil

Differential Revision: D48581289

fbshipit-source-id: 94219fa45190467b6b87533500f66735a909d1cb
