
Allow custom optimizers, improve JaxTrainingPlan #1747

Merged
adamgayoso merged 5 commits into main from jax_training on Oct 18, 2022
Conversation

@adamgayoso (Member) commented Oct 17, 2022

  • Easily add custom optax/PyTorch optimizers (see the sketch below)
  • Breaking change: arguments to the training plans are now keyword-only
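A minimal sketch of the kind of custom optax optimizer this enables, assuming a JAX-based module; how the optimizer is handed to JaxTrainingPlan (the keyword used in the commented line below) is an assumption for illustration, not the confirmed API:

import optax

# Gradient clipping chained with AdamW on a cosine-decayed learning rate.
schedule = optax.cosine_decay_schedule(init_value=1e-3, decay_steps=10_000)
custom_optimizer = optax.chain(
    optax.clip_by_global_norm(1.0),
    optax.adamw(learning_rate=schedule, weight_decay=1e-4),
)

# Hypothetical hand-off to the training plan; the `optimizer` keyword is an
# assumption here, and the call style reflects the keyword-only change above.
# plan = JaxTrainingPlan(module, optimizer=custom_optimizer)

With the keyword-only change, a positional call such as JaxTrainingPlan(module, 1e-3) would raise a TypeError; every hyperparameter has to be passed by name.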

@adamgayoso marked this pull request as ready for review on October 17, 2022 17:56
codecov bot commented Oct 17, 2022

Codecov Report

Base: 90.96% // Head: 90.87% // Decreases project coverage by 0.08% ⚠️

Coverage data is based on head (4d80c89) compared to base (00e0342).
Patch coverage: 76.47% of modified lines in this pull request are covered.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1747      +/-   ##
==========================================
- Coverage   90.96%   90.87%   -0.09%     
==========================================
  Files         116      116              
  Lines        9757     9764       +7     
==========================================
- Hits         8875     8873       -2     
- Misses        882      891       +9     
Impacted Files                   Coverage           Δ
scvi/train/_trainingplans.py     93.46% <75.00%>    -2.20% ⬇️
scvi/model/base/_jaxmixin.py     88.23% <100.00%>   +0.35% ⬆️


@adamgayoso added this to the 0.19.0 milestone on Oct 17, 2022
Comment on lines +84 to +88
# Allclose because on GPU, the values are not exactly the same
# as latents are moved to cpu in latent mode
np.testing.assert_allclose(
params_latent[k], params_orig[k], rtol=3e-1, atol=5e-1
)
@adamgayoso (Member, Author) commented:
cc @watiss needed to change this. LMK if the comment makes sense!

@martinkim0 (Contributor) left a comment:

LGTM

@adamgayoso merged commit c408241 into main on Oct 18, 2022
@adamgayoso deleted the jax_training branch on October 18, 2022 17:19