
[FEAT] Alpha differentiability in semirelaxed_gromov_wasserstein2 #483

Merged

Conversation

@SoniaMaz8 (Contributor) commented Jun 8, 2023

Types of changes

Addition of differentiability over the trade-off parameter $\alpha$ in the semirelaxed_gromov_wasserstein2 function in ot/gromov/_semirelaxed.py.
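For context, a sketch of the math involved (assuming the standard fused semirelaxed formulation used in POT; this is not quoted from the PR itself). The trade-off parameter $\alpha$ weights the feature term against the structure term:

$$
\mathrm{srFGW}_\alpha = \min_{T \ge 0,\; T\mathbf{1} = p}\; (1-\alpha)\,\langle M, T\rangle + \alpha \sum_{i,j,k,l} L(C_{1,ik}, C_{2,jl})\, T_{ij}\, T_{kl}
$$

Because the objective is linear in $\alpha$ for a fixed plan, the envelope theorem gives the gradient at the optimal plan $T^\star$ as the difference of the two terms:

$$
\frac{\partial\, \mathrm{srFGW}_\alpha}{\partial \alpha} = \sum_{i,j,k,l} L(C_{1,ik}, C_{2,jl})\, T^\star_{ij}\, T^\star_{kl} - \langle M, T^\star\rangle
$$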

Motivation and context / Related issue

How has this been tested (if it applies)

PR checklist

  • I have read the CONTRIBUTING document.
  • The documentation is up-to-date with the changes I made (check build artifacts).
  • All tests passed, and additional code has been covered with new tests.
  • I have added the PR and Issue fix to the RELEASES.md file.

@rflamary (Collaborator) commented Jun 9, 2023

Hello @SoniaMaz8, that is great.

You should also check that the gradients are properly set during a .backward() in PyTorch by editing the following test:

def test_semirelaxed_fgw2_gradients():
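As a minimal, self-contained stand-in for what such a gradient test asserts (hypothetical helper names, not POT's actual test code): once the optimal plan is held fixed, the fused loss is linear in $\alpha$, so its analytic $\alpha$-gradient can be checked against finite differences.

```python
def fused_loss(alpha, struct_cost, feat_cost):
    # Hypothetical stand-in for the srFGW objective at a *fixed* optimal plan:
    # a convex combination of a structure term and a feature term.
    return alpha * struct_cost + (1.0 - alpha) * feat_cost

def alpha_gradient(struct_cost, feat_cost):
    # Envelope theorem: with the plan held fixed, d loss / d alpha
    # is the structure term minus the feature term.
    return struct_cost - feat_cost

# Finite-difference check, mirroring what a gradient unit test asserts
# after a .backward() call in the real PyTorch test.
struct_cost, feat_cost, eps = 2.0, 0.75, 1e-6
for alpha in (0.1, 0.5, 0.9):
    fd = (fused_loss(alpha + eps, struct_cost, feat_cost)
          - fused_loss(alpha - eps, struct_cost, feat_cost)) / (2 * eps)
    assert abs(fd - alpha_gradient(struct_cost, feat_cost)) < 1e-8
print("alpha-gradient matches finite differences")
```

In the actual PyTorch test, the analogous assertion is that `alpha.grad` is not `None` and is finite after calling `.backward()` on the returned loss.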

@rflamary rflamary changed the title Alpha differentiability in semirelaxed_gromov_wasserstein2 [FEAT] Alpha differentiability in semirelaxed_gromov_wasserstein2 Jun 12, 2023
@rflamary rflamary merged commit f0dab2f into PythonOT:master Jun 12, 2023