
Check if alpha is not None when restricting it to be at most 1 #199

Merged: @rflamary merged 5 commits into PythonOT:master from panispani:alpha_line_search_armijo on Aug 24, 2020

Conversation

@panispani (Contributor) commented on Jul 12, 2020

scipy.optimize.linesearch.scalar_search_armijo can return alpha=None (see the function's definition in SciPy). In optim.py:line_search_armijo, as part of a previous issue (#184), we restrict the value of alpha to be at most 1 with min(1, alpha), but min(1, None) raises a TypeError. In this PR we simply check that alpha is not None before making the comparison with 1; a minimal sketch of the failure mode and the fix follows the list below.

Types of changes

  • Docs change / refactoring / dependency upgrade
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
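The following is a minimal, self-contained sketch of the failure mode and the guard described above, not the actual ot/optim.py implementation: the name line_search_armijo_sketch and its exact signature are illustrative, and the import assumes the 2020-era SciPy layout where scalar_search_armijo lives in scipy.optimize.linesearch (newer SciPy releases move it to a private module).

import numpy as np
from scipy.optimize.linesearch import scalar_search_armijo


def line_search_armijo_sketch(f, xk, pk, gfk, old_fval, c1=1e-4, alpha0=0.99):
    # Armijo line search along direction pk, with the step clamped to at most 1.
    fc = [0]  # counts objective evaluations

    def phi(alpha1):
        fc[0] += 1
        return f(xk + alpha1 * pk)

    phi0 = old_fval if old_fval is not None else phi(0.)
    derphi0 = np.sum(pk * gfk)  # directional derivative at alpha = 0

    alpha, phi1 = scalar_search_armijo(
        phi, phi0, derphi0, c1=c1, alpha0=alpha0)

    # scalar_search_armijo returns alpha=None when it cannot find a step
    # satisfying the Armijo condition; min(1, None) raises a TypeError,
    # so fall back to a step of 1 in that case (this is what the PR fixes).
    if alpha is None:
        return 1., fc[0], phi1
    return min(1., float(alpha)), fc[0], phi1


# Example: an objective that always returns nan makes scalar_search_armijo
# give up and return alpha=None, so the guard above kicks in.
alpha, n_eval, _ = line_search_armijo_sketch(
    lambda x: np.nan, np.zeros(2), np.ones(2), np.ones(2), old_fval=0.)
print(alpha)  # 1.0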

Motivation and context / Related issue

Issue: #198

How has this been tested (if it applies)

Checklist

  • The documentation is up-to-date with the changes I made.
  • I have read the CONTRIBUTING document.
  • All tests passed, and additional code has been covered with new tests.

@panispani force-pushed the alpha_line_search_armijo branch from 4203747 to ae7fbfe on July 12, 2020 at 14:35
ot/optim.py Outdated
@@ -69,7 +69,7 @@ def phi(alpha1):
     alpha, phi1 = scalar_search_armijo(
         phi, phi0, derphi0, c1=c1, alpha0=alpha0)
 
-    return min(1, alpha), fc[0], phi1
+    return min(1, alpha) if alpha is not None else 1, fc[0], phi1
Collaborator commented:

Could you put the test on a separate line, with a small comment about why it is necessary?

It is a bit hard to follow inside the return.
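For context, the kind of rewrite this comment asks for could look like the fragment below. It is only a sketch reusing the variables already shown in the diff above (phi, phi0, derphi0, c1, alpha0, fc); the commits actually pushed to this PR may differ.

alpha, phi1 = scalar_search_armijo(
    phi, phi0, derphi0, c1=c1, alpha0=alpha0)

# scalar_search_armijo returns alpha=None when it cannot find a step
# satisfying the Armijo condition; fall back to a full step of 1 so that
# min(1, alpha) below never receives None.
if alpha is None:
    alpha = 1.

return min(1., alpha), fc[0], phi1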

Collaborator commented:

Also, I guess you have an example where the original function failed. Could you add a simple non-regression test in test_optim.py to check that this problem never reappears? It feels as if this function always needs tuning ;)
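A non-regression test along these lines could look like the sketch below. The input values and the test name are illustrative and the test actually added to test_optim.py in this PR may differ; the key point is that an objective returning nan makes SciPy's scalar_search_armijo fail to satisfy the Armijo condition and return alpha=None, which is exactly the case that used to raise a TypeError.

import numpy as np
import ot


def test_line_search_armijo_handles_alpha_none():
    xk = np.array([[0.25, 0.25], [0.25, 0.25]])
    pk = np.array([[-0.25, 0.25], [0.25, -0.25]])
    gfk = np.array([[1., 1.], [1., 1.]])
    old_fval = -123.

    # The objective always returns nan, so no step can satisfy the Armijo
    # condition and scalar_search_armijo returns alpha=None internally.
    alpha, _, _ = ot.optim.line_search_armijo(
        lambda x: np.nan, xk, pk, gfk, old_fval)

    # With the fix, line_search_armijo falls back to alpha=1 instead of
    # raising a TypeError on min(1, None).
    assert alpha == 1.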

Collaborator commented:

Hello @panpan2, could you please take into account my comment so that we can merge your PR?

@panispani (Contributor, author) commented:

Updated! :)

@panispani (Contributor, author) commented:

@rflamary any other changes needed to merge this?

@rflamary merged commit 24a7a04 into PythonOT:master on Aug 24, 2020