
Rprop Fix#1421

Merged
11 commits merged into dotnet:main from alinpahontu2912:rprop_fix
Dec 23, 2024

Conversation


@alinpahontu2912 alinpahontu2912 commented Dec 6, 2024

Fixes #1417

Remove the unnecessary "max_step not zero" check that modified the gradient. Based on the algorithm here: https://pytorch.org/docs/stable/_modules/torch/optim/rprop.html#Rprop
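For context, a minimal Python sketch of the per-parameter Rprop update as described in the linked PyTorch source. This is illustrative only (the function name `rprop_step` and its scalar signature are made up for this example, not TorchSharp's or PyTorch's API); the key point is that the gradient is only zeroed on a sign flip, and the step size clamping does not otherwise modify the gradient:

```python
def rprop_step(param, grad, prev_grad, step_size,
               eta_minus=0.5, eta_plus=1.2,
               min_step=1e-6, max_step=50.0):
    """One scalar Rprop update (illustrative sketch, not a library API)."""
    s = grad * prev_grad
    if s > 0:
        # Gradient kept its sign: grow the step size, capped at max_step.
        step_size = min(step_size * eta_plus, max_step)
    elif s < 0:
        # Sign flip: shrink the step size and skip this update by
        # zeroing the gradient (which is then carried as prev_grad).
        step_size = max(step_size * eta_minus, min_step)
        grad = 0.0
    # Move against the sign of the (possibly zeroed) gradient.
    sign = (grad > 0) - (grad < 0)
    param = param - sign * step_size
    return param, grad, step_size
```

For example, starting from `param=1.0`, `grad=0.5`, `prev_grad=0.5`, `step_size=0.1`, the step size grows to `0.12` and the parameter moves to `0.88`; a subsequent sign flip halves the step size and leaves the parameter untouched for that iteration.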


@NiklasGustafsson left a comment


Maybe I'm missing something, but I don't see what the fix is, unless the default LR is actually changed (but I think it's not).

@alinpahontu2912 alinpahontu2912 marked this pull request as ready for review December 16, 2024 13:59

ghost commented Dec 17, 2024

I'm assuming that change is based on this documentation:
https://pytorch.org/docs/2.5/_modules/torch/optim/rprop.html#Rprop.step

@alinpahontu2912, would it be possible to add more information to the PR description about what was wrong and what was changed?

@alinpahontu2912
Member Author

The results for the code submitted by the issue owner in #1417 (comment) are here:
current_version.txt
after_fix_results.txt

@ghost ghost requested review from a user and NiklasGustafsson December 20, 2024 10:20
@ghost ghost merged commit 3760ba3 into dotnet:main Dec 23, 2024
@alinpahontu2912 alinpahontu2912 deleted the rprop_fix branch February 21, 2025 10:06


Development

Successfully merging this pull request may close these issues.

Optim.Rprop issue?

2 participants