Fix issue where precomputed knn aren't being reused #2171
Conversation
@manivaradarajan: Let’s make sure to get this in for the 1.14 release.
Thanks. This looks correct, and appears to fix the divergence for me.
Before this commit, running t-SNE with perplexity=8 then re-running with
perplexity=100 and waiting for a minute or so caused clusters to spiral
farther and farther apart. After this commit, it behaves correctly.
Almost a shame—the wrong behavior was actually really pretty. :-)
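For readers outside this thread: the idea behind "reusing precomputed knn" is that the k-nearest-neighbor table only depends on the input points, not on the perplexity, so it can be computed once and sliced when t-SNE is re-run with a new perplexity. Here is a minimal standalone sketch of that caching pattern (not the actual TensorBoard code, which is TypeScript; the names and the `k = 3 * perplexity` convention are assumptions for illustration):

```python
import numpy as np

def compute_knn(points, k):
    """Brute-force k-nearest-neighbor indices and distances (O(n^2) memory)."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)  # a point is not its own neighbor
    idx = np.argsort(dists, axis=1)[:, :k]
    return idx, np.take_along_axis(dists, idx, axis=1)

# Compute neighbors ONCE for the largest k we might need, then reuse the
# table when the user changes perplexity, instead of recomputing knn.
points = np.random.RandomState(0).rand(200, 5)
max_k = min(3 * 100, len(points) - 1)  # enough for perplexity up to 100
knn_idx, knn_dist = compute_knn(points, max_k)

for perplexity in (8, 100):
    k = min(3 * perplexity, knn_idx.shape[1])
    neighbors = knn_idx[:, :k]  # slice of the precomputed table; no recompute
```

The bug described above is what happens when this invariant is violated: if a re-run silently recomputes or mismatches the cached table, the conditional probabilities t-SNE builds from the neighbor distances drift, and the embedding diverges over time.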
@wchargin I wound up fixing another knn-related bug in this PR (one that @nsthorat found while debugging the standalone release). You'll probably want to give it another look. tl;dr: there was some slight drift between the
Ah, didn’t see your comment before starting the review. Thanks for the
OK cool, I'll move the other fix into a separate PR.
New PR opened here
Great; thank you! Ideally the whitespace changes on otherwise-unchanged
lines in this PR should be omitted, too—what I meant by my comment was
“please avoid introducing new whitespace errors”. If we could keep
future PRs localized, that’d be great.
Fixes #2082.