
fix index error when computing ppl on long-text prompt #2697

Merged 2 commits into InternLM:main on Nov 1, 2024

Conversation

@lvhan028 (Collaborator) commented on Nov 1, 2024

Fix #2693

@MaiziXiao left a comment


LGTM

@irexyc (Collaborator) commented on Nov 1, 2024

Some users have reported that get_ppl actually returns the loss value, so exp(loss) is needed to obtain the perplexity. Should we clarify this in the documentation?
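
A minimal sketch of what that clarification might look like, assuming `get_ppl` accepts tokenized inputs and returns the mean token-level loss (negative log-likelihood) per sequence, as described in the comment above; the model path and prompt are placeholders:

```python
import math

from lmdeploy import pipeline

# Hypothetical example: model name and prompt are placeholders, and we assume
# get_ppl returns the average per-token loss for each input sequence.
pipe = pipeline('internlm/internlm2_5-7b-chat')

prompt = 'A long prompt whose perplexity we want to measure ...'
input_ids = pipe.tokenizer.encode(prompt)

# get_ppl returns the loss, not the perplexity; exponentiate to get ppl.
loss = pipe.get_ppl([input_ids])[0]
ppl = math.exp(loss)
print(f'loss={loss:.4f}, ppl={ppl:.4f}')
```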

@lvhan028 lvhan028 merged commit 993aa14 into InternLM:main Nov 1, 2024
5 checks passed
lvhan028 added a commit that referenced this pull request Nov 5, 2024
* fix index error when computing ppl on long-text prompt

* update user guide
AllentDan pushed a commit to AllentDan/lmdeploy that referenced this pull request Nov 13, 2024
* fix index error when computing ppl on long-text prompt

* update user guide

Merging this pull request closed the following issue:

[Bug] cannot get gpqa's score on Qwen2.5-7b model by using lmdeploy backend and opencompass
3 participants