Conversation

@AesSedai (Contributor) commented Oct 5, 2025

There was a recent PR on mainline that added additional percentiles to llama-perplexity. Since I was doing perplexity testing with ik_llama anyway, I figured I'd put up a PR to fix the 90% percentile being listed twice and to add the other percentiles to match mainline.

I compiled and ran a llama-perplexity run to verify that the output is showing correctly:

...
====== KL divergence statistics ======
Mean    KLD:   0.017475 ±   0.000763
Maximum KLD:  11.152802
99.9%   KLD:   1.098822
99.0%   KLD:   0.221226
95.0%   KLD:   0.055542
90.0%   KLD:   0.029520
Median  KLD:   0.004062
10.0%   KLD:   0.000020
 5.0%   KLD:   0.000005
 1.0%   KLD:   0.000000
 0.1%   KLD:  -0.000003
Minimum KLD:  -0.000423
...

