Replies: 2 comments 2 replies
@rim99 If the beginning of the output is normal but meaningless characters appear toward the end, that is a different issue. Thank you!
@NeoZhangJianyu The command I use:
A screenshot of a similar case (not exactly the case matching these logs):
Also, the version info:
Let me know if you need anything else. I'm happy to help!

I'm experimenting with llama.cpp's SYCL backend on the integrated graphics of a 13th Gen Intel Core CPU.
I find that once the context grows beyond roughly 750 tokens, the 1.7B model I'm testing starts to generate meaningless characters.
The ~750-token limit disappears if I add the `-nkvo` flag to the command line.
Is this expected, or is it something that needs to be fixed?
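For reference, a minimal invocation sketch of the workaround described above. The binary path, model path, and prompt are assumptions; `-nkvo` is llama.cpp's short form of `--no-kv-offload`, which keeps the KV cache in host memory instead of offloading it to the GPU:

```shell
# Hypothetical example paths -- adjust to your local SYCL build and model.
# Without -nkvo, output degrades past ~750 tokens of context on this iGPU;
# with -nkvo, the KV cache stays in host RAM and the corruption goes away.
./build/bin/llama-cli \
    -m models/model-1.7b.gguf \
    -ngl 99 \
    -c 2048 -n 512 \
    -p "Your prompt here" \
    -nkvo
```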