
perf streaming output error #64

Open · ccly1996 opened this issue Jul 1, 2024 · 5 comments

Comments

@ccly1996 commented Jul 1, 2024

Adding the --stream flag causes an error during the streaming-output test:
[screenshot of the error]
Non-streaming output works without issue.

@ccly1996 (Author) commented Jul 1, 2024

Command: llmuses perf --url 'http://127.0.0.1:8000/v1/chat/completions' --parallel 128 --model 'gpt-4-32k' --log-every-n-query 10 --read-timeout=120 -n 1 --max-prompt-length 128000 --api openai --stream --n-choices 3 --stop-token-ids 128001 128009 --dataset openqa --dataset-path 'open_qa.jsonl'
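For context, with --stream an OpenAI-compatible endpoint sends the response as server-sent events, one `data:` line per chunk, terminated by a `[DONE]` sentinel. A minimal sketch of consuming such a stream (the payloads below are illustrative, not captured from the failing service):

```python
import json

# Hypothetical raw SSE lines as an OpenAI-compatible server might emit them.
sse_lines = [
    'data: {"object": "chat.completion.chunk", "choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"object": "chat.completion.chunk", "choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]

def collect_stream_text(lines):
    """Concatenate delta text from 'data:' SSE lines until the [DONE] sentinel."""
    parts = []
    for line in lines:
        if not line.startswith("data:"):
            continue  # ignore comments / keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        for choice in chunk.get("choices", []):
            parts.append(choice.get("delta", {}).get("content") or "")
    return "".join(parts)
```

A parser that assumes fields beyond `data:` framing and `choices[].delta.content` is where compatibility breaks tend to surface.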

@liuyhwangyh (Collaborator) commented

This looks like a compatibility issue. Which service are you load-testing, and what format does its stream return? Please share an example; your command assumes an OpenAI-compatible API.

@ccly1996 (Author) commented Jul 3, 2024 via email

The service is vLLM deployed with FastChat, using the OpenAI API. I forgot to take a screenshot.

@liuyhwangyh (Collaborator) commented

The OpenAI-compatible API currently returns "object": "chat.completion" for non-streaming responses and "object": "chat.completion.chunk" for streaming responses. Your service's results may not be compatible with the current OpenAI API handling; if you can provide a sample response, the code can be updated for compatibility.
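The non-streaming vs. streaming distinction described above can be sketched as a check on the `object` field (the payloads here are illustrative examples of the two formats, not output from the service in question):

```python
import json

# Illustrative OpenAI-compatible payloads for the two response modes.
non_stream = json.loads(
    '{"object": "chat.completion", "choices": [{"message": {"content": "hi"}}]}'
)
stream_chunk = json.loads(
    '{"object": "chat.completion.chunk", "choices": [{"delta": {"content": "h"}}]}'
)

def is_stream_chunk(payload: dict) -> bool:
    """Return True when a payload declares the streaming chunk format."""
    return payload.get("object") == "chat.completion.chunk"
```

Code that dispatches strictly on this field will fail against servers that omit it, which is the incompatibility being discussed.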

@jinweida commented

[screenshot] I have the same problem: the streaming response may not contain the "object" field. [screenshot]
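One way to tolerate a missing "object" field, as reported above, is to detect streaming chunks structurally via the "delta" key instead of hard-failing. A minimal sketch (the function name and fallback strategy are my suggestion, not the project's actual code):

```python
def extract_delta_text(chunk: dict) -> str:
    """Extract incremental text from a stream chunk, tolerating a missing 'object' field.

    Some OpenAI-compatible servers (e.g. certain FastChat/vLLM deployments) omit
    "object" in streamed chunks, so we fall back to looking for "delta" directly.
    """
    for choice in chunk.get("choices") or []:
        delta = choice.get("delta")
        if delta is not None:
            return delta.get("content") or ""
    return ""
```

With this approach, a chunk like `{"choices": [{"delta": {"content": "hi"}}]}` parses the same whether or not "object" is present.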

3 participants