It would be nice to add support for custom model names.
Currently the client only displays gpt-4* and gpt-3* models to select.
(qwen-v1, ernie, spark, llama, chatglm are not available to select)
For self-hosted deployments the model names can be different; these names can be queried from */v1/models.
Currently the actual self-hosted model names are not displayed, and there is no option to select them or enter them manually.
So it leads to an error:
"object": "error",
"message": "Only ABC MODEL allowed now, your model gpt-3.5-turbo",
"code": 40301
It would be nice to add:

1. A parameter to set the default model name, so every user does not need to select it manually:
   docker run -e MODEL_NAME="ABC MODEL"
2. Querying the available model names from */v1/models and displaying the actual model names in the settings (a sketch follows below).
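A rough sketch of how the settings page could combine both suggestions: build the model dropdown from the ids returned by */v1/models and pre-select the value of the proposed MODEL_NAME variable. Function and type names here are illustrative, not existing client code.

```typescript
// Hypothetical helper: merge server-reported models with the MODEL_NAME default.
interface ModelOption {
  id: string;
  selected: boolean;
}

function buildModelOptions(serverModels: string[], defaultModel?: string): ModelOption[] {
  const ids = new Set(serverModels);
  // Keep the configured default even if the /v1/models call returned nothing useful.
  if (defaultModel) ids.add(defaultModel);
  return [...ids].map((id) => ({
    id,
    selected: id === (defaultModel ?? serverModels[0]),
  }));
}

// Example: a self-hosted server serving chatglm and llama, default taken from MODEL_NAME.
console.log(buildModelOptions(["chatglm", "llama"], process.env.MODEL_NAME));
```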