Is your feature request related to a problem? Please describe.
I have my own GGUF file. I was loading it like this:
but the "model_output" is of string type, and its format is not consistent. Thus I can't extract the desired output from this string using a common function.
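To illustrate the inconsistency: the same prompt can come back as valid JSON one time and free text the next, so a single fixed parser breaks. A minimal sketch (both raw completion strings below are hypothetical examples, not actual model output):

```python
import json

# Two hypothetical raw completions for the same prompt:
outputs = [
    '{"winner": "Los Angeles Dodgers"}',       # sometimes valid JSON
    "The Dodgers won the 2020 World Series.",  # sometimes free text
]

def try_parse(text):
    """Return a dict if the text is valid JSON, else None."""
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return None

parsed = [try_parse(t) for t in outputs]
# Only the first completion survives parsing; the second yields None,
# which is exactly why a "common function" can't extract the answer.
```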
Describe the solution you'd like
I want to know a way to get the output in a consistent, structured format.
Describe alternatives you've considered
I've tried this, but it doesn't work:
from llama_cpp import Llama

llm = Llama(
    model_path=model_pth,
    n_ctx=CONTEXT_SIZE,
    chat_format="alpaca",
)
llm.create_chat_completion(
    messages=[
        {
            "role": "system",
            "content": "You are a helpful assistant that outputs in JSON.",
        },
        {"role": "user", "content": "Who won the world series in 2020"},
    ],
    response_format={
        "type": "json_object",
    },
    temperature=0.7,
)
Additional context
I am new to this field; any suggestion is appreciated. Thanks!