Using Chat Templates in SequenceGeneratorAdapter #987
Labels: prompts
Related: #756
What behavior of the library made you think about the improvement?
Currently, when using
outlines.generate
, chat templates aren't applied by default, so it's awkward and unintuitive to structure your prompts as chat templates yourself. For example, a well-structured input for a llama-3 model needs to be wrapped in the llama-3 chat template by hand; I'd prefer that the library handle this.
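For illustration, here is what hand-wrapping a prompt looks like. The special-token strings below follow Meta's published llama-3 instruct format; the helper function is only a sketch of the boilerplate a user currently has to write, not part of Outlines:

```python
# Hand-constructing a llama-3-style instruct prompt. The special tokens
# follow Meta's published llama-3 instruct format; writing this wrapper
# by hand is exactly the awkwardness described above.
def llama3_prompt(user_message: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = llama3_prompt("Describe the benefits of chat templates.")
```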
Why We Should Apply Chat Templates by Default
Without a chat template, the model emulates the continuation of a monologue, whereas the chat-template format follows a query-response structure.
No Chat Template
With Chat Template
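A minimal sketch of the difference in what the model actually receives. The template-applying function below is a stand-in for what a tokenizer's chat template would produce (llama-3 instruct format used as the example):

```python
# Contrast the two inputs. `apply_llama3_template` stands in for a real
# tokenizer's chat-template application.
def apply_llama3_template(messages: list) -> str:
    out = "<|begin_of_text|>"
    for m in messages:
        out += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    # trailing assistant header cues the model to respond
    out += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return out

question = "Summarize the plot of Hamlet."

no_template = question                  # model continues a monologue
with_template = apply_llama3_template(  # model answers as an assistant
    [{"role": "user", "content": question}]
)
```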
How would you like it to behave?
By default
generator(prompt)
applies the chat template. The current behavior should remain available via
generator(prompt, raw=True)
Alternatively, it might make sense to put the raw argument in the generator-constructing function, e.g.
outlines.generate.text(model, raw=True)
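A rough sketch of the proposed behavior. The names here (`raw`, `prepare_prompt`, `apply_chat_template_fn`) are hypothetical, not part of Outlines today; for a transformers-backed model the templating step would presumably delegate to the tokenizer's `apply_chat_template`:

```python
# Hypothetical sketch of the proposed behavior: apply the chat template
# by default, bypass it with raw=True. `apply_chat_template_fn` stands
# in for the model tokenizer's chat-template application.
def prepare_prompt(prompt: str, apply_chat_template_fn, raw: bool = False) -> str:
    if raw:
        # raw=True: pass the prompt through untouched (current behavior)
        return prompt
    # default: wrap the user prompt in the model's chat template
    return apply_chat_template_fn([{"role": "user", "content": prompt}])

# Toy template function standing in for a real tokenizer.
def fake_template(messages: list) -> str:
    return f"<user>{messages[0]['content']}</user><assistant>"

templated = prepare_prompt("hi", fake_template)
untouched = prepare_prompt("hi", fake_template, raw=True)
```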