Infer DSL and automatic input serialization #327
Merged
Feature Enhancement: Automatic Serialization and Generation with Infer DSL
This pull request enables automatic serialization of inputs as prompts to the ChatWithFunctions model. It also introduces the Infer DSL, a tool that tags inputs with generation placeholders for the LLM. This approach offers greater flexibility and control over the input data used to generate the desired outputs.

Key Features:
The Infer DSL lets you tag your input data with placeholders that give you additional control over the LLM's behavior. Using the Infer DSL, you can specify parameters such as temperature and other settings directly within the input, with expressions such as inferString, inferString(Config("temperature" to "0.0")), or any other ad-hoc SudoLang-style parameters you think the LLM may understand (see the sketch at the end of this description).

Example Usage:
Consider the example code snippets from the diff, which create a recipe from a mix of user-supplied and LLM-inferred parameters through inferString, inferInt, etc. In the first example, structured input (Question) is automatically serialized and used as a prompt for the LLM. In the second example, the Infer DSL is used to customize the input with generation placeholders, allowing you to dynamically set temperature and other parameters.
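The actual snippets live in the PR diff; the following is only a minimal, self-contained sketch of the two patterns. The Question class, its fields, and the stand-in definitions of Config and the infer* placeholders are assumptions made for illustration; only the names inferString, inferInt, and Config("temperature" to "0.0") come from this description.

```kotlin
// Minimal, self-contained sketch: stand-in types, NOT the real library API.
// Only the names inferString, inferInt, and Config come from this PR's description.

// Stand-in for the Infer DSL's ad-hoc parameter bag (e.g. temperature).
data class Config(val params: Map<String, String>) {
    constructor(vararg pairs: Pair<String, String>) : this(pairs.toMap())
}

// Stand-in placeholders: in the real DSL these would be recognised during
// serialization and replaced with LLM-generated values.
val inferString: String = "<infer:string>"
val inferInt: Int = Int.MIN_VALUE
fun inferString(config: Config): String =
    "<infer:string ${config.params.entries.joinToString { "${it.key}=${it.value}" }}>"

// Hypothetical structured input; the field names are illustrative only.
data class Question(val dish: String, val cuisine: String, val servings: Int)

fun main() {
    // Example 1: a fully user-supplied Question is serialized as-is and sent
    // as the prompt to the ChatWithFunctions model.
    val userQuestion = Question(dish = "chocolate cake", cuisine = "French", servings = 4)

    // Example 2: the Infer DSL tags some fields with generation placeholders,
    // letting the LLM fill them in; Config carries parameters such as temperature.
    val taggedQuestion = Question(
        dish = "chocolate cake",                               // user-supplied
        cuisine = inferString(Config("temperature" to "0.0")), // inferred, with params
        servings = inferInt                                    // inferred numeric value
    )

    println(userQuestion)
    println(taggedQuestion)
}
```

In the real DSL the placeholders would be picked up when the input is serialized into the prompt; the stand-in definitions above exist only so the sketch compiles on its own.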