Infer DSL and automatic input serialization #327

Merged · 6 commits into main · Aug 16, 2023

Conversation

raulraja (Contributor)

Feature Enhancement: Automatic Serialization and Generation with Infer DSL

This pull request adds automatic serialization of inputs as prompts for the ChatWithFunctions model. It also introduces the Infer DSL, a tool that tags inputs with generation placeholders for the LLM. This new approach offers greater flexibility and control over the input data used to generate the desired outputs.

Key Features:

  1. Automatic Serialization: With this enhancement, structured input data is automatically serialized before being sent to the LLM. This ensures that complex data structures, such as objects or lists, are effectively transmitted and processed by the LLM.
package com.xebia.functional.xef.auto

import com.xebia.functional.xef.auto.llm.openai.OpenAI
import com.xebia.functional.xef.auto.llm.openai.prompt
import kotlinx.serialization.Serializable

@Serializable data class Question(val question: String)

@Serializable data class Answer(val answer: String)

/** Demonstrates how to use any structured serializable input as a prompt. */
suspend fun main() {
  OpenAI.conversation {
    val question = Question("What is your name?")
    println("question: $question")
    val answer: Answer = prompt(question)
    println("answer: $answer")
  }
}
  2. Infer DSL for Generation Control: The Infer DSL lets you tag your input data with placeholders that provide additional control over the LLM's behavior. Using the Infer DSL, you can specify parameters such as temperature and other settings directly within the input, with expressions such as inferString or inferString(Config("temperature" to "0.0")), or any other made-up SudoLang-style parameters you think the LLM may understand.
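To make the placeholder idea concrete, here is a simplified, hypothetical sketch of what tagging fields for generation amounts to conceptually; the names INFER_SENTINEL and fieldsToInfer are invented for illustration and are not the xef API:

```kotlin
// NOTE: a simplified, hypothetical sketch of the idea behind generation
// placeholders -- not the actual xef implementation. INFER_SENTINEL and
// fieldsToInfer are invented names for illustration only.

const val INFER_SENTINEL = "<infer>"

// Returns the names of the fields the LLM is expected to fill in,
// i.e. the ones tagged with the placeholder sentinel.
fun fieldsToInfer(fields: Map<String, String>): List<String> =
    fields.filterValues { it == INFER_SENTINEL }.keys.toList()

fun main() {
    val input = mapOf(
        "cuisine" to "Mediterranean",   // fixed by the caller
        "title" to INFER_SENTINEL,      // left for the model to generate
        "description" to INFER_SENTINEL
    )
    println("LLM should generate: " + fieldsToInfer(input))
}
```

In the real DSL the placeholders travel inside the serialized input, so the model sees both the user-supplied values and the slots it is asked to fill.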

Example Usage:

Consider the following example from the diff, which creates a recipe from a mix of
user-supplied and LLM-inferred parameters through the use of inferString, inferInt, etc.

package com.xebia.functional.xef.auto.expressions

import com.xebia.functional.xef.auto.Description
import com.xebia.functional.xef.auto.llm.openai.OpenAI
import com.xebia.functional.xef.prompt.Prompt
import com.xebia.functional.xef.prompt.lang.Infer
import kotlinx.serialization.Serializable

enum class Cuisine {
  Italian,
  Indian,
  Chinese,
  Mediterranean,
  Vegan,
  Keto,
  Paleo,
  Infer
}

enum class MainIngredient {
  Chicken,
  Beef,
  Tofu,
  Mushroom,
  Lentils,
  Shrimp,
  Infer
}

enum class CookingMethod {
  Bake,
  Fry,
  Steam,
  Saute,
  Grill,
  SlowCook,
  Infer
}

@Serializable
data class Ingredient(val name: String, val quantity: String, val unit: String? = null)

@Serializable
data class RecipeState(
  val title: String,
  val cuisine: Cuisine,
  val mainIngredient: MainIngredient,
  val cookingMethod: CookingMethod,
  val ingredients: List<Ingredient>,
  val description: String,
  val steps: List<String>,
  val totalTime: Int
)

@Serializable
data class DietaryConstraints(
  val allergens: List<String>,
  val dislikedIngredients: List<String>,
  val calorieLimit: Int
)

@Serializable
data class GenerateRecipe(val state: RecipeState, val constraints: DietaryConstraints)

@Serializable
data class RecipePrompt(
  @Description(
    [
      "Generate a detailed and mouthwatering recipe.",
      "Make sure to use appropriate culinary terms.",
      "Recipe should be easy to follow for a beginner."
    ]
  )
  val title: String,
  val ingredients: List<Ingredient>,
  val prepTime: String, // in minutes
  val cookTime: String, // in minutes
  val servings: Int,
  val steps: List<String>,
  val notes: String? = null
)

suspend fun main() {
  OpenAI.conversation {
    val infer = Infer(OpenAI.FromEnvironment.DEFAULT_SERIALIZATION, conversation)
    val recipe: RecipePrompt =
      infer(
        Prompt(
          """
                    Assume the role of a world-class chef. Your task is to create unique and delicious recipes tailored 
                    to specific dietary constraints and preferences using the inputs provided.
                    """
            .trimIndent()
        )
      ) {
        GenerateRecipe(
          state =
            RecipeState(
              title = inferString,
              cuisine = Cuisine.Mediterranean,
              mainIngredient = MainIngredient.Chicken,
              cookingMethod = CookingMethod.Grill,
              ingredients =
                listOf(
                  Ingredient(name = inferString, quantity = inferString, unit = inferString),
                  Ingredient(name = inferString, quantity = inferString, unit = inferString)
                ),
              description = inferString,
              steps = listOf(inferString, inferString, inferString),
              totalTime = inferInt
            ),
          constraints =
            DietaryConstraints(
              allergens = listOf("nuts", "shellfish"),
              dislikedIngredients = listOf("brussels sprouts"),
              calorieLimit = 600
            )
        )
      }

    println(recipe)
  }
}

In the first example, the structured input (Question) is automatically serialized and used as a prompt for the LLM. In the second example, the Infer DSL customizes the input with generation placeholders, letting you dynamically set temperature and other parameters on a per-field basis.
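As a hedged sketch of how a placeholder could carry per-field settings, in the spirit of inferString(Config("temperature" to "0.0")), one might render each placeholder with its configuration inline; Placeholder and render below are invented names, not the xef API:

```kotlin
// Hypothetical sketch: a placeholder that carries per-field generation
// settings and renders them inline for the model to read.
// Placeholder and render are invented names, not the xef API.

data class Placeholder(val type: String, val config: Map<String, String> = emptyMap())

// Renders "<infer:TYPE>" or "<infer:TYPE key=value ...>" when settings are present.
fun render(p: Placeholder): String =
    if (p.config.isEmpty()) "<infer:${p.type}>"
    else "<infer:${p.type} " +
        p.config.entries.joinToString(" ") { "${it.key}=${it.value}" } + ">"

fun main() {
    println(render(Placeholder("string")))
    println(render(Placeholder("string", mapOf("temperature" to "0.0"))))
}
```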

@javipacheco (Contributor) commented:

What do you think about my comment?

@javipacheco (Contributor) commented:

LGTM! 🚀

@raulraja raulraja merged commit 19243b0 into main Aug 16, 2023
@raulraja raulraja deleted the symbolic-programs branch August 16, 2023 15:09