Generic AI client and models + open-ai client lib #196
Conversation
@xebia-functional/team-ai
To my limited understanding, seems ok...
Besides a minor comment, the changes look good to me.
Note: I'm wondering if OpenAI-Kotlin implements Embeddings in a way that we can use in OpenAIEmbeddings.
There is indeed a class called …
# Conflicts:
#   core/src/commonMain/kotlin/com/xebia/functional/xef/llm/openai/OpenAIEmbeddings.kt

# Conflicts:
#   core/src/commonMain/kotlin/com/xebia/functional/xef/auto/CoreAIScope.kt
#   core/src/commonMain/kotlin/com/xebia/functional/xef/llm/models/functions/CFunction.kt
#   core/src/commonMain/kotlin/com/xebia/functional/xef/llm/openai/models.kt
#   kotlin/src/commonMain/kotlin/com/xebia/functional/xef/auto/DeserializerLLMAgent.kt
#   kotlin/src/commonMain/kotlin/com/xebia/functional/xef/auto/serialization/functions/FunctionSchema.kt
#   scala/src/main/scala/com/xebia/functional/xef/scala/auto/package.scala
@nomisRev I adapted these changes to the latest from main after the Java work was merged.
A couple of questions and suggestions; looks good overall. I am not sure I understand the dependency cycle between OpenAI and Core. It seems the dependencies should be grouped in the top-level language-specific modules if we want to provide top-level defaults, rather than coupling them in core.
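One way to read this suggestion is the following Gradle sketch (module names are illustrative, not the actual xef project layout): the top-level language module depends on both core and the OpenAI client module to wire up defaults, while core stays free of any OpenAI dependency.

```kotlin
// xef-java/build.gradle.kts (hypothetical module names)
dependencies {
  api(projects.xefCore)    // generic AI abstractions, no OpenAI coupling
  api(projects.xefOpenai)  // OpenAI-backed defaults live here, not in core
}
```

With this arrangement the dependency arrows only point downward (java -> core, java -> openai, openai -> core), so no cycle can form.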
… and java depends on the openai module for defaults; xef core does not depend on OpenAI.
This is how it is reflected now: the AIClient impl for OpenAI uses the library embeddings. Our embeddings delegate to the client embedding request, which delegates to the library.
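The delegation chain described above could be sketched as follows. This is an illustrative outline only; the interface and class names are assumptions, not the exact xef API, and the third-party client is replaced by a stand-in.

```kotlin
// Sketch of the embeddings delegation chain (names are illustrative).
data class Embedding(val values: List<Double>)

interface AIClient {
    fun createEmbeddings(texts: List<String>): List<Embedding>
}

// Stand-in for the third-party OpenAI Kotlin library client.
class LibraryClient {
    fun embeddings(texts: List<String>): List<Embedding> =
        texts.map { text -> Embedding(List(3) { i -> text.length.toDouble() + i }) } // fake vectors
}

// The OpenAI AIClient impl delegates embedding requests to the library...
class OpenAIClient(private val lib: LibraryClient = LibraryClient()) : AIClient {
    override fun createEmbeddings(texts: List<String>): List<Embedding> =
        lib.embeddings(texts)
}

// ...and our embeddings delegate to the client, which delegates to the library.
class OpenAIEmbeddings(private val client: AIClient) {
    fun embedDocuments(texts: List<String>): List<Embedding> =
        client.createEmbeddings(texts)
}
```

Each layer only forwards the call, so swapping the underlying library touches a single class.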
Delegates the OpenAI client runtime to https://github.com/aallam/ and reuses the previous models as generic classes for xef core.
Adds a new hierarchy of LLM models typed by their capabilities. This prevents sending the wrong kind of model to an operation that does not support that capability.
For example, the prompt function can only be invoked with a model that has serialization capabilities through functions.

I plan to eliminate AIClient in the next PR and implement the capabilities in the models directly, to remove the client dependency from the scope. The client should not be a scoped object because each call inside the AI block can target different models with different client implementations.
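The capability-typed model idea can be sketched like this. All names here are hypothetical, chosen to illustrate the design rather than mirror the actual xef hierarchy: each capability is an interface, concrete models implement only the capabilities they support, and an operation's signature accepts only the capability it needs.

```kotlin
// Hypothetical capability-typed model hierarchy (names are illustrative).
interface LLMModel { val name: String }

interface Completion : LLMModel                 // plain text completion
interface ChatWithFunctions : LLMModel          // supports function-call serialization

// A model with both capabilities...
object GPT_3_5_TURBO : Completion, ChatWithFunctions {
    override val name = "gpt-3.5-turbo"
}

// ...and one with only completion support.
object DAVINCI : Completion {
    override val name = "text-davinci-003"
}

// An operation requiring function-call support only accepts capable models:
// describe(DAVINCI) would be rejected at compile time.
fun describe(model: ChatWithFunctions): String =
    "${model.name} supports function calls"
```

Because the constraint lives in the type system, misuse is caught at compile time instead of failing at the API with a runtime error.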