Conversation
plugin/trino-ai-functions/src/main/java/io/trino/plugin/ai/functions/AiModule.java
Let's rather use strings instead of enums.
There are many AI services out there, and limiting the providers to only the ones we know of seems restrictive.
I'd welcome a more pluggable way to think about AI providers.
We don't support plugins for plugins in Trino. Adding new providers will require modifying the code, so anyone doing that can easily add their own enum.
Using an enum means that users will get an immediate config validation error if the value doesn't exist. I've been changing other similar string configs to enums.
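The fail-fast behavior described above can be sketched with a small self-contained example. This is illustrative only, not Trino's actual config code: the class, method, and property names (`ProviderConfigSketch`, `parseProvider`, `ai.provider`) are assumptions for the sketch.

```java
import java.util.Arrays;
import java.util.Locale;

// Hypothetical sketch of why an enum-backed config property fails fast:
// an unknown provider name is rejected when the config is parsed,
// not later on the first request.
public class ProviderConfigSketch
{
    public enum AiProvider { OPENAI, ANTHROPIC, OLLAMA }

    public static AiProvider parseProvider(String value)
    {
        try {
            return AiProvider.valueOf(value.toUpperCase(Locale.ROOT));
        }
        catch (IllegalArgumentException e) {
            // Immediate validation error, listing the allowed values
            throw new IllegalArgumentException(
                    "Invalid value '%s' for ai.provider (expected one of %s)"
                            .formatted(value, Arrays.toString(AiProvider.values())));
        }
    }

    public static void main(String[] args)
    {
        System.out.println(parseProvider("openai"));
        try {
            parseProvider("mystery-llm");
        }
        catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

With a plain string config, the typo would only surface when the provider is first contacted; with the enum, it surfaces at startup.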
Also, I am sure we are happy to expand to other providers, but with Ollama we already support a lot.
Awesome ..
mosabua left a comment:
Gave the docs a review for starters.. very exciting to see this come alive.
FYI @alaturqua
Great to see this is being implemented.
Open WebUI uses Ollama to serve many models locally with an OpenAI-compatible interface, and it can also connect to OpenAI, Groq, and similar AI services. It might be worth a look: https://github.com/open-webui/open-webui
I removed
Updated to include support for Anthropic and add OpenTelemetry tracing.
mosabua left a comment:
A couple of minor adjustments for the docs. Code-wise, I think @dain already looked in detail, so there is no need for me to dig in more.
I expect that all this will evolve quickly so I am okay with merging and follow up PRs for docs and any further changes.
plugin/trino-ai-functions/src/main/java/io/trino/plugin/ai/functions/AbstractAiClient.java
@Override
protected String generateCompletion(String model, String prompt)
{
    URI uri = uriBuilderFrom(endpoint)
Hi @electrum, I'm wondering why you decided to build a raw HTTP request here rather than using com.openai.client.OpenAIClient?
We use the Airlift HTTP client because it’s used everywhere else in Trino and supports our standard configuration system. It already has extensive configs for things like TLS certificates that people often need to customize for their system. If we used a different client, then we’d have a long process of adding configs as people need them.
Also, we support backends other than OpenAI, and for something this simple, we don’t want to add a third party dependency.
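To illustrate how little machinery a simple completion call needs, here is a sketch of the same "build a plain HTTP request" approach using only the JDK's `java.net.http` types. This is not Trino's actual code (Trino uses the Airlift HTTP client, and the class and method names below are invented for the sketch); the endpoint path and JSON shape follow the OpenAI-compatible chat completions API, and a real implementation would JSON-encode the prompt rather than splice it into a template.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Illustrative sketch: constructing an OpenAI-compatible chat completion
// request by hand, with no provider SDK dependency.
public class RawCompletionRequestSketch
{
    public static HttpRequest buildCompletionRequest(URI endpoint, String apiKey, String model, String prompt)
    {
        // NOTE: naive string templating for brevity; real code must
        // JSON-encode model and prompt to handle quotes and newlines.
        String body = """
                {"model": "%s", "messages": [{"role": "user", "content": "%s"}]}""".formatted(model, prompt);
        return HttpRequest.newBuilder(endpoint.resolve("/v1/chat/completions"))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args)
    {
        HttpRequest request = buildCompletionRequest(
                URI.create("http://localhost:11434"), "test-key", "llama3", "Hello");
        System.out.println(request.method() + " " + request.uri());
    }
}
```

The benefit of the Airlift client over this JDK sketch is exactly the point made above: it plugs into Trino's standard configuration system (TLS certificates, proxies, timeouts) with no new dependency.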
Description
This adds AI functions that invoke an LLM by calling Ollama, Anthropic, or OpenAI (or compatible APIs). The functions are modeled after the Databricks SQL functions.
Testing
The tests utilize Hoverfly to execute against recorded responses from the provider APIs. The responses can be updated by changing the test class from @HoverflySimulate to @HoverflyCapture, providing a valid API key, and then running the modified tests. We have a custom Hoverfly JUnit 5 extension that stores responses separately for each test method, allowing us to easily update the response for a single method.

Release notes
(x) Release notes are required, with the following suggested text: