feat: Added support for Foundry-Local #649
base: main
Conversation
Mainly spelling mistakes and minor nits. The only major thing that needs fixing is that Foundry Local does support tool use (and tool results), otherwise they wouldn't allow you to add tools in the first place.
EDIT: Pasting the link here so it doesn't get lost in the review stuff: https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-models/how-to/use-chat-completions?pivots=programming-language-python#use-tools
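Per the linked docs, Foundry Local follows the OpenAI-compatible chat completions shape, where a tool result is just another message with role `tool` and the matching `tool_call_id`. A std-only sketch of that wire shape (the enum and the hand-rolled serialization below are illustrative, not rig's actual internal types, which would use serde):

```rust
#[derive(Debug)]
enum Message {
    User { content: String },
    Assistant { content: String },
    // Tool results are first-class messages, not something to skip.
    ToolResult { tool_call_id: String, content: String },
}

// Minimal hand-rolled serialization just to show the JSON shape on the wire.
fn to_json(msg: &Message) -> String {
    match msg {
        Message::User { content } => {
            format!(r#"{{"role":"user","content":"{content}"}}"#)
        }
        Message::Assistant { content } => {
            format!(r#"{{"role":"assistant","content":"{content}"}}"#)
        }
        Message::ToolResult { tool_call_id, content } => format!(
            r#"{{"role":"tool","tool_call_id":"{tool_call_id}","content":"{content}"}}"#
        ),
    }
}
```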
rig-core/src/providers/foundry.rs
Outdated
```rust
/// # Panics
/// - If the reqwest client cannot be built (if the TLS backend cannot be initialized).
pub fn new() -> Self {
    Self::builder().build().expect("Ollama client should build")
```
Ollama -> Foundry Local
yeahh i basically took a lot of code from ollama and forgot to rename lol
rig-core/src/providers/foundry.rs
Outdated
```rust
where
    Self: Sized,
{
    let api_base = std::env::var("OLLAMA_API_BASE_URL").expect("OLLAMA_API_BASE_URL not set");
```
Ollama -> Foundry Local
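Besides the rename, a sketch of one way to avoid panicking when the variable is unset: fall back to a default base URL. The variable name `FOUNDRY_LOCAL_API_BASE_URL` and the localhost port here are placeholders for illustration, not names this PR must use:

```rust
// Read the provider base URL from the environment, falling back to a default
// instead of panicking with expect(). Both names below are placeholders.
fn api_base() -> String {
    std::env::var("FOUNDRY_LOCAL_API_BASE_URL")
        .unwrap_or_else(|_| "http://localhost:5273".to_string())
}
```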
rig-core/src/providers/foundry.rs
Outdated
```rust
    }
}

// these i took from gemini ( review needed josh)
```
You can find the Azure AI Foundry model catalog here: https://ai.azure.com/catalog/models
I would suggest having a look as there are quite a few of these where the capital letter casing is not quite correct.
rig-core/src/providers/foundry.rs
Outdated
```rust
impl TryFrom<CompletionResponse> for completion::CompletionResponse<CompletionResponse> {
    type Error = CompletionError;
    fn try_from(resp: CompletionResponse) -> Result<Self, Self::Error> {
        let mut assitant_contents = Vec::new();
```
assitant -> assistant
ahhh :(
rig-core/src/providers/foundry.rs
Outdated
```rust
// foundry only responds with an array of choices which have
// role and content (role is always "assistant" for responses)
for choice in resp.choices.clone() {
    assitant_contents.push(completion::AssistantContent::text(&choice.message.content));
```
assitant -> assistant
rig-core/src/providers/foundry.rs
Outdated
```rust
.text()
.await
.map_err(|e| CompletionError::ProviderError(e.to_string()))?;
tracing::debug!(target: "rig", "Foundry chat response: {}", text);
```
nit: Foundry -> Foundry Local (I know this is a bit pedantic, but as Azure AI Foundry itself is already a known service, if we end up adding support for it then it will be a much more obvious differentiation to make)
rig-core/src/providers/foundry.rs
Outdated
```rust
let data_line = if let Some(data) = line.strip_prefix("data: "){
    data
}else{
```
very small nit: } else {
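rustfmt would settle the spacing automatically; alternatively the whole if/else collapses into a small Option-returning helper (the helper name here is mine, not from the PR):

```rust
// Extract the JSON payload from an SSE line of the form `data: {...}`.
// Returns None for non-data lines (comments, `event:` lines, keep-alives),
// which the caller can skip with `continue`.
fn sse_data(line: &str) -> Option<&str> {
    line.strip_prefix("data: ")
}
```

At the call site this becomes `let Some(data_line) = sse_data(line) else { continue; };`.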
rig-core/src/providers/foundry.rs
Outdated
```rust
}

#[derive(Debug, Serialize, Deserialize)]
pub enum Role {
```
You can use #[serde(rename_all = "lowercase")] here rather than manually renaming each one
rig-core/src/providers/foundry.rs
Outdated
```rust
use crate::message::Message as InternalMessage;
match internal_msg {
    InternalMessage::User { content, .. } => {
        // Foundry doesn't support tool results in messages, so we skip them
```
unfortunately for you, it actually does: https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-models/how-to/use-chat-completions?pivots=programming-language-python#use-tools
i was looking at their REST API docs ... should've dug deeper
rig-core/src/providers/mod.rs
Outdated
```rust
pub mod azure;
pub mod cohere;
pub mod deepseek;
pub mod foundry;
```
Please rename to foundry_local (removes ambiguity in case we decide to add Azure AI Foundry officially later on)
alrightt
Just needs an example (as well as pending changes) then should be good for another review
```rust
    streaming::{self, RawStreamingChoice},
};

const FOUNDRY_API_BASE_URL: &str = "http://localhost:42069";
```
please use a different port number (yes, I know 42069 is funny)
alright alright :(
closes #630