This repository has been archived by the owner on Nov 13, 2024. It is now read-only.
A few bug fixes following OpenAI client update (#178)
* [cli] Bug fix: the `openai.Models` call used to check the OpenAI connection no longer exists in the updated client
* [cli] Improve error handling:
  - Only verify the OpenAI connection if OpenAI is used in the configuration
  - Verify the configuration before starting the server, to give a clear error message
  - Don't error out on the main `canopy` command (always show the intro/help message)
* [llm] Bug fix in `OpenAILLM.available_models`: it had not been updated to use the new OpenAI client
* [test] Add a unit test for `OpenAILLM.available_models`; it was missing one, apparently
* [CLI] Correct a typo in an error message (look away for one second and Co-Pilot wreaks havoc on your error messages...)
* [cli] Verify the Pinecone connection on `canopy start` (meant to add it before and forgot)
* [tests] Fix a typo
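The `available_models` fix comes down to the openai-python v1 migration: the old module-level `openai.Model.list()` call was removed, and v1 clients expose `client.models.list()` on a client instance instead. A minimal sketch of the new-style call (the helper name and duck-typed `client` parameter are illustrative, not Canopy's actual implementation):

```python
def available_models(client) -> list[str]:
    """List the model IDs visible to an openai-python >= 1.0 client.

    Pre-1.0 code called the module-level `openai.Model.list()`, which the
    updated client removed; v1 clients expose `client.models.list()` instead.
    In real usage you would pass `client = openai.OpenAI()` (duck-typed here
    so the sketch has no hard dependency on the openai package).
    """
    return [m.id for m in client.models.list()]
```

Because the function only touches `client.models.list()`, it is also easy to unit-test with a stub client, which is presumably what the missing test now does.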
1 parent 5897d16 · commit 0698ff2
Showing 3 changed files with 54 additions and 23 deletions.