
Releases: samestrin/llm-interface

v2.0.14

17 Jul 15:41

  • Improvement: Auto retry logic for failed LLM requests (see the sketch below).
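A minimal sketch of how the retry behavior might be exercised through sendMessage; the interface option names retryAttempts and retryMultiplier, and the response shape, are assumptions rather than anything stated in these notes:

    const { LLMInterface } = require('llm-interface');

    LLMInterface.setApiKey({ openai: process.env.OPENAI_API_KEY });

    async function main() {
      // retryAttempts / retryMultiplier are assumed option names, used here
      // only to illustrate the auto retry improvement described above.
      const response = await LLMInterface.sendMessage(
        'openai',
        'Explain the importance of low-latency LLMs.',
        { max_tokens: 150 },
        { retryAttempts: 3, retryMultiplier: 0.3 },
      );
      console.log(response);
    }

    main();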

v2.0.11

12 Jul 23:37

Skipped v2.0.10

  • New LLM Providers: Anyscale, Bigmodel, Corcel, Deepseek, Hyperbee AI, Lamini, Neets AI, Novita AI, NVIDIA, Shuttle AI, TheB.AI, and Together AI.
  • Caching: Supports multiple cache backends: simple-cache, flat-cache, and cache-manager; flat-cache is now an optional package (see the caching sketch after this list).
  • Logging: Improved logging using the loglevel package.
  • Improved Documentation: Improved documentation with new examples, glossary, and provider details. Updated API key details, model alias breakdown, and usage information.
  • More Examples: LangChain.js RAG, Mixture-of-Authorities (MoA), and more.
  • Removed Dependency: @anthropic-ai/sdk is no longer required.
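A minimal sketch of opting into one of the listed cache backends; LLMInterface.configureCache and the option names shown are assumptions about the API, not something these notes spell out:

    const { LLMInterface } = require('llm-interface');

    LLMInterface.setApiKey({ openai: process.env.OPENAI_API_KEY });

    // Assumed API: select flat-cache as the backend. Since flat-cache is now
    // an optional package, it would need to be installed separately
    // (npm install flat-cache).
    LLMInterface.configureCache({ cache: 'flat-cache' });

    async function main() {
      const response = await LLMInterface.sendMessage(
        'openai',
        'Summarize the plot of Hamlet in one sentence.',
        { max_tokens: 100 },
        { cacheTimeoutSeconds: 3600 }, // assumed per-request cache TTL option
      );
      console.log(response);
    }

    main();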

v2.0.9

29 Jun 18:01
997a86f
  • New LLM Providers: Added support for AIML API (currently not respecting option values), DeepSeek, Forefront, Ollama, Replicate, and Writer.
  • New LLMInterface Methods: LLMInterface.setApiKey, LLMInterface.sendMessage, and LLMInterface.streamMessage (see the sketch after this list).
  • Streaming: Streaming support available for: AI21 Studio, AIML API, DeepInfra, DeepSeek, Fireworks AI, FriendliAI, Groq, Hugging Face, LLaMa.CPP, Mistral AI, Monster API, NVIDIA,
    Octo AI, Ollama, OpenAI, Perplexity, Together AI, and Writer.
  • New Interface Function: LLMInterfaceStreamMessage
  • Test Coverage: 100% test coverage for all interface classes.
  • Examples: New usage examples.
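A minimal sketch of the new methods: setApiKey, sendMessage, and streamMessage are named in the notes above, but the exact argument shapes and the way the stream is consumed below are assumptions (an async iterable of chunks):

    const { LLMInterface } = require('llm-interface');

    LLMInterface.setApiKey({ groq: process.env.GROQ_API_KEY });

    async function main() {
      // Simple request/response.
      const response = await LLMInterface.sendMessage('groq', 'Say hello in French.');
      console.log(response);

      // Streaming; the exact shape of the returned stream is assumed here.
      const stream = await LLMInterface.streamMessage('groq', 'Write a haiku about rain.');
      for await (const chunk of stream) {
        process.stdout.write(typeof chunk === 'string' ? chunk : JSON.stringify(chunk));
      }
    }

    main();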

v2.0.8

27 Jun 01:32
  • Removed Dependencies: Removing the OpenAI and Groq SDKs results in a smaller bundle, faster installs, and reduced complexity.

v2.0.7

27 Jun 00:15
7a64df6
  • New LLM Providers: Added support for DeepInfra, FriendliAI, Monster API, Octo AI, Together AI, and NVIDIA.
  • Improved Test Coverage: New DeepInfra, FriendliAI, Monster API, NVIDIA, Octo AI, Together AI, and watsonx.ai test cases.
  • Refactor: Improved support for OpenAI-compatible APIs using the new BaseInterface class (sketched below).
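A rough sketch of the idea behind the refactor: OpenAI-compatible providers can share one request path and only override connection details. The class and method bodies below are illustrative assumptions, not the library's actual internals:

    // Illustrative only: names and structure are assumed, not taken from the library.
    class BaseInterface {
      constructor({ apiKey, baseURL }) {
        this.apiKey = apiKey;
        this.baseURL = baseURL;
      }

      // Shared request logic for any OpenAI-compatible chat completions endpoint.
      async sendMessage(message, options = {}) {
        const res = await fetch(`${this.baseURL}/chat/completions`, {
          method: 'POST',
          headers: {
            'Content-Type': 'application/json',
            Authorization: `Bearer ${this.apiKey}`,
          },
          body: JSON.stringify({
            messages: [{ role: 'user', content: message }],
            ...options,
          }),
        });
        return res.json();
      }
    }

    // A provider subclass only supplies its own endpoint (URL assumed).
    class DeepInfraInterface extends BaseInterface {
      constructor(apiKey) {
        super({ apiKey, baseURL: 'https://api.deepinfra.com/v1/openai' });
      }
    }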

v2.0.6

25 Jun 19:43
  • New LLM Provider: Added support for watsonx.ai.

v2.0.3

24 Jun 19:05
  • New LLMInterface Functions: LLMInterface.getAllModelNames() and LLMInterface.getModelConfigValue(provider, configValueKey).
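A minimal usage sketch of the two new functions; the configValueKey value 'url' below is an assumed example key, not one documented here:

    const { LLMInterface } = require('llm-interface');

    // List every model name the library knows about, across all providers.
    console.log(LLMInterface.getAllModelNames());

    // Look up a single configuration value for one provider; 'url' is an
    // assumed example of a configValueKey.
    console.log(LLMInterface.getModelConfigValue('openai', 'url'));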

v1.0.1

19 Jun 23:49
  • Updated NPM packages and the README version badge.

v0.0.11

17 Jun 22:02
  • Simple Prompt Handler: Added support for simplified prompting.
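A hypothetical illustration of the simplified prompting: passing a plain string where a structured message object was previously required. The handler construction and both call shapes are assumptions about the v0.0.x API, not taken from these notes:

    const LLMInterface = require('llm-interface');

    // Assumed v0.0.x-style handler construction.
    const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);

    // Structured message object (assumed earlier form).
    openai
      .sendMessage({
        model: 'gpt-3.5-turbo',
        messages: [{ role: 'user', content: 'Explain recursion briefly.' }],
      })
      .then(console.log);

    // Simple prompt: just a string.
    openai.sendMessage('Explain recursion briefly.').then(console.log);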

v0.0.10

17 Jun 20:57
  • Hugging Face: Added support for new LLM provider Hugging Face (over 150,000 publicly accessible machine learning models).
  • Perplexity: Added support for new LLM provider Perplexity.
  • AI21: Added support for new LLM provider AI21 Studio.
  • JSON Output Improvements: The json_object mode for OpenAI and Gemini now guarantees the return of a valid JSON object or null (see the sketch after this list).
  • Graceful Retries: LLM queries are now automatically retried upon failure.
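A hypothetical illustration of the json_object mode; the option shape below mirrors OpenAI's own response_format parameter and is an assumption about how it is passed through, not something these notes confirm:

    const LLMInterface = require('llm-interface');

    const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);

    openai
      .sendMessage(
        'Return three prime numbers as a JSON array under the key "primes".',
        { max_tokens: 100, response_format: { type: 'json_object' } },
      )
      .then((result) => {
        // Per this release, the result is either a valid JSON object or null.
        console.log(result);
      });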