
Conversation

westonbrown
Contributor

Description

Adds llama.cpp model provider support to the Strands Agents SDK, enabling integration with locally hosted llama.cpp servers so that agent workloads can run at the edge on resource-constrained devices with small foundation models. The implementation provides full support for llama.cpp-specific features.

Key Features:

  • llama.cpp support: Direct integration with llama.cpp servers via OpenAI-compatible API
  • Advanced sampling parameters: Full support for llama.cpp-specific parameters (mirostat, top_k, min_p, typical_p, tfs_z, top_a, etc.)
  • Grammar constraints: GBNF grammar support for constrained generation
  • Multimodal capabilities: Audio and image content support for compatible models (e.g., Qwen2.5-Omni)
  • JSON schema validation: Native structured output with schema constraints
  • Tool calling support: Full support for function calling with proper formatting
  • Custom error handling: Specific exceptions for context overflow and server overload scenarios
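The custom error handling mentioned above could be sketched roughly as follows. The exception names and detection heuristics here are hypothetical illustrations, not the PR's actual identifiers:

```python
# Hypothetical sketch: the real provider's exception classes and
# detection logic may differ from what is shown here.
class LlamaCppContextOverflowError(Exception):
    """Raised when the prompt exceeds the server's context window."""


class LlamaCppServerOverloadError(Exception):
    """Raised when the server reports it is overloaded (e.g. HTTP 503)."""


def classify_server_error(status_code: int, message: str) -> Exception:
    """Map a llama.cpp server error to a provider-specific exception."""
    if status_code == 503:
        return LlamaCppServerOverloadError(message)
    lowered = message.lower()
    if "context" in lowered and ("exceed" in lowered or "overflow" in lowered):
        return LlamaCppContextOverflowError(message)
    return RuntimeError(message)
```

Callers can then catch the overload case specifically and retry with backoff, while treating context overflow as a non-retryable prompt-size problem.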

Implementation Details:

  • Architecture: Inherits from base Model class following SDK patterns (similar to Ollama)
  • Parameter Handling:
    • OpenAI-compatible parameters (temperature, max_tokens, top_p, etc.) go in request root
    • llama.cpp-specific parameters go in extra_body for clean separation
    • Grammar and json_schema parameters are placed directly in request body
  • Supported Parameters:
    • llama.cpp-specific: repeat_penalty, top_k, min_p, typical_p, tfs_z, top_a, mirostat, mirostat_lr, mirostat_ent, penalty_last_n, n_probs, min_keep, ignore_eos, logit_bias, cache_prompt, slot_id, samplers
  • Test Coverage: 25 comprehensive tests covering all functionality
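The parameter-separation rule described above can be sketched as a small routing function. The parameter sets below are illustrative and incomplete; the PR's actual implementation may differ in naming and coverage:

```python
# Hedged sketch of the root / extra_body split described in the PR.
# OPENAI_PARAMS and BODY_PARAMS are assumptions, not the SDK's real lists.
OPENAI_PARAMS = {
    "temperature", "max_tokens", "top_p", "seed", "stop",
    "frequency_penalty", "presence_penalty",
}
BODY_PARAMS = {"grammar", "json_schema"}  # placed directly in the request body


def split_params(params: dict) -> dict:
    """Build a request payload: OpenAI-compatible keys at the root,
    llama.cpp-specific keys under extra_body."""
    request: dict = {}
    extra_body: dict = {}
    for key, value in params.items():
        if key in OPENAI_PARAMS or key in BODY_PARAMS:
            request[key] = value
        else:
            # e.g. top_k, min_p, mirostat, repeat_penalty, cache_prompt
            extra_body[key] = value
    if extra_body:
        request["extra_body"] = extra_body
    return request
```

Keeping the llama.cpp-only knobs under `extra_body` means the root payload stays valid for any OpenAI-compatible client while server-specific sampling options pass through untouched.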

Related Issues

Enables local model support via llama.cpp servers

Documentation PR

Documentation to be added in follow-up PR

Type of Change

New feature

Testing

How have you tested the change? Verify that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli

  • I ran hatch run prepare - All checks passed successfully
  • Comprehensive test suite with 25 tests covering:
    • Basic configuration and initialization
    • Request formatting with all parameter types
    • Streaming response handling
    • Structured output with JSON schemas
    • Grammar constraint application
    • Multimodal content (audio/image) formatting
    • Error handling (context overflow, server overload)
    • Tool calling and function formatting
    • All tests passing (100% success rate)

Example Usage:

```python
from strands import Agent
from strands.models.llamacpp import LlamaCppModel

# Basic usage
model = LlamaCppModel(base_url="http://localhost:8080")
agent = Agent(model=model)
response = agent("Tell me about AI")

# With advanced parameters
model = LlamaCppModel(
    base_url="http://localhost:8080",
    params={
        "temperature": 0.7,
        "max_tokens": 100,
        "repeat_penalty": 1.1,
        "top_k": 40,
        "min_p": 0.05,
    },
)

# Grammar constraints
model.use_grammar_constraint('''
    root ::= answer
    answer ::= "yes" | "no"
''')
```
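For the JSON schema validation feature, a schema-constrained request payload might look something like the sketch below. The field names follow common OpenAI-compatible conventions for llama.cpp servers and are an assumption, not the PR's exact request shape:

```python
import json

# Hypothetical payload builder; the SDK constructs this internally and
# the actual field layout may differ.
def build_structured_request(prompt: str, schema: dict) -> dict:
    """Assemble a chat-completion payload that constrains output to a schema."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {
            "type": "json_schema",
            "json_schema": {"schema": schema},
        },
    }


schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
}
payload = build_structured_request("Is the sky blue?", schema)
print(json.dumps(payload, indent=2))
```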

Checklist

  • I have read the CONTRIBUTING document
  • I have added any necessary tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly (README.md updated to include llama.cpp in supported providers)
  • I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

This reverts commit 0fe0249.
cagataycali
cagataycali previously approved these changes Aug 6, 2025
Member

@awsarron awsarron left a comment


Thank you for this awesome contribution @westonbrown.

I had a handful of comments, then we can get this one merged. Additionally it would be helpful if you can provide:

  1. Steps to set up and run the integration test locally so that I can confirm it myself
  2. A link to documentation updates PR for this model provider

@westonbrown
Contributor Author

@awsarron @cagataycali sorry for the delay! Please find a PR for the docs for this model provider class here

@awsarron awsarron enabled auto-merge (squash) September 10, 2025 19:17
@awsarron awsarron merged commit ab125f5 into strands-agents:main Sep 10, 2025
19 of 22 checks passed
@github-project-automation github-project-automation bot moved this from Coming Soon to Just Shipped in Strands Agents Roadmap Sep 10, 2025
This was referenced Sep 17, 2025
Unshure pushed a commit to Unshure/sdk-python that referenced this pull request Sep 24, 2025
