LMCache/lmcache-agentic

LMCache Agentic

This repository provides several agent applications that work with the vLLM Production Stack and LMCache endpoints. Each agent demonstrates different AI capabilities and deployment patterns, and all are integrated with LMCache's caching infrastructure for optimal performance.

🚀 Available Agents

An intelligent dual-agent academic paper analysis system that compares reviews from two different LLM endpoints side by side, using LMCache for efficient endpoint management.

Key Features:

  • 🌐 Modern Gradio web interface
  • 🔄 Dual-agent comparison system
  • 📄 PDF processing and analysis
  • 📊 Real-time progress tracking
  • 🔄 LangGraph workflow orchestration
  • 🏠 Flexible LLM support (OpenAI API + local models)
  • ⚡ LMCache integration for optimized endpoint usage
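The dual-agent comparison pattern above can be sketched in a few lines: fan the same paper text out to two OpenAI-compatible chat-completion endpoints and collect both reviews. This is an illustrative sketch, not the repository's actual code; the endpoint URLs, model name, and helper function names are assumptions.

```python
import json
import urllib.request

# Hypothetical endpoints: any two OpenAI-compatible /v1/chat/completions
# servers (e.g. one served through LMCache, one without).
ENDPOINTS = {
    "agent_a": "http://localhost:8000/v1/chat/completions",
    "agent_b": "http://localhost:8001/v1/chat/completions",
}

def build_review_request(model: str, paper_text: str) -> dict:
    """Build an OpenAI-compatible chat-completion payload asking for a review."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are an academic paper reviewer."},
            {"role": "user", "content": f"Review this paper:\n\n{paper_text}"},
        ],
        "temperature": 0.2,
    }

def fetch_review(url: str, payload: dict) -> str:
    """POST the payload to one endpoint and return the assistant's reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

def compare_reviews(paper_text: str, model: str = "example-model") -> dict:
    """Fan the same paper out to both endpoints and collect the two reviews."""
    payload = build_review_request(model, paper_text)
    return {name: fetch_review(url, payload) for name, url in ENDPOINTS.items()}
```

In the actual agent this fan-out is orchestrated through a LangGraph workflow with progress reporting, but the request/response shape at each endpoint is the standard OpenAI chat-completion schema shown here.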

A Dify-powered research paper chatbot that compares inference performance between a vLLM engine with LMCache enabled and vanilla vLLM. Upload research papers and ask questions to receive side-by-side responses from both engines.

Key Features:

  • 📄 Document upload (PDF, images)
  • 🤖 Dual LLM comparison (LMCache vs vLLM)
  • 📊 Performance metrics and token usage statistics
  • 🔄 Real-time comparison interface
  • 🎯 Research-focused Q&A capabilities
  • ⚡ Direct performance benchmarking
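The performance metrics above (latency, token usage, throughput) can be derived from the OpenAI-style `usage` object each engine returns plus a wall-clock timer. The sketch below is illustrative; the field names and table layout are assumptions, not the chatbot's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class EngineRun:
    """One engine's result: OpenAI-style token counts plus measured latency."""
    engine: str
    prompt_tokens: int
    completion_tokens: int
    latency_s: float

    @property
    def tokens_per_second(self) -> float:
        # Generation throughput: completion tokens over wall-clock time.
        return self.completion_tokens / self.latency_s if self.latency_s else 0.0

def side_by_side(a: EngineRun, b: EngineRun) -> str:
    """Render two runs as a small side-by-side comparison table."""
    rows = [
        ("engine", a.engine, b.engine),
        ("prompt tokens", a.prompt_tokens, b.prompt_tokens),
        ("completion tokens", a.completion_tokens, b.completion_tokens),
        ("latency (s)", f"{a.latency_s:.2f}", f"{b.latency_s:.2f}"),
        ("tokens/s", f"{a.tokens_per_second:.1f}", f"{b.tokens_per_second:.1f}"),
    ]
    return "\n".join(f"{name:<18}{str(x):>14}{str(y):>14}" for name, x, y in rows)
```

With LMCache serving a previously seen prompt prefix from cache, the expectation is a lower latency for the same token counts, which shows up directly in the tokens/s row.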

More Agents Coming Soon

This repository is actively expanding with additional agentic applications. Each new agent will include:

  • Complete source code and documentation
  • Production-ready Helm charts
  • Docker containerization
  • CI/CD pipeline integration
  • Comprehensive testing suite
  • LMCache endpoint integration

📚 Documentation

🆘 Support
