Conversation

@ptovam (Contributor) commented Jul 17, 2025

✨ Add Support for Custom Stat Logger Classes via CLI Argument
This PR adds support for dynamically loading custom StatLoggerBase classes via a new command-line argument: --stat-loggers.
The purpose is to allow injection of user-defined metric loggers into the vLLM runtime without modifying internal code.

🔧 What’s New
Added a new CLI argument:
--stat-loggers: comma-separated list of fully-qualified class paths
(e.g., mylogger.PrometheusLogger,mylogger.JsonLogger)

These classes are dynamically imported using importlib, and passed to AsyncLLM.from_vllm_config.
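
The loading step can be sketched roughly as follows (`resolve_stat_logger_classes` is a hypothetical helper name for illustration, not the PR's actual code):

```python
import importlib


def resolve_stat_logger_classes(spec: str) -> list[type]:
    """Resolve a comma-separated list of fully-qualified class paths
    (e.g. "mylogger.PrometheusLogger,mylogger.JsonLogger") into classes."""
    classes = []
    for path in spec.split(","):
        # Split "pkg.module.ClassName" into its module and class parts.
        module_name, _, class_name = path.strip().rpartition(".")
        module = importlib.import_module(module_name)
        classes.append(getattr(module, class_name))
    return classes
```

The resolved classes would then be handed to `AsyncLLM.from_vllm_config` as described above.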

📍 Example Usage

python -m vllm.entrypoints.openai.api_server \
  --stat-loggers "mylogger.PrometheusLogger,mylogger.JsonLogger"

💬 Context & Motivation
This PR builds on the work in #14661, which enabled AsyncLLM to accept custom StatLoggerBase instances directly.

It takes the next step by introducing a CLI-level hook for dynamically loading loggers.

This enables plug-and-play observability setups, supports multiple concurrent loggers, and follows a clean, extensible pattern that avoids hardcoding.

@ptovam ptovam requested a review from aarnphm as a code owner July 17, 2025 08:32
@mergify mergify bot added the frontend label Jul 17, 2025
@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request introduces a valuable feature for custom metrics logging via the --stat-loggers CLI argument. The implementation is straightforward. I've identified one high-severity issue related to error handling for invalid logger class paths, which could cause the server to crash on startup. My suggestion includes adding a try-except block to gracefully handle such errors and provide a more informative message to the user, as well as verifying the loaded class is of the correct type. Addressing this will make the feature more robust and user-friendly.
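
A minimal sketch of that suggestion, assuming a generic base-class parameter for the type check (`load_logger_class` is an illustrative name, not the reviewer's literal patch):

```python
import importlib


def load_logger_class(path: str, base_cls: type) -> type:
    """Import a fully-qualified class path, failing with a clear error
    instead of crashing the server on a bad --stat-loggers value."""
    try:
        module_name, _, class_name = path.rpartition(".")
        module = importlib.import_module(module_name)
        cls = getattr(module, class_name)
    except (ImportError, AttributeError, ValueError) as e:
        raise ValueError(f"Could not load stat logger class '{path}': {e}") from e
    # Verify the resolved object really is a subclass of the expected base.
    if not (isinstance(cls, type) and issubclass(cls, base_cls)):
        raise TypeError(f"'{path}' is not a subclass of {base_cls.__name__}")
    return cls
```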

@ptovam ptovam force-pushed the cli-custom-stat-loggers branch from e6d16e9 to 1a1cf1f on July 17, 2025 08:39
@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small but essential subset of CI tests to quickly catch errors. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀


mergify bot commented Jul 24, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @ptovam.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@njhill (Member) commented Aug 8, 2025

@ptovam will you be closing this in favour of #22456?

@ptovam ptovam closed this Aug 10, 2025