
[gRPC] Fix scheduler startup broken by context parallel refactor #18933

Merged

slin1237 merged 1 commit into main from grpc-fix-again on Feb 17, 2026

Conversation

@slin1237
Collaborator

Motivation

PR #17213 (refactor context parallel state) added attn_cp_rank and moe_dp_rank as new required parameters to run_scheduler_process, but the gRPC scheduler_launcher.py was not updated. As a result, gRPC server startup failed because the scheduler process was launched with a mismatched argument list.

Modifications

  • Compute attn_cp_rank and moe_dp_rank in the gRPC scheduler launcher, matching the logic in engine.py (see the rank-derivation sketch after this list)
  • Update the moe_ep_rank calculation to account for moe_dp_size
  • Pass the new arguments in the correct positional order to run_scheduler_process
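
For reviewers less familiar with the parallelism layout, here is a minimal, self-contained sketch of the kind of rank derivation this change performs. The formulas, the group sizes, and the example configuration are illustrative assumptions based on the description above, not the exact code in engine.py or scheduler_launcher.py.

```python
# Minimal sketch of deriving per-scheduler ranks from a flat tp_rank.
# All formulas and sizes are illustrative assumptions; the real
# expressions live in sglang's engine.py / scheduler_launcher.py.

def compute_ranks(tp_rank, tp_size, attn_cp_size, moe_dp_size, moe_ep_size):
    """Derive context-parallel and MoE ranks for one scheduler process."""
    # Attention context parallel: partition the tp ranks into CP groups.
    attn_tp_size = tp_size // attn_cp_size      # ranks per CP group (assumed layout)
    attn_cp_rank = tp_rank // attn_tp_size

    # MoE data parallel: identify which DP replica this rank belongs to...
    moe_tp_size = tp_size // moe_dp_size        # ranks per MoE-DP replica (assumed)
    moe_dp_rank = tp_rank // moe_tp_size

    # ...then derive the expert-parallel rank within that replica, so the
    # moe_ep_rank calculation "accounts for moe_dp_size" as described above.
    moe_ep_rank = (tp_rank % moe_tp_size) // (moe_tp_size // moe_ep_size)
    return attn_cp_rank, moe_dp_rank, moe_ep_rank


if __name__ == "__main__":
    # Example: tp_size=8 split into 2 CP groups, 2 MoE-DP replicas, 2 EP groups.
    for r in range(8):
        print(r, compute_ranks(r, tp_size=8, attn_cp_size=2,
                               moe_dp_size=2, moe_ep_size=2))
```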

@gemini-code-assist
Contributor

Summary of Changes

Hello @slin1237, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a critical bug that prevented the gRPC scheduler from starting correctly. The issue arose from a previous refactor that introduced new required parallelism rank parameters (attn_cp_rank and moe_dp_rank) to the scheduler process, which were not being calculated or passed by the gRPC launcher. The changes involve adding the necessary rank calculations and updating the function call to include these new arguments, thereby restoring proper scheduler functionality.

Highlights

  • Parallelism Rank Calculation: Implemented calculations for attn_cp_rank and moe_dp_rank within the gRPC scheduler launcher, aligning with existing logic in engine.py.
  • MoE Expert Parallel Rank Update: Revised the calculation for moe_ep_rank to correctly incorporate moe_dp_size.
  • Scheduler Process Argument Passing: Ensured attn_cp_rank and moe_dp_rank are passed as arguments to the run_scheduler_process function, resolving a startup failure (illustrated in the sketch below).
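
The third highlight is the crux of the bug: the scheduler entry point is launched as a subprocess with a positional args tuple, so any new required parameters must be threaded through every call site. The sketch below reproduces that failure mode with a stand-in function; the real run_scheduler_process signature and parameter order in sglang's scheduler.py differ.

```python
# Hedged illustration of the failure mode this PR fixes: a subprocess
# target gains new required parameters, so the args tuple at the call
# site must grow in the same positional order. The signature here is a
# stand-in, NOT the real run_scheduler_process from sglang.
import multiprocessing as mp


def run_scheduler_process(server_args, gpu_id, tp_rank, moe_ep_rank,
                          attn_cp_rank, moe_dp_rank, pipe_writer):
    # Stand-in scheduler entry point that just reports the ranks it got.
    pipe_writer.send(
        f"ready gpu={gpu_id} tp={tp_rank} ep={moe_ep_rank} "
        f"cp={attn_cp_rank} moe_dp={moe_dp_rank}"
    )


if __name__ == "__main__":
    reader, writer = mp.Pipe(duplex=False)
    proc = mp.Process(
        target=run_scheduler_process,
        # Every positional parameter must appear here in signature order;
        # omitting attn_cp_rank / moe_dp_rank (or leaving later arguments
        # shifted) crashes the child at startup with a TypeError, which is
        # the gRPC launcher failure being fixed.
        args=({"model": "stub"}, 0, 0, 0, 0, 0, writer),
    )
    proc.start()
    print(reader.recv())
    proc.join()
```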


Changelog
  • python/sglang/srt/grpc/scheduler_launcher.py
    • Added logic to compute attn_dp_size, attn_tp_size, attn_cp_rank, and moe_dp_rank.
    • Updated the calculation for moe_ep_rank to be more comprehensive and account for moe_dp_size (a before/after sketch follows this list).
    • Included attn_cp_rank and moe_dp_rank in the arguments passed to the scheduler process creation.
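
For context on the second changelog item, here is a hedged before/after of what accounting for moe_dp_size in the moe_ep_rank formula could look like. Both expressions are illustrative assumptions, not the actual code from scheduler_launcher.py.

```python
# Illustrative only: how an expert-parallel rank formula might change once
# MoE data parallelism enters the layout. Neither line is copied from sglang.
tp_rank, tp_size, moe_ep_size, moe_dp_size = 5, 8, 2, 2

# Before: EP rank derived directly from the flat tp_rank.
moe_ep_rank_old = tp_rank // (tp_size // moe_ep_size)

# After: strip off the MoE-DP replica first, then split into EP groups.
moe_tp_size = tp_size // moe_dp_size
moe_ep_rank_new = (tp_rank % moe_tp_size) // (moe_tp_size // moe_ep_size)

print(moe_ep_rank_old, moe_ep_rank_new)  # -> 1 0 for this example
```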

@slin1237
Collaborator Author

/tag-and-rerun-ci

Contributor

@gemini-code-assist bot left a comment


Code Review

The pull request correctly addresses the gRPC scheduler startup failure by synchronizing the rank calculation logic and the run_scheduler_process argument list with the changes introduced in the context parallel refactor. The rank calculations for attn_cp_rank, moe_dp_rank, and moe_ep_rank are consistent with the logic used in engine.py and data_parallel_controller.py, ensuring that the gRPC launcher correctly handles various parallelism configurations. The positional arguments passed to the scheduler process also align with the updated function signature in scheduler.py. This fix restores gRPC server startup, which was previously broken by the argument mismatch.


PR #17213 added attn_cp_rank and moe_dp_rank parameters to
run_scheduler_process but the gRPC scheduler_launcher was not updated,
causing startup failure due to missing arguments.
slin1237 merged commit bf08d3f into main on Feb 17, 2026
60 of 91 checks passed
slin1237 deleted the grpc-fix-again branch on February 17, 2026 at 16:52
magicYang1573 pushed a commit to magicYang1573/sglang that referenced this pull request on Mar 9, 2026
Wangzheee pushed a commit to Wangzheee/sglang that referenced this pull request on Mar 21, 2026
