
[EPLB] Config Rename wrapper#6111

Merged
wangxiyuan merged 1 commit into vllm-project:releases/v0.13.0 from shenchuxiaofugui:rename-dev
Jan 22, 2026

Conversation

@shenchuxiaofugui (Collaborator) commented Jan 22, 2026

What this PR does / why we need it?

#5533
Adds a wrapper for the EPLB startup configuration; this is a forward-compatible update.

Does this PR introduce any user-facing change?

Before this PR:
--additional-config '{"dynamic_eplb":true, "num_iterations_eplb_update": 4000, "num_wait_worker_iterations": 150, "init_redundancy_expert": 16, "expert_map_path": "xxx.json"}'

After this PR:
--additional-config '{"eplb_config":{"dynamic_eplb":true,"expert_heat_collection_interval":4000, "algorithm_execution_interval":150,"num_redundant_experts": 16, "expert_map_path": "xxx.json"}}'
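The key renames above can be illustrated with a minimal, hypothetical sketch of how the nested `eplb_config` dictionary populates the existing attribute names. The key and attribute names follow the diff in this PR, but the wrapper class itself is illustrative, not the merged vllm-ascend code:

```python
import json

# Illustrative sketch only: parse the new nested "eplb_config" section of
# --additional-config. Key names come from this PR; this class is
# hypothetical, not the actual vllm-ascend implementation.
class EplbConfig:
    def __init__(self, additional_config: dict):
        config = additional_config.get("eplb_config", {})
        self.dynamic_eplb = config.get("dynamic_eplb", False)
        # The new key "num_redundant_experts" populates the old attribute.
        self.init_redundancy_expert = config.get("num_redundant_experts", 0)
        self.num_iterations_eplb_update = config.get(
            "expert_heat_collection_interval", 4000)
        self.num_wait_worker_iterations = config.get(
            "algorithm_execution_interval", 150)
        self.expert_map_path = config.get("expert_map_path", None)

# The new-style JSON from the PR description:
raw = ('{"eplb_config":{"dynamic_eplb":true,'
       '"expert_heat_collection_interval":4000,'
       '"algorithm_execution_interval":150,'
       '"num_redundant_experts": 16,'
       '"expert_map_path": "xxx.json"}}')
cfg = EplbConfig(json.loads(raw))
```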

How was this patch tested?

Tested with a qwen3-30b dialogue; sample model output below.
Okay, the user is asking, "What is deep learning?" I need to explain this in a clear and concise way. Let me start by recalling what I know about deep learning. It's a subset of machine learning, right? So first, I should mention that it's part of machine learning, which is a branch of AI. Then, the key point is that deep learning uses neural networks with multiple layers. The term "deep" refers to the number of layers in the network.

I should explain what neural networks are. Maybe start with the basics: they're inspired by the human brain, with layers of nodes (neurons). Each layer processes data and passes it to the next. The more layers, the deeper the network. But I need to make sure not to get too technical here.

Examples would help. Maybe mention applications like image recognition, speech recognition, natural language processing. For instance, when you use a smartphone's facial recognition, that's deep learning. Or when you ask a virtual assistant like Siri or Alexa, that's also deep learning in action.

I should also touch on how deep learning works. It requires a lot of data and computational power. The process involves training the network with labeled data, adjusting the weights of the connections between neurons through backpropagation. The more data and layers, the better the model can learn complex patterns.

Wait, but the user might not know what backpropagation is. Maybe I should avoid that term unless necessary.

Signed-off-by: shenchuxiaofugui <1311027364@qq.com>
@gemini-code-assist bot (Contributor) left a comment


Code Review

This pull request refactors the EPLB configuration by grouping related parameters under an eplb_config dictionary, which is a good step towards better organization. However, the implementation has a couple of significant issues. First, the old top-level configuration initializations are not removed, leading to redundant code and the potential for silent configuration errors. Second, the new configuration keys are not consistently reflected in the class attribute names, which creates confusion and harms maintainability. My review includes suggestions to address these points to make the refactoring complete and robust.

Comment thread on vllm_ascend/ascend_config.py, lines +159 to +164:
    self.init_redundancy_expert = config.get("num_redundant_experts", 0)
    self.dynamic_eplb = config.get("dynamic_eplb", False)
    self.num_iterations_eplb_update = config.get(
        "expert_heat_collection_interval", 4000)
    self.num_wait_worker_iterations = config.get(
        "algorithm_execution_interval", 150)

Severity: high

There's an inconsistency between the class attribute names and the new configuration keys they are populated from. For instance, self.num_iterations_eplb_update is populated from the key expert_heat_collection_interval. This mismatch makes the code difficult to understand and maintain. To improve clarity, the class attributes should be renamed to match the new configuration keys. While this requires updating usages elsewhere, it's a crucial step for long-term code health.

I suggest the following renames:

  • self.init_redundancy_expert -> self.num_redundant_experts
  • self.num_iterations_eplb_update -> self.expert_heat_collection_interval
  • self.num_wait_worker_iterations -> self.algorithm_execution_interval
Suggested change — replace:

    self.init_redundancy_expert = config.get("num_redundant_experts", 0)
    self.dynamic_eplb = config.get("dynamic_eplb", False)
    self.num_iterations_eplb_update = config.get(
        "expert_heat_collection_interval", 4000)
    self.num_wait_worker_iterations = config.get(
        "algorithm_execution_interval", 150)

with:

    self.num_redundant_experts = config.get("num_redundant_experts", 0)
    self.dynamic_eplb = config.get("dynamic_eplb", False)
    self.expert_heat_collection_interval = config.get(
        "expert_heat_collection_interval", 4000)
    self.algorithm_execution_interval = config.get(
        "algorithm_execution_interval", 150)
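Since the PR is described as forward-compatible, one way to support both the legacy flat keys and the new nested ones is a small resolver that prefers the nested key and falls back to the old name. This is a sketch under that assumption, not the merged code; the old/new key pairs follow the PR description:

```python
# Hypothetical resolver: prefer new nested eplb_config keys, fall back to
# the legacy flat keys named in the PR description. Not the merged code.
def parse_eplb_config(additional_config: dict) -> dict:
    nested = additional_config.get("eplb_config", {})
    flat = additional_config  # legacy top-level keys

    def pick(new_key, old_key, default):
        # New nested key wins; otherwise use the legacy flat key if present.
        if new_key in nested:
            return nested[new_key]
        return flat.get(old_key, default)

    return {
        "dynamic_eplb": pick("dynamic_eplb", "dynamic_eplb", False),
        "num_redundant_experts": pick(
            "num_redundant_experts", "init_redundancy_expert", 0),
        "expert_heat_collection_interval": pick(
            "expert_heat_collection_interval", "num_iterations_eplb_update", 4000),
        "algorithm_execution_interval": pick(
            "algorithm_execution_interval", "num_wait_worker_iterations", 150),
        "expert_map_path": pick("expert_map_path", "expert_map_path", None),
    }

# A legacy flat config still resolves to the new names:
legacy = {"dynamic_eplb": True, "init_redundancy_expert": 16}
resolved = parse_eplb_config(legacy)
```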

@Angazenn added the "ready" (read for review) and "ready-for-test" (start test by label for PR) labels on Jan 22, 2026
@wangxiyuan wangxiyuan merged commit 600fc87 into vllm-project:releases/v0.13.0 Jan 22, 2026
17 checks passed
845473182 pushed a commit to 845473182/vllm-ascend that referenced this pull request Jan 22, 2026
…lm-ascend into FIA_v0.13.0

* 'releases/v0.13.0' of https://github.com/vllm-project/vllm-ascend:
  [EPLB] Config Rename wrapper (vllm-project#6111)
  [v0.13.0][Bugfix] Fix the input constraints checks for the mlapo and bmm_transpose operators (vllm-project#5764) (vllm-project#6088)
starmountain1997 pushed a commit to starmountain1997/vllm-ascend that referenced this pull request Jan 31, 2026
tangtiangu pushed a commit to tangtiangu/jiusi-vllm-ascend that referenced this pull request Feb 24, 2026
tangtiangu pushed a commit to tangtiangu/jiusi-vllm-ascend that referenced this pull request Feb 24, 2026

Labels

ready (read for review), ready-for-test (start test by label for PR)


3 participants