
[megatron] feat: checkpoint save as HF PEFT format#5575

Merged
ETOgaosion merged 2 commits into verl-project:main from HollowMan6:lora_megatron_hf
Mar 26, 2026

Conversation

HollowMan6 (Collaborator) commented Mar 12, 2026

What does this PR do?

New API endpoints were added on the Megatron-Bridge side; this requires NVIDIA-NeMo/Megatron-Bridge#2574 (already merged).

Checklist Before Starting

  • Search for similar PRs. Paste at least one query link here: ...
  • Format the PR title as [{modules}] {type}: {description} (This will be checked by the CI)
    • {modules} include fsdp, megatron, veomni, sglang, vllm, rollout, trainer, ci, training_utils, recipe, hardware, deployment, ray, worker, single_controller, misc, perf, model, algo, env, tool, ckpt, doc, data, cfg, reward, fully_async, one_step_off
    • If this PR involves multiple modules, separate them with , like [megatron, fsdp, doc]
    • {type} is in feat, fix, refactor, chore, test
    • If this PR breaks any API (CLI arguments, config, function signature, etc.), add [BREAKING] to the beginning of the title.
    • Example: [BREAKING][fsdp, megatron] feat: dynamic batching

Test

For changes that cannot be tested by CI (e.g., algorithm implementation, new model support), validate by experiment(s) and show results such as training curve plots, evaluation results, etc.

API and Usage Example

Demonstrate how the API changes if any, and provide usage example(s) if possible.

# Add code snippet or script demonstrating how to use this

Design & Code Changes

Demonstrate the high-level design if this PR is complex, and list the specific changes.

Checklist Before Submitting

Important

Please check all the following items before requesting a review, otherwise the reviewer might deprioritize this PR for review.

gemini-code-assist bot (Contributor) left a comment


Code Review

This pull request refactors the PEFT checkpoint saving and loading mechanism to leverage Megatron-Bridge, a positive change that centralizes logic and removes custom implementations. The overall approach is sound. I've identified a couple of areas for improvement related to code duplication and the use of private APIs; addressing them could improve the long-term maintainability of this critical checkpointing functionality.

@HollowMan6 HollowMan6 marked this pull request as ready for review March 12, 2026 18:24
Copilot AI review requested due to automatic review settings March 12, 2026 18:24
Copilot AI (Contributor) left a comment


Pull request overview

This PR updates VERL’s Megatron-Bridge PEFT checkpointing flow to rely on Megatron’s distributed checkpointing and adds support for saving PEFT adapters in HuggingFace (PEFT) format via Megatron-Bridge.

Changes:

  • Switch PEFT adapter load in make_megatron_module() from the repo’s custom adapter checkpoint format to Megatron distributed checkpoint loading with a PEFT filter.
  • Update MegatronCheckpointManager to filter model state dicts to adapter-only when PEFT is enabled, and to save HF PEFT adapters via a new bridge.save_hf_adapter() API.
  • Remove the legacy *_adapter_checkpoint save/load utilities and exports from megatron_peft_utils.py.
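The adapter-only filtering step described above can be sketched in a few lines. This is a minimal illustration, not verl's actual implementation: the key-matching predicate (looking for `lora_a`/`lora_b` substrings) and the parameter names are assumptions about how PEFT adapter weights are typically named.

```python
# Minimal sketch (assumed naming convention): keep only state-dict entries
# that look like PEFT adapter weights, dropping frozen base-model weights.
def filter_adapter_only(state_dict, adapter_markers=("lora_a", "lora_b")):
    """Return the subset of state_dict whose keys contain an adapter marker."""
    return {
        key: value
        for key, value in state_dict.items()
        if any(marker in key for marker in adapter_markers)
    }

full_state = {
    "decoder.layers.0.linear_qkv.weight": "base",
    "decoder.layers.0.linear_qkv.adapter.lora_a.weight": "A",
    "decoder.layers.0.linear_qkv.adapter.lora_b.weight": "B",
}
adapter_state = filter_adapter_only(full_state)
```

With a filter like this, the distributed checkpoint only ever contains the trainable adapter parameters, which is also why loading has to be non-strict: the base-model keys are intentionally absent from the saved state.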

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 6 comments.

Reviewed files:

  • verl/utils/megatron_utils.py: Loads PEFT adapter weights through Megatron distributed checkpointing during pre-wrap PEFT transformation.
  • verl/utils/megatron_peft_utils.py: Removes legacy adapter-only checkpoint save/load helpers and related exports.
  • verl/utils/checkpoint/megatron_checkpoint_manager.py: Filters dist-checkpoint model state to adapter-only for PEFT, adjusts strictness on load, and adds HF PEFT adapter saving via Megatron-Bridge.
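For context on the HF PEFT target format: a HuggingFace PEFT adapter checkpoint is a small directory containing an `adapter_config.json` plus the adapter weights (e.g. `adapter_model.safetensors`). The sketch below writes only the config side of that layout; the field values (`r`, `lora_alpha`, `target_modules`) are illustrative assumptions, not what verl or Megatron-Bridge actually emits.

```python
import json
import os
import tempfile

# Sketch of the adapter_config.json half of a HF PEFT adapter directory.
# Field values are illustrative assumptions, not verl's actual output.
adapter_config = {
    "peft_type": "LORA",          # standard PEFT discriminator field
    "r": 8,                       # LoRA rank (assumed value)
    "lora_alpha": 16,             # LoRA scaling factor (assumed value)
    "target_modules": ["linear_qkv", "linear_proj"],  # assumed targets
}

out_dir = tempfile.mkdtemp()
config_path = os.path.join(out_dir, "adapter_config.json")
with open(config_path, "w") as f:
    json.dump(adapter_config, f, indent=2)

# Round-trip to confirm the directory is readable as a PEFT config.
with open(config_path) as f:
    loaded = json.load(f)
```

A directory shaped like this (config plus weights) is what tools in the HF ecosystem expect when loading a LoRA adapter, which is the point of saving in HF PEFT format rather than a custom one.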


New API endpoints added on Megatron-Bridge side

Signed-off-by: Hollow Man <hollowman@opensuse.org>
Signed-off-by: Hollow Man <hollowman@opensuse.org>
ETOgaosion (Collaborator) left a comment


Great work! Can I understand this as a refactor of the PEFT-format checkpoint saving? Did the previous saving methods have bugs?

Do we need to modify some docs in another PR?

@ETOgaosion ETOgaosion merged commit e9aa879 into verl-project:main Mar 26, 2026
78 of 118 checks passed
HollowMan6 (Collaborator, Author) commented Mar 26, 2026

Thank you!

Can I understand this as a refactor of the PEFT-format checkpoint saving?

Yes, mainly to move away from verl's own customized PEFT checkpointing format (which I introduced in #4063) to the official APIs introduced by NVIDIA-NeMo/Megatron-Bridge#2574.

Did the previous saving methods have bugs?

I haven't found any bugs in the previous implementation (unless other people find something); the main goal was to centralize the code in Megatron-Bridge and keep the related verl codebase clean.

Do we need to modify some docs in another PR?

The PEFT checkpointing format was not documented in https://github.com/verl-project/verl/blob/main/docs/advance/ppo_lora.rst, so feel free to add a section about it.

@HollowMan6 HollowMan6 deleted the lora_megatron_hf branch March 26, 2026 09:03
