
chore: Change submodule pointer for release#2191

Merged
ko3n1g merged 9 commits into r0.3.0 from
ko3n1g/chore/update-submodule
Feb 7, 2026

Conversation


@ko3n1g ko3n1g commented Feb 3, 2026

What does this PR do ?

Add a one-line overview of what this PR aims to accomplish.

Changelog

  • Add specific line-by-line info of high-level changes in this PR.

GitHub Actions CI

See the CI section in the Contributing doc for how to trigger the CI. An NVIDIA developer will need to approve and trigger the CI for external contributors.

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

If you haven't finished some of the above items, you can still open a "Draft" PR.

Additional Information

  • Related to # (issue)

Summary by CodeRabbit

  • Chores
    • Updated Megatron-LM dependency to the latest version.

Signed-off-by: oliver könig <okoenig@nvidia.com>

coderabbitai bot commented Feb 3, 2026

📝 Walkthrough

Updated the git submodule pointer for 3rdparty/Megatron-LM to reference a newer commit, changing the pinned revision from bbbedbb9f53343762e4dc70abc771b813a83d817 to 76e81893449d69e54c3d43bdf6fca1accb9ce6e9.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Submodule Reference<br>`3rdparty/Megatron-LM` | Updated git submodule commit pointer to pull in latest upstream changes. |
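A submodule pointer bump like this one is just a new gitlink committed in the superproject. A minimal, self-contained sketch of that workflow, using throwaway local repos rather than this repository (the paths, commit messages, and `protocol.file.allow` override below are illustrative assumptions, not this PR's actual commands):

```shell
# Demonstrate a submodule pointer bump with two scratch repos.
set -e
tmp=$(mktemp -d)

# A "dependency" repo with two commits, standing in for Megatron-LM.
git init -q "$tmp/dep"
git -C "$tmp/dep" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "v1"
old=$(git -C "$tmp/dep" rev-parse HEAD)
git -C "$tmp/dep" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "v2"
new=$(git -C "$tmp/dep" rev-parse HEAD)

# A superproject that pins the dependency at the old commit.
git init -q "$tmp/app"
cd "$tmp/app"
git -c protocol.file.allow=always submodule --quiet add "$tmp/dep" 3rdparty/dep
git -C 3rdparty/dep checkout -q "$old"
git -c user.email=ci@example.com -c user.name=ci commit -qam "pin dep"

# The pointer bump itself: move the submodule checkout to the new
# commit, then commit the updated gitlink in the superproject.
git -C 3rdparty/dep checkout -q "$new"
git add 3rdparty/dep
git -c user.email=ci@example.com -c user.name=ci \
    commit -qm "chore: bump dep submodule pointer"
git ls-tree HEAD 3rdparty/dep   # the gitlink now records $new
```

The superproject never stores the dependency's files, only the 40-character commit id, which is why this PR's diff is a single changed line.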

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~2 minutes

Suggested reviewers

  • thomasdhc
🚥 Pre-merge checks | ✅ 4
✅ Passed checks (4 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped - CodeRabbit's high-level summary is enabled. |
| Title Check | ✅ Passed | The title accurately describes the main change: updating a git submodule pointer for the Megatron-LM project for release purposes. |
| Docstring Coverage | ✅ Passed | No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check. |
| Test Results For Major Changes | ✅ Passed | PR contains only minor changes: a submodule pointer update with a single line modified and explicitly labeled as a 'chore' task. |

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Important

Action Needed: IP Allowlist Update

If your organization protects your Git platform with an IP allowlist, please add the new CodeRabbit IP address to it:

  • 136.113.208.247/32 (new)
  • 34.170.211.100/32
  • 35.222.179.152/32

Failure to add the new IP will result in interrupted reviews.



Signed-off-by: oliver könig <okoenig@nvidia.com>
copy-pr-bot bot commented Feb 6, 2026

Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.

Contributors can view more details about this message here.

…l, mock tokenizer

- Remove explicit mtp_loss_scaling_factor=None from Qwen3NextModelProvider80B_A3B
  to inherit new mcore default of 0.1
- Mark Qwen3 MoE quantization tests as xfail: ModelOpt _QuantMoELayer
  does not support padding_mask yet
- Add mock tokenizer with vocab_size, eod, and unique_identifiers to
  test_samplers for MockGPTLowLevelDataset compatibility
copy-pr-bot bot commented Feb 6, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.


yaoyu-33 commented Feb 6, 2026

/ok to test 0bac8c0

MockGPTLowLevelDataset requires config.tokenizer.vocab_size and
config.tokenizer.eod when building mock datasets. The three data loader
tests were missing this, causing 'MockGPTDataset failed to build as a
mock data generator' errors.
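The fix described here can be sketched as a minimal mock tokenizer. The attribute names (`vocab_size`, `eod`, `unique_identifiers`) come from this discussion and the commit message above, but the class below is an illustrative stand-in, not the repository's actual test code:

```python
from dataclasses import dataclass, field


@dataclass
class MockTokenizer:
    """Minimal tokenizer stand-in exposing the attributes that
    MockGPTLowLevelDataset reads when building mock datasets."""

    vocab_size: int = 32000  # size of the fake vocabulary
    eod: int = 0             # end-of-document token id
    # Consumed by Megatron's dataset config; contents are arbitrary here.
    unique_identifiers: dict = field(
        default_factory=lambda: {"class": "MockTokenizer"}
    )


tokenizer = MockTokenizer()
print(tokenizer.vocab_size, tokenizer.eod)  # → 32000 0
```

Attaching an object like this as `config.tokenizer` would satisfy the attribute lookups that were failing with "MockGPTDataset failed to build as a mock data generator".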

yaoyu-33 commented Feb 6, 2026

/ok to test 67d03f6

Signed-off-by: oliver könig <okoenig@nvidia.com>

ko3n1g commented Feb 6, 2026

/ok to test db91177

Signed-off-by: oliver könig <okoenig@nvidia.com>
