Fix lint and import error in perf script llama3_llm_finetune.py #2592

Merged
chtruong814 merged 1 commit into main from kmorabia/fix-lint-perf
Feb 27, 2026

Conversation

@kevalmorabia97
Contributor

@kevalmorabia97 kevalmorabia97 commented Feb 27, 2026

Follow-up from a conversation with @chtruong814.

Commit b162358 merged yesterday and removed llama3_70b_finetune_config from that file.

#2397 merged today, but it came from a stale branch that still assumed llama3_70b_finetune_config was imported in that file.
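This is the classic stale-branch failure: a merged branch still imports a name that a newer commit removed. A hypothetical stdlib-only check (the function name and module names here are illustrative, not from the repo) can flag such imports statically:

```python
import ast

def missing_imports(source: str, available: set[str]) -> list[str]:
    """Return names that `from x import y` statements pull in
    but `available` (the module's surviving names) no longer defines."""
    tree = ast.parse(source)
    missing = []
    for node in ast.walk(tree):
        if isinstance(node, ast.ImportFrom):
            for alias in node.names:
                if alias.name not in available:
                    missing.append(alias.name)
    return missing

# The stale branch still imported the removed config:
stale = "from llama3 import llama3_70b_finetune_config"
print(missing_imports(stale, {"llama3_70b_peft_config"}))
# -> ['llama3_70b_finetune_config']
```

Running a check like this in CI against the target branch's actual module contents would catch this class of merge skew before it lands.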

What does this PR do ?

Add a one line overview of what this PR aims to accomplish.

Changelog

  • Add specific line by line info of high level changes in this PR.

GitHub Actions CI

See the CI section in the Contributing doc for how to trigger the CI. An NVIDIA developer will need to approve and trigger the CI for external contributors.

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex etc)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

If you haven't finished some of the above items, you can still open a "Draft" PR.

Additional Information

  • Related to # (issue)

Summary by CodeRabbit

  • Chores
    • Updated model fine-tuning configuration initialization and parameter setup for improved consistency across deployment configurations.

Signed-off-by: Keval Morabia <28916987+kevalmorabia97@users.noreply.github.com>
@coderabbitai
Contributor

coderabbitai bot commented Feb 27, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between aca1816 and 086fae5.

📒 Files selected for processing (1)
  • scripts/performance/configs/llama/llama3_llm_finetune.py

📝 Walkthrough

Walkthrough

This change refactors LLAMA3 LoRA finetune configuration initialization by replacing two separate config constructors with a unified PEFT config approach, explicitly setting mixed precision, sequence length to 4096, and related configuration fields.

Changes

Cohort: LLAMA3 LoRA Config Refactoring
File(s): scripts/performance/configs/llama/llama3_llm_finetune.py
Summary: Replaced separate LoRA config constructors with a unified llama3_70b_peft_config(peft_scheme="lora") for both the b300 and b200 models, then explicitly set mixed_precision from precision_config and seq_length to 4096 across the model, dataset, and packed_sequence_specs configurations.
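The refactor described in the walkthrough can be sketched as follows. This is a minimal stand-in, not the actual NeMo code: the dataclasses, the field names, and the `llama3_70b_peft_config` signature are assumptions modeled on the summary above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PrecisionConfig:
    # Stand-in for the script's precision_config; field name is assumed.
    mixed_precision: str = "bf16-mixed"

@dataclass
class FinetuneRecipe:
    # Minimal stand-in for the recipe object the script configures.
    peft_scheme: str
    seq_length: int = 2048
    mixed_precision: Optional[str] = None

def llama3_70b_peft_config(peft_scheme: str) -> FinetuneRecipe:
    # Unified PEFT entry point that replaced the removed
    # llama3_70b_finetune_config constructor.
    return FinetuneRecipe(peft_scheme=peft_scheme)

def build_lora_recipe(precision_config: PrecisionConfig) -> FinetuneRecipe:
    # New pattern from this PR: one unified constructor for both the
    # b300 and b200 variants, then explicit field assignment instead of
    # two separate LoRA config constructors.
    recipe = llama3_70b_peft_config(peft_scheme="lora")
    recipe.mixed_precision = precision_config.mixed_precision
    recipe.seq_length = 4096  # also applied to dataset and packed_sequence_specs
    return recipe

recipe = build_lora_recipe(PrecisionConfig())
print(recipe.peft_scheme, recipe.seq_length)  # lora 4096
```

The design point is that a single constructor plus explicit field assignment cannot go stale the way two parallel per-variant constructors can: removing one of them is exactly what broke the import this PR fixes.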

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs

Suggested labels

performance

🚥 Pre-merge checks | ✅ 4
✅ Passed checks (4 passed)
  • Description Check ✅ Passed — Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed — The title accurately describes the main change: fixing a lint and import error in the llama3_llm_finetune.py performance script, which aligns with the changeset that resolves import issues by replacing outdated config constructors.
  • Docstring Coverage ✅ Passed — Docstring coverage is 100.00%, which is sufficient; the required threshold is 80.00%.
  • Test Results For Major Changes ✅ Passed — This PR fixes a lint and import error by replacing a removed function reference with an alternative configuration approach. The minimal scope (+12/−12 lines) of this targeted bug fix does not require test results documentation.


@chtruong814 chtruong814 enabled auto-merge (squash) February 27, 2026 23:30
@chtruong814 chtruong814 merged commit 5c86b2b into main Feb 27, 2026
64 of 66 checks passed
@chtruong814 chtruong814 deleted the kmorabia/fix-lint-perf branch February 27, 2026 23:38
malay-nagda pushed a commit that referenced this pull request Mar 2, 2026
Signed-off-by: Keval Morabia <28916987+kevalmorabia97@users.noreply.github.com>
malay-nagda added a commit that referenced this pull request Mar 2, 2026
… (#2608)

Signed-off-by: Keval Morabia <28916987+kevalmorabia97@users.noreply.github.com>
Co-authored-by: Keval Morabia <28916987+kevalmorabia97@users.noreply.github.com>
copy-pr-bot bot pushed a commit that referenced this pull request Mar 19, 2026
Signed-off-by: Keval Morabia <28916987+kevalmorabia97@users.noreply.github.com>

Labels

None yet

Projects

None yet


3 participants