forked from NVIDIA/NeMo
Commit
Virtual pipeline parallel support for MegatronGPTSFTModel (NVIDIA#7964)
* Virtual pipeline parallel support for MegatronGPTSFTModel
* Deduplicate word embedding init code in MegatronGPTModel and MegatronGPTSFTModel into one method
* Correct TP group init call in MegatronGPTSFTModel to check for both TE and MCore, as in MegatronGPTModel
* Correct accidental double pipeline parallel size check
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Correct get_gpt_module_list -> get_model_module_list in the SFT model

Signed-off-by: Valerie Sarge <[email protected]>
Co-authored-by: Eric Harper <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Signed-off-by: Sasha Meister <[email protected]>
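The get_gpt_module_list -> get_model_module_list fix reflects the core shape of this change: with virtual pipeline parallelism, each pipeline rank holds several model chunks, so code that assumed a single module must iterate over a list. The following is a hypothetical sketch of that pattern, not NeMo's actual code; ModelChunk, get_model_module_list, and init_word_embeddings are illustrative names only.

```python
# Hypothetical sketch of the virtual-pipeline-parallel pattern.
# Not NeMo's actual implementation; all names here are illustrative.

class ModelChunk:
    """Stand-in for one model chunk held by a pipeline rank."""

    def __init__(self, layer_ids):
        self.layer_ids = layer_ids


def get_model_module_list(model):
    """Normalize to a list: one chunk without virtual PP, several with it."""
    return model if isinstance(model, list) else [model]


def init_word_embeddings(model):
    # A single shared init path (as in the deduplication described above)
    # can serve both cases once it loops over every chunk.
    initialized = []
    for chunk in get_model_module_list(model):
        initialized.extend(chunk.layer_ids)
    return initialized


# Without virtual pipeline parallelism: one chunk per rank.
single = ModelChunk([0, 1, 2, 3])
print(init_word_embeddings(single))   # [0, 1, 2, 3]

# With virtual pipeline parallelism: multiple chunks per rank.
virtual = [ModelChunk([0, 1]), ModelChunk([4, 5])]
print(init_word_embeddings(virtual))  # [0, 1, 4, 5]
```

Normalizing to a list up front means downstream code never needs to branch on whether virtual pipeline parallelism is enabled.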
1 parent: 8ff1683
Commit: 74fea6d
Showing 2 changed files with 26 additions and 32 deletions.