[AutoParallel]: fix ci error about print model
liufengwei0103 committed Feb 24, 2025
1 parent 5761457 commit 79c954e
Showing 2 changed files with 3 additions and 2 deletions.
1 change: 0 additions & 1 deletion llm/run_finetune.py
```diff
@@ -430,7 +430,6 @@ def compute_metrics_do_generation(eval_preds):
     trainer.set_optimizer_grouped_parameters(trainable_parameters)
 
     # Train
-    print(trainer.model)
     if training_args.do_train:
         checkpoint = None
         if training_args.resume_from_checkpoint is not None:
```
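The stray `print(trainer.model)` is simply deleted by this commit. If printing the model is still occasionally wanted for debugging, one option is to gate it behind an environment variable so CI output stays clean by default. A minimal sketch; the flag name `PRINT_MODEL_DEBUG` and the helper are hypothetical, not part of this commit:

```python
import os


def maybe_print_model(model, env_flag="PRINT_MODEL_DEBUG"):
    # Print the model structure only when the debug flag is explicitly
    # set to "1", so default CI runs produce no extra log output.
    if os.environ.get(env_flag, "0") == "1":
        print(model)
```

A CI job would then leave the variable unset, while a developer reproducing a failure could export `PRINT_MODEL_DEBUG=1` locally.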
4 changes: 3 additions & 1 deletion scripts/distribute/ci_case_auto.sh
```diff
@@ -3048,7 +3048,9 @@ function llama_lora_static_graph_auto_bs_2_bf16_DP2-TP2-PP1() {
     case_log_dir="output/$task_name""_log"
 
     rm -rf output/$task_name/
+    rm -rf "log/$task_name""_log"
 
+    ls -la ./
+    echo $PWD
 
     python -u -m paddle.distributed.launch \
         --gpus "0,1,2,3" \
```
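The shell additions above clean up the stale log directory and dump the working directory for CI debugging (note the original commit misspelled `echo` as `ehco`, corrected here). The same cleanup/debug steps can be sketched in Python; the helper names are hypothetical and only mirror the shell commands, they are not part of this commit:

```python
import os
import shutil
from pathlib import Path


def clean_task_dirs(task_name):
    # Mirror of the shell cleanup: remove stale output and log
    # directories from a previous run, ignoring missing paths.
    shutil.rmtree(f"output/{task_name}", ignore_errors=True)
    shutil.rmtree(f"log/{task_name}_log", ignore_errors=True)


def debug_cwd():
    # Mirror of `ls -la ./` and `echo $PWD`: list the current
    # directory contents and print the working directory.
    for entry in sorted(Path(".").iterdir()):
        print(entry.name)
    print(os.getcwd())
```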
