Commit

[Fix] Fix Mixtral LoRA setting (#312)
set target_modules
LZHgrla authored Jan 12, 2024
1 parent 28c0556 commit 8ab2762
Showing 2 changed files with 6 additions and 0 deletions.
@@ -78,6 +78,9 @@
     r=64,
     lora_alpha=16,
     lora_dropout=0.1,
+    target_modules=[
+        'q_proj', 'k_proj', 'v_proj', 'o_proj', 'w1', 'w2', 'w3'
+    ],
     bias='none',
     task_type='CAUSAL_LM'))
@@ -78,6 +78,9 @@
     r=64,
     lora_alpha=16,
     lora_dropout=0.1,
+    target_modules=[
+        'q_proj', 'k_proj', 'v_proj', 'o_proj', 'w1', 'w2', 'w3'
+    ],
     bias='none',
     task_type='CAUSAL_LM'))
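The fix sets `target_modules` explicitly so the LoRA adapters attach to both the attention projections (`q_proj`/`k_proj`/`v_proj`/`o_proj`) and the Mixtral expert MLP weights (`w1`/`w2`/`w3`), instead of relying on PEFT's per-architecture defaults. A minimal stdlib sketch of the name-suffix matching that PEFT applies when selecting target modules; the module paths below are illustrative of a Mixtral decoder layer, not taken from the diff:

```python
# Module suffixes from the commit's target_modules setting.
TARGET_MODULES = ['q_proj', 'k_proj', 'v_proj', 'o_proj', 'w1', 'w2', 'w3']

def select_lora_targets(module_names, target_modules=TARGET_MODULES):
    """Return module paths whose final component is in target_modules,
    mirroring how PEFT matches target_modules entries by name suffix."""
    return [name for name in module_names
            if name.rsplit('.', 1)[-1] in target_modules]

# Hypothetical paths mimicking a Mixtral layer's layout.
names = [
    'model.layers.0.self_attn.q_proj',
    'model.layers.0.self_attn.o_proj',
    'model.layers.0.block_sparse_moe.experts.0.w1',
    'model.layers.0.block_sparse_moe.gate',   # MoE router: not adapted
    'model.layers.0.input_layernorm',         # norm layers: not adapted
]
print(select_lora_targets(names))  # the q_proj, o_proj and experts.0.w1 paths
```

Note that the router (`gate`) and layer norms fall through the filter, so under this setting they stay frozen while every expert's MLP receives its own adapter.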