Merge pull request #1103 from bghira/main
merge
bghira authored Oct 26, 2024
2 parents 14f4ae3 + 6a7efd8 commit 2340fa7
Showing 3 changed files with 2 additions and 3 deletions.
2 changes: 1 addition & 1 deletion TUTORIAL.md
@@ -83,7 +83,7 @@ Which brings up the next point: **you should use as much high quality training d

### Captioning

-SimpleTuner provides multiple [captioning](/toolkit/captioning/README.md) scripts that can be used to mass-rename files in a format that is acceptable to SimpleTuner.
+SimpleTuner provides multiple [captioning](/toolkit/README.md) scripts that can be used to mass-rename files in a format that is acceptable to SimpleTuner.

Options:

2 changes: 1 addition & 1 deletion helpers/data_backend/factory.py
@@ -453,7 +453,7 @@ def configure_multi_databackend(args: dict, accelerator, text_encoders, tokenize
    accelerator.wait_for_everyone()
    if args.caption_dropout_probability == 0.0:
        logger.warning(
-            "Not using caption dropout will potentially lead to overfitting on captions, eg. CFG will not work very well. Set --caption-dropout_probability=0.1 as a recommended value."
+            "Not using caption dropout will potentially lead to overfitting on captions, eg. CFG will not work very well. Set --caption_dropout_probability=0.1 as a recommended value."
        )

    # We don't compute the text embeds at this time, because we do not really have any captions available yet.
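The warning corrected in the factory.py hunk concerns caption dropout, which occasionally trains on an empty caption so the model also learns the unconditional distribution that classifier-free guidance (CFG) samples against. A minimal sketch of the idea follows; the function name is hypothetical and this is not SimpleTuner's actual implementation:

```python
import random

def maybe_drop_caption(caption: str, dropout_probability: float) -> str:
    """With probability `dropout_probability`, replace the caption with an
    empty string so the model also sees unconditional examples, which is
    what classifier-free guidance relies on at inference time."""
    if random.random() < dropout_probability:
        return ""
    return caption

# Edge cases are deterministic: 0.0 never drops, 1.0 always drops,
# since random.random() returns a value in [0.0, 1.0).
assert maybe_drop_caption("a photo of a cat", 0.0) == "a photo of a cat"
assert maybe_drop_caption("a photo of a cat", 1.0) == ""
```

The recommended `--caption_dropout_probability=0.1` would correspond to dropping roughly one caption in ten during training.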
1 change: 0 additions & 1 deletion helpers/training/save_hooks.py
@@ -250,7 +250,6 @@ def _save_lora(self, models, weights, output_dir):
                output_dir,
                unet_lora_layers=unet_lora_layers_to_save,
                text_encoder_lora_layers=text_encoder_1_lora_layers_to_save,
-                transformer_lora_layers=transformer_lora_layers_to_save,
            )
        elif self.args.model_family == "sdxl" or self.args.model_family == "kolors":
            self.pipeline_class.save_lora_weights(
