
Commit 4bdea0d

Fix the added_proj_bias default value (#8800)
1 parent 84708c4 commit 4bdea0d

File tree

1 file changed: +1 addition, −1 deletion


src/diffusers/models/attention_processor.py

Lines changed: 1 addition & 1 deletion
@@ -103,7 +103,7 @@ def __init__(
     cross_attention_norm_num_groups: int = 32,
     qk_norm: Optional[str] = None,
     added_kv_proj_dim: Optional[int] = None,
-    added_proj_bias: Optional[bool] = None,
+    added_proj_bias: Optional[bool] = True,
     norm_num_groups: Optional[int] = None,
     spatial_norm_dim: Optional[int] = None,
     out_bias: bool = True,
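
For context, added_proj_bias controls whether the extra key/value projection layers created when added_kv_proj_dim is set receive a bias term. Below is a minimal sketch of why the old None default was problematic, assuming the flag is forwarded directly to torch.nn.Linear; the helper name and dimensions are illustrative, not the exact diffusers code.

import torch.nn as nn

def build_added_kv_projections(added_kv_proj_dim: int, inner_dim: int, added_proj_bias=True):
    # nn.Linear treats its `bias` argument as a boolean flag; None is falsy,
    # so the layers would silently be created without a bias parameter.
    add_k_proj = nn.Linear(added_kv_proj_dim, inner_dim, bias=added_proj_bias)
    add_v_proj = nn.Linear(added_kv_proj_dim, inner_dim, bias=added_proj_bias)
    return add_k_proj, add_v_proj

# Old default (None): bias is effectively disabled.
k, _ = build_added_kv_projections(64, 128, added_proj_bias=None)
assert k.bias is None

# Corrected default (True): the projections keep their bias.
k, _ = build_added_kv_projections(64, 128)
assert k.bias is not None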
