
Commit

Merge pull request damian0815#80 from spezialspezial/patch-1
fix small error in call to get_token_ids
damian0815 authored Apr 4, 2024
2 parents 5ac57ad + fa23b83 commit c79844a
Showing 1 changed file with 1 addition and 1 deletion.
src/compel/embeddings_provider.py (1 addition, 1 deletion)
@@ -500,7 +500,7 @@ def tokenizer(self):
    def get_token_ids(self, *args, **kwargs):
        # get token ids does not use padding. The padding ID is the only ID that can differ between tokenizers
        # so for simplicity, we just return `get_token_ids` of the first tokenizer
-       return self.embedding_providers[0].get_token_ids(self, *args, **kwargs)
+       return self.embedding_providers[0].get_token_ids(*args, **kwargs)

    def get_pooled_embeddings(
        self, texts: List[str], attention_mask: Optional[torch.Tensor] = None, device: Optional[str] = None
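For context on why this one-token change matters: `self.embedding_providers[0].get_token_ids` is already a bound method, so its own `self` is supplied automatically. Passing the wrapper's `self` as well forwards the wrapper object as the callee's first real positional argument (its `texts` parameter), shifting every other argument one slot to the right. The sketch below is a simplified, hypothetical reconstruction, not the actual compel classes or signatures, showing why the pre-fix call misbehaves and the post-fix call works.

```python
from typing import List


class SingleProvider:
    """Stand-in for one per-tokenizer embeddings provider (hypothetical, not compel's class)."""

    def get_token_ids(self, texts: List[str], include_start_and_end_markers: bool = True) -> List[List[int]]:
        # Dummy tokenization: one fake id per text, just enough to show the call shape.
        return [[len(t)] for t in texts]


class MultiProvider:
    """Stand-in for the wrapper that delegates to its first provider."""

    def __init__(self, providers: List[SingleProvider]):
        self.embedding_providers = providers

    def get_token_ids_before_fix(self, *args, **kwargs):
        # Buggy delegation: `self` here is the MultiProvider instance, and passing it
        # explicitly makes it the callee's `texts` argument, pushing the real arguments over.
        return self.embedding_providers[0].get_token_ids(self, *args, **kwargs)

    def get_token_ids(self, *args, **kwargs):
        # Fixed delegation: the bound method already carries its own `self`,
        # so only the caller's arguments are forwarded.
        return self.embedding_providers[0].get_token_ids(*args, **kwargs)


multi = MultiProvider([SingleProvider()])
print(multi.get_token_ids(["a prompt"]))  # [[8]] -- delegation works as intended
# multi.get_token_ids_before_fix(["a prompt"]) would try to iterate the MultiProvider
# itself as `texts` and raise a TypeError.
```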
