make it so perceiver io decoding queries can omit the batch dimension
lucidrains committed Oct 10, 2021
1 parent 483a6ed commit 2a1b039
Showing 3 changed files with 7 additions and 2 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -65,7 +65,7 @@ model = PerceiverIO(
)

seq = torch.randn(1, 512, 32)
-queries = torch.randn(1, 128, 32)
+queries = torch.randn(128, 32)

logits = model(seq, queries = queries) # (1, 128, 100) - (batch, decoder seq, logits dim)
```
5 changes: 5 additions & 0 deletions perceiver_pytorch/perceiver_io.py
@@ -171,6 +171,11 @@ def forward(
         if not exists(queries):
             return x

+        # make sure queries contains batch dimension
+
+        if queries.ndim == 2:
+            queries = repeat(queries, 'n d -> b n d', b = b)
+
         # cross attend from decoder queries to latents

         latents = self.decoder_cross_attn(queries, context = x)
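The added lines tile a 2-D query tensor across the batch with einops' `repeat` before cross-attention. The same shape-normalization can be sketched framework-agnostically with NumPy broadcasting (a minimal illustration, not the library's code; `ensure_batch_dim` is a hypothetical helper name):

```python
import numpy as np

def ensure_batch_dim(queries: np.ndarray, batch_size: int) -> np.ndarray:
    # mirror the commit's logic: if queries lack a batch dimension,
    # tile them across the batch, i.e. 'n d -> b n d'
    if queries.ndim == 2:
        queries = np.broadcast_to(queries, (batch_size,) + queries.shape).copy()
    return queries

# unbatched queries (n, d) are expanded to (b, n, d)
q = np.random.randn(128, 32)
assert ensure_batch_dim(q, batch_size=4).shape == (4, 128, 32)

# already-batched queries pass through unchanged
qb = np.random.randn(4, 128, 32)
assert ensure_batch_dim(qb, batch_size=4).shape == (4, 128, 32)
```

This lets callers share one set of decoder queries across the whole batch without manually expanding them, while still accepting per-example queries.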
2 changes: 1 addition & 1 deletion setup.py
@@ -3,7 +3,7 @@
setup(
name = 'perceiver-pytorch',
packages = find_packages(),
-    version = '0.7.4',
+    version = '0.7.5',
license='MIT',
description = 'Perceiver - Pytorch',
author = 'Phil Wang',
