
Implementation of the paper "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" from Google, in PyTorch.


Infini Attention

Install

$ pip install infini-torch

Usage

import torch
from infini_torch.attention import InfiniAttention

# Create a random tensor of shape (1, 32, 64)
x = torch.randn(1, 32, 64)

# Create an instance of InfiniAttention with input size 64
attn = InfiniAttention(64)

# Apply the attention mechanism to the input tensor
out = attn(x)

# Print the shape of the output tensor
print(out.shape)
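The core idea of the paper is a compressive memory: each segment's keys and values are folded into a fixed-size matrix via a linear-attention update, and later queries retrieve from that matrix, so context length is unbounded while memory stays constant. The snippet below is a minimal sketch of that memory update and retrieval in plain PyTorch; it is an illustration of the mechanism, not the `infini-torch` internals, and the function names (`sigma`, `infini_step`) are invented here for clarity.

```python
import torch
import torch.nn.functional as F


def sigma(x):
    # ELU + 1 keeps features positive, the nonlinearity used in linear attention
    return F.elu(x) + 1


def infini_step(q, k, v, M, z):
    """One segment of the compressive-memory mechanism (sketch).

    q, k, v: (seq_len, d) projections for the current segment
    M: (d, d) compressive memory accumulated over past segments
    z: (d,) normalization term

    Returns the memory-retrieved values and the updated (M, z).
    """
    sq = sigma(q)
    # Retrieve from memory written by previous segments
    A_mem = (sq @ M) / (sq @ z).clamp(min=1e-6).unsqueeze(-1)
    # Fold the current segment's keys/values into the memory
    sk = sigma(k)
    M = M + sk.transpose(0, 1) @ v
    z = z + sk.sum(dim=0)
    return A_mem, M, z


d = 64
M = torch.zeros(d, d)
z = torch.zeros(d)
# Stream four segments of length 32; memory size stays (d, d) throughout
for _ in range(4):
    q, k, v = (torch.randn(32, d) for _ in range(3))
    A_mem, M, z = infini_step(q, k, v, M, z)
print(A_mem.shape)  # torch.Size([32, 64])
```

In the paper the retrieved `A_mem` is then blended with ordinary local dot-product attention through a learned gate, so each head can trade off long-range memory against the current segment.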

License

MIT
