Implement max_norm argument for torch.nn.functional.embedding #1699

Open
IvanYashchuk opened this issue Jan 27, 2025 · 0 comments
🚀 Feature

At the moment, passing a non-None max_norm argument raises a NotImplementedError in Thunder:

```python
def embedding_meta(
    a: TensorProxy, /, weight, *, padding_idx=-1, max_norm=None, norm_type=2.0, scale_grad_by_freq=False, sparse=False
) -> TensorProxy:
    # TODO: canonicalize and validate padding_idx against weight.shape[0]
    if max_norm is not None:
        raise NotImplementedError
```
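
For context, a minimal reproduction along these lines (assuming the standard thunder.jit entry point) is expected to hit that error:

```python
import torch
import thunder

def f(indices, weight):
    # max_norm is the argument that is currently unsupported
    return torch.nn.functional.embedding(indices, weight, max_norm=1.0)

jf = thunder.jit(f)
indices = torch.tensor([0, 2])
weight = torch.randn(4, 3)
jf(indices, weight)  # expected to raise NotImplementedError with current Thunder
```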

According to the PyTorch documentation, when this argument is active the weight argument is modified in place. Once we improve general support for in-place operators in Thunder, we need to come back to this op and support this variant.
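
For reference, here is a small sketch of the eager PyTorch behavior Thunder would have to model: the rows of weight selected by the indices are renormalized in place so their norm does not exceed max_norm (exact values depend on the random weights):

```python
import torch
import torch.nn.functional as F

weight = torch.randn(10, 4) * 5       # rows will typically have norm > 1
indices = torch.tensor([1, 3])

before = weight[indices].norm(p=2, dim=1)
out = F.embedding(indices, weight, max_norm=1.0, norm_type=2.0)
after = weight[indices].norm(p=2, dim=1)

print(before)  # norms before the call (typically > 1)
print(after)   # norms clamped to at most 1.0: weight was modified in place
```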
