
Update AO pin to pickup lowbit kernel subclass #7759

Merged: 1 commit merged into main on Jan 18, 2025

Conversation

Jack-Khuu (Contributor) commented:

Summary: See title (update the AO pin so the lowbit kernel subclass is picked up).

Test plan: CI
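
For context, the lowbit kernel tensor subclass referenced in the title ships with torchao; this pin bump lets ExecuTorch pick it up. The sketch below is illustrative only: it applies a torchao tensor-subclass quantization config to a model via the stable `quantize_` API, using `int8_weight_only` as a stand-in, since the exact entry point for the experimental lowbit (sub-8-bit) kernels may differ across torchao versions.

```python
# Illustrative sketch, not the config added by this PR: shows how a torchao
# tensor-subclass quantization config is applied to a model's linear layers.
# int8_weight_only is a stand-in; the lowbit kernel subclass picked up by this
# pin bump lives in torchao's experimental kernels and may use a different name.
import torch
import torch.nn as nn
from torchao.quantization import quantize_, int8_weight_only

model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 64)).eval()

# Swap Linear weights for torchao quantized tensor subclasses in place.
quantize_(model, int8_weight_only())

# A regular forward pass still works; the usual torch.export / ExecuTorch
# lowering flow would follow from here and is unchanged by this PR.
out = model(torch.randn(1, 256))
```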


pytorch-bot bot commented Jan 18, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/7759

Note: Links to docs will display an error until the docs builds have been completed.

❌ 2 New Failures, 2 Pending

As of commit 8d3ef87 with merge base 66bfd75:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Jan 18, 2025. (This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.)
@facebook-github-bot (Contributor) commented:

@Jack-Khuu has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@Jack-Khuu merged commit 9836b39 into main on Jan 18, 2025
45 of 47 checks passed
@Jack-Khuu deleted the bump-ao branch on January 18, 2025 at 01:42
zonglinpeng pushed a commit to zonglinpeng/executorch that referenced this pull request on Jan 30, 2025
Labels
- CLA Signed (This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.)
- release notes: quantization (Changes to quantization)
Projects: None yet
3 participants