
Minor fixes to PT2 export path: enum typo and max_seq_len #1343

Merged
merged 1 commit into main on Nov 5, 2024

Conversation

Jack-Khuu
Contributor

Two mini fixes on top of #896:

  • Enum logic typo (not caught in the original testing due to Python magic — a mistyped enum comparison fails silently rather than raising)
  • max_seq_len discrepancy (found when testing)
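As a sketch of the enum pitfall mentioned above (hypothetical names, not the actual torchchat code): in Python, comparing an enum member against a raw string evaluates to `False` instead of raising an error, so a typo in such a check can silently route execution down the wrong branch.

```python
from enum import Enum

class OutputFormat(Enum):
    DSO = "dso"
    PT2 = "pt2"

fmt = OutputFormat.PT2

# Buggy: an enum member never equals a bare string, so this condition
# is always False -- no exception, the branch is just silently skipped.
if fmt == "pt2":
    chosen = "package"
else:
    chosen = "fallback"
assert chosen == "fallback"  # the bug goes unnoticed

# Fixed: compare against the enum member (or its .value) explicitly.
if fmt is OutputFormat.PT2:
    chosen = "package"
assert chosen == "package"
```

This is why such typos can survive testing: nothing fails loudly unless a test exercises the exact branch that was skipped.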
```
python3 torchchat.py export llama3.1 --quantize '{"precision": {"dtype":"bfloat16"}, "executor":{"accelerator":"cuda"}}' --output-aoti-package-path /tmp/model3.pt2

python3 torchchat.py generate llama3.1 --aoti-package-path /tmp/model3.pt2 --prompt "Once upon a time," --num-samples 3
```

Compared to

```
python3 torchchat.py export llama3.1 --quantize '{"precision": {"dtype":"bfloat16"}, "executor":{"accelerator":"cuda"}}' --output-dso-path /tmp/model3.so

python3 torchchat.py generate llama3.1 --dso-path /tmp/model3.so --prompt "Once upon a time," --num-samples 3
```

pytorch-bot bot commented Nov 5, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchchat/1343

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit f35d5fc with merge base 4510ba0:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Meta Open Source bot. label Nov 5, 2024
@Jack-Khuu Jack-Khuu merged commit 54455a3 into main Nov 5, 2024
52 checks passed