
Codebook Training Epochs #35

Open · Revliter opened this issue Apr 3, 2024 · 1 comment
Revliter commented Apr 3, 2024

Hello,
Congratulations on the successful development of the SEED model! I am impressed by its capabilities and want to reproduce it locally, but I am running into something confusing. The config for the codebook training of the SEED tokenizer says it takes up to 500 epochs of training on 500M samples. I am wondering whether this is the correct codebook-training config, since it would take an enormous number of GPU hours to finish. I would be grateful if you could clarify this or offer some advice. Thanks for your generous help.
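
For context, here is a rough back-of-the-envelope estimate of why this looks infeasible to me. The epoch count and dataset size are from the released config; the global batch size and throughput are my own assumptions, not values from the SEED repo:

```python
# Back-of-the-envelope training-cost estimate.
# Only `epochs` and `dataset_size` come from the released config;
# `global_batch` and `steps_per_sec` are hypothetical.
epochs = 500                 # from the codebook-training config
dataset_size = 500_000_000   # ~500M samples, per the config
global_batch = 4096          # assumed global batch size
steps_per_sec = 2.0          # assumed throughput on a multi-GPU node

total_samples = epochs * dataset_size
total_steps = total_samples / global_batch
node_hours = total_steps / steps_per_sec / 3600

print(f"{total_steps:.3e} optimizer steps, ~{node_hours:,.0f} node-hours")
```

Even with these fairly generous assumptions, this comes out to tens of millions of optimizer steps and thousands of node-hours, which is why I suspect the config may not be what was actually used.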

@luohao123

Have you got the data?
