
Refactor so that environment and engine #65

Merged: 2 commits merged into main on May 2, 2024
Conversation

qihqi (Collaborator) commented May 2, 2024

so that they don't depend on llama-specific stuff such as ModelArgs

@qihqi qihqi requested review from wang2yn84 and FanhaiLu1 May 2, 2024 01:54
FanhaiLu1 (Collaborator) left a comment

For the 'tokenize_and_pad' error: JetStream removed tokenize_and_pad in a recent change. I added it back in PR AI-Hypercomputer/JetStream#70; the test should pass after a re-run.

```diff
@@ -630,7 +626,7 @@ def samples_per_slot(self) -> int:

     @property
     def max_prefill_length(self) -> int:
-        return self.param.max_seq_len
+        return self.env.max_input_sequence_length
```
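The change above captures the refactor's direction: the engine reads configuration from a shared environment object instead of llama's ModelArgs. A minimal sketch of that decoupling, assuming a `JetEngineEnvironment` dataclass (only the field name `max_input_sequence_length` comes from the diff; the class and defaults here are illustrative):

```python
from dataclasses import dataclass


@dataclass
class JetEngineEnvironment:
    """Model-agnostic engine settings (hypothetical sketch)."""
    max_input_sequence_length: int = 1024


class Engine:
    def __init__(self, env: JetEngineEnvironment):
        self.env = env

    @property
    def max_prefill_length(self) -> int:
        # Previously: self.param.max_seq_len (llama-specific ModelArgs).
        return self.env.max_input_sequence_length


engine = Engine(JetEngineEnvironment(max_input_sequence_length=2048))
print(engine.max_prefill_length)  # 2048
```

With this shape, any model implementation can be driven by the same environment object, so the engine never imports llama's ModelArgs.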
Review comment on the diff above:

Great, it's much clearer now. Engineers might still need to work out that max_seq_len maps to max_prefill_length.

jetstream_pt/engine.py (outdated review thread, resolved)
qihqi (Collaborator, Author) commented May 2, 2024

> For 'tokenize_and_pad' error, jetstream removed tokenize_and_pad in recent change, I added it back in PR google/JetStream#70, it should pass after re-run the test.

I added a command in install_everything.sh to pin JetStream to v0.2.0. We can manually update the hashes and do fixups.
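A pin of that sort in install_everything.sh might look like the following sketch; the repository URL, tag name, and install steps are assumptions for illustration, not taken from the PR:

```shell
# Hypothetical install_everything.sh fragment: build JetStream from the
# v0.2.0 tag instead of tracking the default branch (repo URL assumed).
JETSTREAM_TAG=v0.2.0
git clone https://github.com/AI-Hypercomputer/JetStream.git
cd JetStream
git checkout "$JETSTREAM_TAG"
pip install -e .
```

Pinning to a tag keeps CI stable when upstream makes breaking changes (like the tokenize_and_pad removal above); bumping the pin then becomes a deliberate, reviewable step.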

@qihqi qihqi requested a review from FanhaiLu1 May 2, 2024 17:10
@FanhaiLu1 FanhaiLu1 merged commit d507086 into main May 2, 2024
3 checks passed