[Bugfix] Require triton >= 3.0.0 to resolve issue with MoE and TP>1#6304

Closed
tdoublep wants to merge 1 commit into vllm-project:main from tdoublep:bump-triton-version

Conversation

@tdoublep
Member

Fixes #6103 via triton-lang/triton#4295

This would make #6140 redundant but let's see if bumping Triton causes any issues in CI.

Signed-off-by: Thomas Parnell <tpa@zurich.ibm.com>
@tdoublep
Member Author

Hmm, closing this for now since it seems to conflict with the torch version currently being used:

INFO: pip is looking at multiple versions of torch to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install -r requirements-cuda.txt (line 7) and triton>=3.0.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested triton>=3.0.0
    torch 2.3.0 depends on triton==2.3.0; platform_system == "Linux" and platform_machine == "x86_64" and python_version < "3.12"

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

We will need to wait for torch to move to Triton 3.0.0, I guess.
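To make the conflict above concrete, here is a minimal sketch (not from the PR) of the environment marker attached to torch 2.3.0's `triton==2.3.0` pin; whenever the marker evaluates to true, a request for `triton>=3.0.0` in the same environment is unsatisfiable. The function name and explicit parameters are illustrative only.

```python
# Sketch: evaluate torch 2.3.0's environment marker for its Triton pin:
#   triton==2.3.0; platform_system == "Linux" and platform_machine == "x86_64"
#                  and python_version < "3.12"
# The helper below is hypothetical; real marker evaluation is done by pip.
def torch_230_pins_triton(system: str, machine: str, py: tuple) -> bool:
    # The pin only applies on x86_64 Linux with Python < 3.12.
    return system == "Linux" and machine == "x86_64" and py < (3, 12)

# On a typical x86_64 Linux box with Python 3.10 the pin applies,
# so triton>=3.0.0 cannot be installed alongside torch 2.3.0.
print(torch_230_pins_triton("Linux", "x86_64", (3, 10)))   # True
# On an arm64 macOS machine the marker is false, so there is no pin.
print(torch_230_pins_triton("Darwin", "arm64", (3, 10)))   # False
```

This is why loosening the range (pip's first suggestion) cannot help here: as long as torch 2.3.0 is in `requirements-cuda.txt`, the exact `==2.3.0` pin wins on any platform where the marker holds.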

@tdoublep tdoublep closed this Jul 10, 2024
@comaniac
Collaborator

To update this, we will need to wait for a torch release that depends on Triton 3.0, followed by xformers, vllm-flash-attn, and flashinfer.

@jeejeelee
Collaborator

I think #6140 can serve as a solution for triton < 3.0.0.



Development

Successfully merging this pull request may close these issues.

[Bug]: fused_moe_kernel compile bug
