Conversation
Code Review
This pull request updates the pinned commit for the flash-attention dependency. The change itself is straightforward; my review includes one suggestion: document the reason for this specific commit hash directly in the code, so future maintainers understand why it was pinned and can update it with confidence.
```diff
 vllm-flash-attn
   GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
-  GIT_TAG 86f8f157cf82aa2342743752b97788922dd7de43
+  GIT_TAG 1c81743f90ed982461bb3f9a0cef7aa361ee2f11
```
For long-term maintainability, it's important to document 'magic' values like this commit hash. The PR description provides context, but best practice is to record it directly in the code. Please add a comment explaining why this specific commit is pinned; this will help future developers understand the dependency and make informed decisions when updating it.
```cmake
# Pick up fix from https://github.com/vllm-project/flash-attention/pull/116
GIT_TAG 1c81743f90ed982461bb3f9a0cef7aa361ee2f11
```
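For context, the suggested comment would sit inside the dependency's fetch declaration. The sketch below is a hypothetical reconstruction of how that declaration might look with the comment applied; it assumes vLLM fetches `vllm-flash-attn` via CMake's `FetchContent_Declare` (only the repository URL and tag are taken from the diff above, the rest is illustrative):

```cmake
# Hypothetical sketch: FetchContent declaration for vllm-flash-attn
# with the pinned commit documented inline.
include(FetchContent)

FetchContent_Declare(
  vllm-flash-attn
  GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
  # Pick up fix from https://github.com/vllm-project/flash-attention/pull/116
  GIT_TAG 1c81743f90ed982461bb3f9a0cef7aa361ee2f11
)
FetchContent_MakeAvailable(vllm-flash-attn)
```

Keeping the explanatory comment adjacent to `GIT_TAG` means anyone bumping the pin sees the rationale in the same place they edit the hash.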
This pull request has merge conflicts that must be resolved before it can be merged.
Force-pushed from 17cd46b to 2ec6bf4.
Signed-off-by: Lucas Wilkinson <LucasWilkinson@users.noreply.github.com>
Update FA to pick up
vllm-project/flash-attention#116