
fix(deps): breaking change from transformers 5.4.0 #3231

Merged
imstevenpmwork merged 5 commits into main from fix/deps-transformers-5.4.0 on Mar 27, 2026

fix(deps): breaking change from transformers 5.4.0#3231
imstevenpmwork merged 5 commits intomainfrom
fix/deps-transformers-5.4.0

Conversation

@Maximellerbach (Member) commented Mar 27, 2026

This PR fixes breaking changes introduced in transformers 5.4.0:

  1. Replace is_flash_attn_greater_or_equal_2_10 with is_flash_attn_greater_or_equal("2.1.0"). Explained here; see the sketch below.
  2. Remove @dataclass from classes that inherit from PretrainedConfig. Explained here; see the second sketch further down.
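
For the first change, here is a minimal before/after sketch, assuming is_flash_attn_greater_or_equal is importable from transformers.utils as in recent transformers releases; the class name and the _flash_attn_uses_top_left_mask flag are simplified stand-ins for the real attention modules touched by this PR:

```python
from transformers.utils import is_flash_attn_greater_or_equal


class Florence2FlashAttention2:  # simplified stand-in for the real attention module
    def __init__(self):
        # Before (helper removed in transformers 5.4.0):
        #   from transformers.utils import is_flash_attn_greater_or_equal_2_10
        #   self._flash_attn_uses_top_left_mask = not is_flash_attn_greater_or_equal_2_10()
        # After: pass the flash-attn version string explicitly to the generic helper.
        # flash-attn >= 2.1 aligns the causal mask to the bottom-right, hence the check.
        self._flash_attn_uses_top_left_mask = not is_flash_attn_greater_or_equal("2.1.0")
```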

Finally, we bumped LeRobot's minimum requirement to transformers>=5.4.0, to avoid compatibility issues with releases matching transformers>=5.3.0,<5.4.0 that the removal of @dataclass would otherwise cause.
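
And a hedged sketch of the second change, using a hypothetical config class (not an actual LeRobot config): subclasses of PretrainedConfig set their fields in __init__ and forward extra keyword arguments to super().__init__, instead of relying on the dataclass machinery that transformers >= 5.4.0 no longer tolerates on config subclasses.

```python
from transformers import PretrainedConfig


# Hypothetical config, for illustration only. Previously it might have been written as:
#   @dataclass
#   class MyPolicyConfig(PretrainedConfig):
#       hidden_size: int = 768
# which breaks under transformers >= 5.4.0.
class MyPolicyConfig(PretrainedConfig):
    model_type = "my_policy"  # illustrative value, not a registered model type

    def __init__(self, hidden_size: int = 768, **kwargs):
        super().__init__(**kwargs)
        self.hidden_size = hidden_size
```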

Copilot AI review requested due to automatic review settings March 27, 2026 15:05
@github-actions bot added the policies (Items related to robot policies) label Mar 27, 2026

Copilot AI (Contributor) left a comment


Pull request overview

Updates XVLA (Florence2) and Wall-X (Qwen2.5-VL MoE) model implementations to adapt to a Transformers 5.4+ breaking change where is_flash_attn_greater_or_equal_2_10 was removed.

Changes:

  • Replace is_flash_attn_greater_or_equal_2_10 imports with is_flash_attn_greater_or_equal.
  • Update FlashAttention version checks used to determine causal mask alignment behavior.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| src/lerobot/policies/xvla/modeling_florence2.py | Switches the FlashAttention version helper import/call used by the Florence2 FlashAttention module. |
| src/lerobot/policies/wall_x/qwen_model/qwen2_5_vl_moe.py | Switches the FlashAttention version helper import/call used by the Qwen2.5-VL FlashAttention module. |


Maximellerbach and others added 4 commits March 27, 2026 16:13
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Signed-off-by: Maxime Ellerbach <maxime@ellerbach.net>
@imstevenpmwork (Collaborator) left a comment


LGTM, thanks!

@imstevenpmwork (Collaborator) commented:

Merging this one despite the red CI so we can unblock other PRs. The failures related to transformers v5.4 and the logins test will be addressed shortly.

@imstevenpmwork imstevenpmwork merged commit 0750286 into main Mar 27, 2026
12 of 15 checks passed
@imstevenpmwork imstevenpmwork deleted the fix/deps-transformers-5.4.0 branch March 27, 2026 20:25
imstevenpmwork added a commit that referenced this pull request Mar 30, 2026
imstevenpmwork added a commit that referenced this pull request Mar 30, 2026
* Revert "fix(deps): breaking change from transformers 5.4.0 (#3231)"

This reverts commit 0750286.

* chore(dependecies): pin transformers to 5.3.0 temporarily

Labels

policies Items related to robot policies

3 participants