Conversation

@ydshieh ydshieh (Collaborator) commented Oct 5, 2023

What does this PR do?

Fix failing tests on main due to torch 2.1

@ydshieh (Collaborator, Author) commented:

For Hubert and Wav2Vec2, FX tracing starts to fail with torch 2.1.

@ydshieh (Collaborator, Author) commented on lines 527 to 529:

Despite the changes in the modeling files, this part still fails.

@ArthurZucker ArthurZucker (Collaborator) left a comment:

Thanks, let's try to use a unittest skip and put the torch FX proxy check in the model's forward.

@ArthurZucker (Collaborator) commented:

Instead of commenting the test out, we can add something like

    self.skipTest("Skipping until we fix it")
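A minimal sketch of the suggested pattern (the test class and method names here are hypothetical, not the actual Transformers tests): calling self.skipTest marks the test as skipped in the report instead of leaving it commented out and forgotten.

```python
import unittest

class Wav2Vec2FxTracingTest(unittest.TestCase):
    # Hypothetical stand-in for the failing FX tracing test.
    def test_fx_tracing(self):
        # Recorded as "skipped" in the test report, so it stays visible
        # until the underlying torch 2.1 issue is fixed.
        self.skipTest("Skipping until we fix it")

# Running the case records one skipped test and no failures.
result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(Wav2Vec2FxTracingTest).run(result)
```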

@ArthurZucker (Collaborator) commented:

Same here.

@ArthurZucker (Collaborator) commented:

Let's do the check a lot earlier; we can check this in the model's forward only once!

@ydshieh (Collaborator, Author) commented:

I am not sure about this place. This block contains

            attn_weights = attn_weights.view(bsz, self.num_heads, tgt_len, src_len) + attention_mask
            attn_weights = attn_weights.view(bsz * self.num_heads, tgt_len, src_len)

and it should run whether or not we are tracing with FX.
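One way to reconcile the two requirements, as a sketch (the function name and error message are illustrative, not the actual modeling code): guard only the eager-mode shape check, so the view and mask addition still run under FX tracing.

```python
import torch

def apply_attention_mask(attn_weights, attention_mask, bsz, num_heads, tgt_len, src_len):
    # Eager-only sanity check: a torch.fx.Proxy has no concrete size,
    # so the comparison is only done on real tensors.
    if not isinstance(attn_weights, torch.fx.Proxy):
        if attn_weights.size() != (bsz * num_heads, tgt_len, src_len):
            raise ValueError("unexpected attention weights shape")
    # This part must run no matter whether we are tracing or not.
    attn_weights = attn_weights.view(bsz, num_heads, tgt_len, src_len) + attention_mask
    return attn_weights.view(bsz * num_heads, tgt_len, src_len)
```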

@LysandreJik LysandreJik (Member) left a comment:

LGTM, let's ensure we check for eventual performance issues and fix the TODOs quickly.

Thanks for fixing these tests so quickly!

@HuggingFaceDocBuilderDev commented Oct 5, 2023

The documentation is not available anymore as the PR was closed or merged.

@ArthurZucker ArthurZucker (Collaborator) left a comment:

Thanks

@ydshieh ydshieh (Collaborator, Author) commented Oct 5, 2023

@michaelbenayoun Could you help us with the torch FX tests for Wav2Vec2/Hubert with torch 2.1?

See this internal discussion

But in short, it can't do

    if attn_weights.size() != (bsz * self.num_heads, tgt_len, src_len):

as the corresponding proxy object has no _metadata attribute with torch 2.1, while it does have one with torch 2.0.
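A minimal repro of why such a comparison breaks under symbolic tracing (the module is a toy stand-in, not the actual Hubert/Wav2Vec2 code): without shape metadata on the proxy, a size comparison cannot be resolved to a concrete bool, so branching on it makes tracing fail.

```python
import torch
from torch import fx, nn

class TinyAttn(nn.Module):
    # Toy stand-in for the real attention module.
    def forward(self, attn_weights):
        # Under fx.symbolic_trace, attn_weights is a Proxy: .size() and the
        # != comparison just build more proxy nodes, and using the result
        # in an `if` needs a concrete bool the tracer cannot provide.
        if attn_weights.size() != (8, 3, 5):
            raise ValueError("unexpected shape")
        return attn_weights

try:
    fx.symbolic_trace(TinyAttn())
    traced_ok = True
except Exception as exc:
    traced_ok = False
    print(type(exc).__name__)  # typically a torch.fx trace error
```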

@ydshieh ydshieh merged commit 54e17a1 into main Oct 5, 2023
@ydshieh ydshieh deleted the fix_ci branch October 5, 2023 08:27