
Conversation

@gante (Contributor) commented on Sep 20, 2024

What does this PR do?

test_static_cache_matches_dynamic is VERY flaky on some VLMs. This PR tags it as flaky to avoid breaking CI while the failure is being investigated (cc @zucchini-nlp).

This test was recently enabled on VLMs in #33533.
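
For context, tests in transformers are usually tagged as flaky with the `is_flaky` decorator from `transformers.testing_utils`. A minimal sketch of what tagging the test might look like; the test class and body here are illustrative placeholders, not the actual VLM test code:

```python
import unittest

from transformers.testing_utils import is_flaky


class DummyVlmModelTest(unittest.TestCase):  # hypothetical test class, for illustration only
    @is_flaky()  # re-runs the test a few times before reporting a failure to CI
    def test_static_cache_matches_dynamic(self):
        # the real test generates with a dynamic and a static KV cache
        # and asserts that the resulting logits match within tolerance
        ...
```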

@gante requested a review from @LysandreJik on September 20, 2024.
```diff
     return_dict_in_generate=True,
 )
-self.assertTrue(torch.allclose(dynamic_out.logits[0], static_out.logits[0], rtol=1e-3, atol=1e-3))
+self.assertTrue(torch.allclose(dynamic_out.logits[0], static_out.logits[0], rtol=1e-3, atol=1e-4))
```
@gante (Contributor, Author) commented on the diff above:

restores the original precision
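
For reference, `torch.allclose(a, b, rtol, atol)` passes when `|a - b| <= atol + rtol * |b|` elementwise, so moving `atol` from 1e-3 back to 1e-4 restores the stricter absolute tolerance the assertion originally used. A standalone sketch with made-up tensors (not the actual test values) showing the effect of the change:

```python
import torch

a = torch.tensor([1.0000])
b = torch.tensor([1.0015])  # absolute difference of 1.5e-3

# looser tolerance: threshold ≈ 1e-3 + 1e-3 * 1.0015 ≈ 2.0e-3, so the check passes
print(torch.allclose(a, b, rtol=1e-3, atol=1e-3))  # True

# restored tolerance: threshold ≈ 1e-4 + 1e-3 * 1.0015 ≈ 1.1e-3, so the check fails
print(torch.allclose(a, b, rtol=1e-3, atol=1e-4))  # False
```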

@HuggingFaceDocBuilderDev commented

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@zucchini-nlp (Member) left a comment

Thanks for flagging it, will definitely check

@gante (Contributor, Author) commented on Oct 1, 2024

(cc @LysandreJik -- can we merge this band-aid for the time being? It is a common failure in our CI :) )

@LysandreJik (Member) left a comment

yes sure!

@gante merged commit 6f0ce52 into huggingface:main on Oct 3, 2024.
@gante deleted the flaky_test_static_cache_matches_dynamic branch on October 3, 2024 at 11:27.
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request on Dec 5, 2024.