
Add attentive layer to Jepa #927

Open
antoine-tran wants to merge 14 commits into tuan/support_explicit_init_fn
Conversation

antoine-tran (Contributor)
What does this PR do? Please describe:

This PR follows up on #889 by adding the building blocks (models, builders, loader) for the finetuned JEPA encoder, plus testing scripts for the models on different downstream tasks.
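For readers skimming this PR, here is a minimal sketch (assuming PyTorch, with illustrative names rather than fairseq2's actual API) of what an attentive layer on top of a JEPA encoder does: a learnable query cross-attends over the encoder's patch features to pool them into a single vector for a downstream task head.

```python
import torch
import torch.nn as nn


class AttentivePooler(nn.Module):
    """Pools JEPA patch features into one vector via cross-attention."""

    def __init__(self, model_dim: int, num_heads: int = 8) -> None:
        super().__init__()
        # A single learnable query token summarizes the patch sequence.
        self.query = nn.Parameter(torch.zeros(1, 1, model_dim))
        self.cross_attn = nn.MultiheadAttention(model_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(model_dim)

    def forward(self, encoder_out: torch.Tensor) -> torch.Tensor:
        # encoder_out: (batch, num_patches, model_dim) from the JEPA encoder.
        q = self.query.expand(encoder_out.size(0), -1, -1)
        pooled, _ = self.cross_attn(q, encoder_out, encoder_out)
        return self.norm(pooled.squeeze(1))  # (batch, model_dim)


# Hypothetical downstream use: a linear head on the pooled features.
pooler = AttentivePooler(model_dim=1024)
head = nn.Linear(1024, 10)
logits = head(pooler(torch.randn(4, 256, 1024)))  # (4, 10)
```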

Fixes #{issue number}

Does your PR introduce any breaking changes? If yes, please list them:
List of all backwards-incompatible changes.

Check list:

  • Was the content of this PR discussed and approved via a GitHub issue? (no need for typos or documentation improvements)
  • Did you read the contributor guideline?
  • Did you make sure that your PR does only one thing instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (no need for typos, documentation, or minor internal changes)

facebook-github-bot added the CLA Signed label (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Dec 19, 2024
antoine-tran changed the title from "Tuan/add attentive layer to jepa" to "Add attentive layer to jepa" on Dec 19, 2024
antoine-tran marked this pull request as ready for review on Dec 20, 2024 at 12:58
antoine-tran changed the base branch from main to tuan/support_explicit_init_fn on Dec 20, 2024 at 12:59
antoine-tran changed the title from "Add attentive layer to jepa" to "Add attentive layer to Jepa" on Dec 20, 2024
Tuan Tran added 2 commits on Dec 20, 2024 at 13:46
cbalioglu (Contributor)

I am out for a couple of errands; I will review before EOD.
