Conversation

@younesbelkada
Contributor

@younesbelkada younesbelkada commented Dec 7, 2022

What does this PR do?

Adds DPT-hybrid support in transformers.
Currently only plain DPT is supported. This PR leverages AutoBackbone from @NielsRogge to replace DPT's embedding layer, adding support for DPT-hybrid.

Fixes #20435

Model weights: https://huggingface.co/Intel/dpt-hybrid-midas

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Dec 7, 2022

The documentation is not available anymore as the PR was closed or merged.

Contributor

@patrickvonplaten patrickvonplaten left a comment

Looks good to me! Let's please wait though until @sgugger has approved as well :-)

@younesbelkada younesbelkada marked this pull request as ready for review December 7, 2022 12:41
pooler_output=pooled_output,
hidden_states=encoder_outputs.hidden_states,
attentions=encoder_outputs.attentions,
intermediate_activations=embedding_output.intermediate_activations,
Contributor

This is a breaking change, as users who are currently using DPTModel expect 4 keys in the output class => this will now be 5.

Collaborator

Only if those are all not None however.
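The reply above rests on how transformers output classes behave: fields that are `None` are not registered as keys, so the output only gains a fifth key when `intermediate_activations` is actually populated. A simplified, self-contained illustration of that behavior (this `ModelOutput` is a pared-down stand-in for the transformers class, not the real implementation):

```python
from collections import OrderedDict
from dataclasses import dataclass, fields
from typing import Any, Optional

class ModelOutput(OrderedDict):
    # Pared-down stand-in for transformers' ModelOutput:
    # only fields that are not None become dict keys.
    def __post_init__(self):
        for f in fields(self):
            value = getattr(self, f.name)
            if value is not None:
                self[f.name] = value

@dataclass
class BaseModelOutputWithIntermediateActivations(ModelOutput):
    last_hidden_state: Optional[Any] = None
    pooler_output: Optional[Any] = None
    hidden_states: Optional[Any] = None
    attentions: Optional[Any] = None
    intermediate_activations: Optional[Any] = None

# Plain DPT: intermediate_activations stays None -> still 4 keys.
plain = BaseModelOutputWithIntermediateActivations(
    last_hidden_state=1, pooler_output=2, hidden_states=3, attentions=4
)
# DPT-hybrid: the field is set -> 5 keys.
hybrid = BaseModelOutputWithIntermediateActivations(
    last_hidden_state=1, pooler_output=2, hidden_states=3, attentions=4,
    intermediate_activations=5,
)
```

So existing code that unpacks exactly four values keeps working as long as the new field stays `None` for non-hybrid checkpoints.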

logger.info("Initializing the config with a `BiT` backbone.")
self.backbone_config = BitConfig(**backbone_config)
elif isinstance(backbone_config, PretrainedConfig):
self.backbone_config = backbone_config
Contributor

Not sure this model works with other backbones, but sure let's support it :D
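The snippet above normalizes `backbone_config`: a plain dict is promoted to a concrete config class (defaulting to the `BiT` backbone), while an already-instantiated `PretrainedConfig` is stored as-is. A minimal sketch of that pattern — the `PretrainedConfig`, `BitConfig`, and `DPTConfig` classes here are simplified stand-ins for illustration, not the actual transformers implementations:

```python
# Simplified stand-ins for the transformers config classes.
class PretrainedConfig:
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)

class BitConfig(PretrainedConfig):
    model_type = "bit"

class DPTConfig(PretrainedConfig):
    def __init__(self, backbone_config=None, **kwargs):
        super().__init__(**kwargs)
        if isinstance(backbone_config, dict):
            # A plain dict defaults to the `BiT` backbone.
            self.backbone_config = BitConfig(**backbone_config)
        elif isinstance(backbone_config, PretrainedConfig):
            # Any already-instantiated backbone config is stored as-is,
            # which is what opens the door to other backbones.
            self.backbone_config = backbone_config
        else:
            self.backbone_config = None

# Both call styles end up with a config object:
from_dict = DPTConfig(backbone_config={"num_channels": 3})
from_obj = DPTConfig(backbone_config=BitConfig(num_channels=3))
```

Checking `dict` before `PretrainedConfig` matters: a config instance must not be unpacked as keyword arguments, and the dict path is what lets the nested `backbone_config` in a serialized `config.json` round-trip back into a config class.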

Collaborator

@sgugger sgugger left a comment

Not a huge fan of the modularity introduced in the modeling code, but okay in this case since it's all in the same paper.

Left a comment on the config param introduced, other than that it should be good to merge soon!


Collaborator

@sgugger sgugger left a comment

You'll just need to adapt the checkpoint online with the new config arg and then should be good to merge!

@younesbelkada
Contributor Author

Thanks a bunch! Fortunately it was already in the config file :D https://huggingface.co/Intel/dpt-hybrid-midas/blob/main/config.json#L277 but I will open a PR to remove `embedding_type`, as it is no longer needed.
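For context, the checkpoint's config.json carries the new flag alongside the nested backbone config. A fragment of what the relevant keys look like (abridged; only the keys named in this thread are shown, other values elided):

```json
{
  "model_type": "dpt",
  "is_hybrid": true,
  "backbone_config": {
    "model_type": "bit"
  }
}
```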

@younesbelkada
Contributor Author

The config file has been modified, merging!

@younesbelkada younesbelkada merged commit 7c5eaf9 into huggingface:main Dec 7, 2022
mpierrau pushed a commit to mpierrau/transformers that referenced this pull request Dec 15, 2022
* add `dpt-hybrid` support

* refactor

* final changes, all tests pass

* final cleanups

* final changes

* Apply suggestions from code review

Co-authored-by: Patrick von Platen <[email protected]>

* fix docstring

* fix typo

* change `vit_hybrid` to `hybrid`

* replace dataclass

* add docstring

* move dataclasses

* fix test

* add `PretrainedConfig` support for `backbone_config`

* fix docstring

* Apply suggestions from code review

Co-authored-by: Sylvain Gugger <[email protected]>

* remove `embedding_type` and replace it by `is_hybrid`

Co-authored-by: Patrick von Platen <[email protected]>
Co-authored-by: Sylvain Gugger <[email protected]>