
DETR VAE uses output of first layer of transformer decoder? #10

Open

KamilDre opened this issue Dec 19, 2023 · 3 comments

Comments

@KamilDre

KamilDre commented Dec 19, 2023

In

hs = self.transformer(src, None, self.query_embed.weight, pos, latent_input, proprio_input, self.additional_pos_embed.weight)[0]

and

hs = self.transformer(transformer_input, None, self.query_embed.weight, self.pos.weight)[0]

you index the output of the transformer with [0]. Doesn't this take the output of the first layer of the transformer decoder rather than the last layer? Is this behaviour expected?
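
For context, here is a minimal sketch (not this repo's code) of why that index matters, assuming the transformer stacks every decoder layer's output along dimension 0, as DETR-style decoders do when built with return_intermediate=True (all sizes below are arbitrary examples):

```python
import torch
import torch.nn as nn

num_layers, batch, num_queries, hidden = 6, 2, 100, 256  # arbitrary example sizes

decoder_layer = nn.TransformerDecoderLayer(d_model=hidden, nhead=8, batch_first=True)
decoder = nn.TransformerDecoder(decoder_layer, num_layers=num_layers)

tgt = torch.zeros(batch, num_queries, hidden)   # query embeddings
memory = torch.randn(batch, 50, hidden)         # stand-in for the encoder output

# Emulate DETR's return_intermediate=True: collect the output of every layer.
intermediates = []
out = tgt
for layer in decoder.layers:
    out = layer(out, memory)
    intermediates.append(out)
hs = torch.stack(intermediates)  # shape: [num_layers, batch, num_queries, hidden]

first = hs[0]    # what indexing with [0] selects: the first layer's output
last = hs[-1]    # the last decoder layer's output
print(hs.shape)  # torch.Size([6, 2, 100, 256])
```

Under that assumption, hs[0] is the first decoder layer's prediction; [0] would only be the last layer's output if the decoder ran a single layer, or if return_intermediate were off (DETR's non-intermediate path returns the final output with a leading axis of size 1).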

@CarlDegio

Agreed. I removed the subsequent decoder layers and it had no effect on training.

@uuu686

uuu686 commented Apr 8, 2024

@CarlDegio Hello, I changed the 0 to -1 so that the output of the last layer is used, but there was no obvious difference in performance. May I ask whether your experiment showed an effect?

@CarlDegio

CarlDegio commented Apr 8, 2024

I did not test the effect of a multi-layer decoder. I just removed the decoder forward passes whose outputs were unused in the original code, to speed up training. @uuu686
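
A hedged sketch of the speed-up described above, assuming a DETR-style decoder that exposes its layers as an nn.ModuleList (the attribute path model.transformer.decoder is an assumption about this repo's layout, not confirmed):

```python
import torch.nn as nn

def keep_first_decoder_layer(decoder) -> None:
    """Drop every decoder layer after the first.

    Only hs[0] is consumed downstream, so the later layers' forward passes
    are wasted compute. Assumes the decoder stores its layers in a `layers`
    nn.ModuleList and tracks a `num_layers` attribute; adjust the names to
    the actual model.
    """
    decoder.layers = nn.ModuleList([decoder.layers[0]])
    decoder.num_layers = 1

# Hypothetical usage; the attribute path is an assumption:
# keep_first_decoder_layer(model.transformer.decoder)
```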
