This repo has almost 300 stars but does *NOT* have residual connections #11

Open
MatthewBM opened this issue Mar 3, 2023 · 2 comments

Comments

@MatthewBM

The repo author wrote:

def forward(self, x):
    return self.conv_block(x) + self.conv_skip(x)

This will function like a grouped convolution. A skip connection would be:

def forward(self, x):
    return self.conv_block(x) + x
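
For comparison, here is a minimal, self-contained sketch of the two forward passes being discussed (hypothetical layer choices and channel counts, not the repo's actual module):

import torch
import torch.nn as nn

class TwoBranchAdd(nn.Module):
    # What the quoted forward does: both terms of the sum are learned
    # convolutions of x, so nothing is passed through unchanged.
    def __init__(self, channels):
        super().__init__()
        self.conv_block = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
        )
        self.conv_skip = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x):
        return self.conv_block(x) + self.conv_skip(x)

class IdentitySkip(nn.Module):
    # What the comment proposes: the input itself is added back untouched.
    def __init__(self, channels):
        super().__init__()
        self.conv_block = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.conv_block(x) + x

x = torch.randn(1, 64, 32, 32)
print(TwoBranchAdd(64)(x).shape, IdentitySkip(64)(x).shape)  # both (1, 64, 32, 32)
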
@true-zk

true-zk commented Jun 5, 2024

That's not the same as the original implementation. But the original is still not

def forward(self, x):
    return self.conv_block(x) + x

either: the original author adds downsampling on the shortcut using "strides".
See the original code: https://github.com/DebeshJha/ResUNetPlusPlus/blob/master/m_resunet.py

But I guess adding a conv layer on the shortcut is OK,
though I have not tried which one works better...
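
To make that concrete, here is a rough PyTorch paraphrase of the projection-shortcut pattern described above (my own sketch, assuming a strided 1x1 conv plus batch norm on the shortcut; it is not a copy of the Keras code in m_resunet.py):

import torch.nn as nn

class ProjectionShortcutBlock(nn.Module):
    # When the main branch downsamples with stride > 1 (or changes the
    # channel count), a plain identity cannot be added back because the
    # shapes no longer match, so the shortcut is a strided 1x1 conv + BN.
    def __init__(self, in_ch, out_ch, stride=2):
        super().__init__()
        self.conv_block = nn.Sequential(
            nn.BatchNorm2d(in_ch),
            nn.ReLU(),
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
        )
        self.conv_skip = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 1, stride=stride),
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        return self.conv_block(x) + self.conv_skip(x)
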

@MatthewBM
Author

You cannot add a convolution to x_init and still call it a residual connection.

Here is a code example from a PyTorch implementation of the original ResNet paper:
result = self.model.forward(x) + x

It needs to be x = Add()([x, x_init])
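
For reference, a hypothetical Keras sketch of that identity-skip pattern (my own illustration, assuming the input already has n_filters channels so the shapes match; this is not the repo's code):

from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation, Add

def identity_residual_block(x, n_filters):
    # x_init is added back with no convolution applied to it.
    # Only valid when x already has n_filters channels and this block does
    # not downsample, so x and the branch output have identical shapes.
    x_init = x
    x = Conv2D(n_filters, (3, 3), padding="same")(x)
    x = BatchNormalization()(x)
    x = Activation("relu")(x)
    x = Conv2D(n_filters, (3, 3), padding="same")(x)
    x = BatchNormalization()(x)
    x = Add()([x, x_init])
    return Activation("relu")(x)
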

This repo now has 425 stars and is still wrong...
