Author wrote:
def forward(self, x): return self.conv_block(x) + self.conv_skip(x)
This functions like a grouped convolution, not a skip. A skip connection would be:
def forward(self, x): return self.conv_block(x) + x
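The distinction can be sketched as a minimal residual block in the identity-skip form from the original ResNet paper, y = F(x) + x (this is a hypothetical illustration, not the repo's actual code):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal identity-skip residual block: y = F(x) + x."""
    def __init__(self, channels):
        super().__init__()
        # F(x): two 3x3 convs that preserve spatial size and channel count,
        # so the unmodified input can be added directly.
        self.conv_block = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # The skip adds the input itself, not a convolved copy of it.
        return self.conv_block(x) + x
```

The point is that the second operand of the addition is `x` itself; routing it through another convolution turns the block into two parallel conv paths rather than a residual connection.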
That's not the same as the original implementation either. The author adds downsampling using "strides"; see: https://github.com/DebeshJha/ResUNetPlusPlus/blob/master/m_resunet.py
But I guess adding a conv layer is OK too, though I haven't tested which works better.
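When the block downsamples with strides, a plain identity add no longer matches shapes, so the ResNet paper uses a projection shortcut: y = F(x) + W_s·x, where W_s is a strided 1x1 conv that only matches dimensions. A hypothetical sketch of that variant (assumed names, not the repo's code):

```python
import torch
import torch.nn as nn

class DownsampleResidualBlock(nn.Module):
    """Residual block with a projection shortcut for strided downsampling."""
    def __init__(self, in_ch, out_ch, stride=2):
        super().__init__()
        # F(x): the first conv carries the stride, changing spatial size
        # and channel count.
        self.conv_block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride, padding=1),
            nn.ReLU(),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        )
        # Projection shortcut: a 1x1 conv with the same stride. It exists
        # only to match shapes, not to transform features like a 3x3
        # conv_skip would.
        self.proj = nn.Conv2d(in_ch, out_ch, kernel_size=1, stride=stride)

    def forward(self, x):
        return self.conv_block(x) + self.proj(x)
```

This is the one case where a conv does appear on the skip path, and it is deliberately restricted to a 1x1 kernel so the shortcut stays close to an identity mapping.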
You cannot add a convolution to x_init and still call it a residual connection.
Here is the corresponding line from a PyTorch implementation of the original ResNet paper: result = self.model.forward(x) + x
It needs to be: x = Add()([x, x_init])
This repo now has 425 stars and is still wrong...