
instance norm and reflection padding #7938

Merged: 24 commits from zhanghang1989:patch-1 into apache:master on Feb 3, 2018

Conversation

zhanghang1989 (Contributor):

Re-create the PR for instance norm and reflection padding.

@szha (Member) commented Sep 18, 2017:

Could you fix lint? The errors can be obtained by running make pylint

Output shape:
Same shape as input.

This implementation is based on paper:
Contributor:

fix format. see dropout
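For reference, Gluon blocks such as Dropout document shapes in an Inputs/Outputs section rather than a bare "Output shape" line. A sketch of the convention the reviewer is pointing to (the wording is illustrative, not the merged docstring):

    from mxnet.gluon import HybridBlock

    class InstanceNorm(HybridBlock):
        r"""Applies instance normalization to the n-dimensional input array,
        as described in "Instance Normalization: The Missing Ingredient for
        Fast Stylization" <https://arxiv.org/abs/1607.08022>.

        Inputs:
            - **data**: input tensor with arbitrary shape.

        Outputs:
            - **out**: output tensor with the same shape as `data`.
        """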


Parameters
----------
pad_width int: the size of the padding. If is int, uses the same
Contributor:

fix format

Contributor:

if not int what else can it be?

>>> input = mx.nd.random_normal(shape=(16, 3, 224, 224))
>>> output = m(input)
"""
def __init__(self, pad_width=0, **kwargs):
Contributor:

pad_width -> padding

pad_width int: the size of the padding. If is int, uses the same
padding in all boundaries.

Shape:
Contributor:

fix format

Examples
--------
>>> m = nn.ReflectionPad(3)
>>> input = mx.nd.random_normal(shape=(16, 3, 224, 224))
Contributor:

mx.nd.random.normal


Examples
--------
>>> m = nn.ReflectionPad(3)
piiswrong (Contributor) commented Sep 20, 2017:

m -> layer

ReflectionPad -> ReflectionPad2D
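Folding in these renames along with the earlier mx.nd.random.normal fix, the docstring example would read roughly as follows (a sketch; the 230 assumes 3 pixels of reflection padding on each border of a 224x224 input):

    >>> layer = nn.ReflectionPad2D(3)
    >>> x = mx.nd.random.normal(shape=(16, 3, 224, 224))
    >>> output = layer(x)
    >>> output.shape
    (16, 3, 230, 230)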

        self._pad_width = pad_width

    def forward(self, x):
        return F.pad(x, mode='reflect', pad_width=self._pad_width)
Contributor:

shouldn't pad_width be a tuple?
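It does need to be a tuple by the time it reaches the operator: mx.nd.pad takes pad_width as 2*N values, one (before, after) pair per axis, and 'reflect' mode only supports 4-D/5-D input with zero padding on the batch and channel axes. A quick check:

    import mxnet as mx

    x = mx.nd.random.normal(shape=(16, 3, 224, 224))
    # (before, after) pairs for the N, C, H, W axes; the first two axes
    # must stay zero-padded when mode='reflect'.
    y = mx.nd.pad(x, mode='reflect', pad_width=(0, 0, 0, 0, 3, 3, 3, 3))
    print(y.shape)  # (16, 3, 230, 230)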

@piiswrong (Contributor):

Please add tests.

@piiswrong (Contributor):

@szha The original author seems to have disappeared. Would you like to take over?

@szha (Member) commented Dec 12, 2017:

Actually @zhanghang1989 will join us soon 😄

@zhanghang1989 (Contributor, Author):

@piiswrong @szha Hi guys, sorry for the delay. I just made some changes.

@piiswrong (Contributor):

See the other ops for the doc format.

Also, please add test cases.

@zhanghang1989 (Contributor, Author):

Thanks @piiswrong! I made some changes to the doc.

@eric-haibin-lin (Member) left a comment:

Is there no unit test?

@zhanghang1989 (Contributor, Author):

I will add a unit test soon in unittest/test_gluon.py. Thanks!


@szha (Member) commented Jan 31, 2018:

please fix lint

        self._padding = padding

    def hybrid_forward(self, F, x):
        return F.pad(x, mode='reflect', padding=self._padding)
szha (Member):

This isn't tested yet. Also, it looks like a very straightforward one-liner. Should we just ask users to use a hybrid lambda for this?
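The one-liner szha has in mind could look like this sketch, assuming gluon.nn.HybridLambda, which wraps a function of (F, x):

    from mxnet.gluon import nn

    # Reflection-pad H and W by 3 on each side without a dedicated block.
    pad = nn.HybridLambda(
        lambda F, x: F.pad(x, mode='reflect',
                           pad_width=(0, 0, 0, 0, 3, 3, 3, 3)))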

zhanghang1989 (Author):

This simple API makes things easier for users. Reflection padding is widely used in low-level vision, such as GANs, super-resolution, and style transfer, so I think it makes sense to provide an API for it. What are your thoughts?

http://pytorch.org/docs/0.3.0/nn.html?highlight=padding#reflectionpad2d

Contributor:

I think we should have this.

Member:

OK


Parameters
----------
`padding` is a tuple of integer padding widths for each axis of the format
Contributor:

Please fix the documentation and add tests

"""
def __init__(self, padding=0, **kwargs):
super(ReflectionPad2D, self).__init__(**kwargs)
self._padding = padding
Contributor:

Are you sure this works correctly when padding is a number instead of a tuple?

zhanghang1989 (Author):

Just updated the integer type check and fixed the docs.
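The check presumably ended up roughly like this sketch (mxnet.base.numeric_types is Gluon's usual scalar test; other names as in the diff):

    from mxnet.base import numeric_types

    def __init__(self, padding=0, **kwargs):
        super(ReflectionPad2D, self).__init__(**kwargs)
        if isinstance(padding, numeric_types):
            # Expand a scalar to (before, after) pairs for all four axes;
            # the batch and channel axes get zero padding.
            padding = (0, 0, 0, 0, padding, padding, padding, padding)
        assert len(padding) == 8  # two values per axis of a 4-D input
        self._padding = padding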

@szha (Member) left a comment:

Add the classes to docs/api/python/gluon/nn.md so that the doc is available.

        super(InstanceNorm, self).__init__(**kwargs)
        self._kwargs = {'eps': epsilon}
        if in_channels != 0:
            self.in_channels = in_channels
Member:

This doesn't seem needed in forward. The reference in __repr__ can be changed to use the shape from weight instead.

    def __repr__(self):
        s = '{name}({content}'
        if hasattr(self, 'in_channels'):
            s += ', in_channels={0}'.format(self.in_channels)
Member:

load size from weight instead.
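Reading the size from the weight instead would look roughly like this sketch (gamma is InstanceNorm's scale parameter, so its first dimension is the channel count):

    def __repr__(self):
        s = '{name}({content}, in_channels={in_channels})'
        return s.format(name=self.__class__.__name__,
                        content=', '.join('='.join([k, v.__repr__()])
                                          for k, v in self._kwargs.items()),
                        in_channels=self.gamma.shape[0])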

    padding: int or a tuple of int
        An integer padding width for height and width, or a tuple of
        integer padding widths for each axis of the format ``(before_1,
        after_1, ..., before_N, after_N)``. The `padding` should be of
        length ``2*N`` where ``N`` is the number of dimensions.
Member:

The document describes something general, but the class is specific to 2D. Maybe describe the argument specific to the 2D case here.

zhanghang1989 (Author):

If a tuple is passed in, it works for different dimensions. Should we remove this?

Member:

Since the class name says 2D, it should support 2D.

You can do something similar to the conv blocks, with a common abstract class for the forward logic, and the actual 1D/2D/3D classes for specific implementation.
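By analogy with the _Conv base class behind Conv1D/2D/3D, the refactor could look like this sketch (class names illustrative; the author deferred it below):

    from mxnet.gluon import HybridBlock

    class _ReflectionPad(HybridBlock):
        """Shared forward logic; subclasses fix the dimensionality."""
        def __init__(self, padding, **kwargs):
            super(_ReflectionPad, self).__init__(**kwargs)
            self._padding = padding

        def hybrid_forward(self, F, x):
            return F.pad(x, mode='reflect', pad_width=self._padding)

    class ReflectionPad2D(_ReflectionPad):
        def __init__(self, padding=0, **kwargs):
            if isinstance(padding, int):
                # A scalar means symmetric padding on H and W only.
                padding = (0, 0, 0, 0, padding, padding, padding, padding)
            super(ReflectionPad2D, self).__init__(padding, **kwargs)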

zhanghang1989 (Author):

Agree. Just made the changes. Will refactor when adding 1D/3D

@@ -555,8 +553,9 @@ def hybrid_forward(self, F, x, gamma, beta):

    def __repr__(self):
        s = '{name}({content}'
        in_channels = self.gamma.shape[0]
        if hasattr(self, 'in_channels'):
Member:

remove condition

@szha szha merged commit d452162 into apache:master Feb 3, 2018
@zhanghang1989 zhanghang1989 deleted the patch-1 branch February 6, 2018 23:28
@piiswrong (Contributor):

@zhanghang1989 @szha
The instance norm operator needs an axis argument to be consistent with BatchNorm.

Also, ReflectionPad2D's input/output shape doc has the wrong format.
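For comparison, BatchNorm exposes an axis argument (default 1, the channel axis). Hypothetical usage once InstanceNorm grows the same argument (a sketch of the suggestion, not the code merged in this PR):

    from mxnet.gluon import nn

    # `axis` would select the channel axis, matching BatchNorm's signature.
    norm = nn.InstanceNorm(axis=1, epsilon=1e-5, center=True, scale=True)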

rahul003 pushed a commit to rahul003/mxnet that referenced this pull request Jun 4, 2018
* instance norm and reflection padding

* r prefix

* indent and space

* fix docs

* change docs

* spacing

* typo

* hybrid forward

* spcaing

* add test for instance norm

* fix typo

* add to __all__

* rm white space

* integer value

* add test

* make line short

* rm white space

* add docs ref

* fix docs

* RFpad2D docs

* read shape from weight

* rm condition
zheng-da pushed a commit to zheng-da/incubator-mxnet that referenced this pull request Jun 28, 2018