
Is there any other method to use TensorFlow and TensorLayer together? #208

Closed
SunnerLi opened this issue Sep 9, 2017 · 2 comments

@SunnerLi

SunnerLi commented Sep 9, 2017

Recently I have been trying to use TensorLayer and TensorFlow together, but I ran into a problem.
In my design, I want to apply batch normalization between the dense layer and the activation,
since this mechanism is recommended to reduce variance.
Adding a BatchNormLayer after a DenseLayer is no problem,
but for the activation I have to call a native TensorFlow function, which returns a plain tensor object.
To connect the rest of the network, a redundant InputLayer then has to be added.

For example:

import tensorlayer as tl
import tensorflow as tf

class Net(object):
    def __init__(self, _placeholder):
        # Assign input
        self.network = tl.layers.InputLayer(_placeholder, name='input1')

        # 1st fc + bn + elu
        self.network = tl.layers.DenseLayer(self.network, name='fc1')
        self.network = tl.layers.BatchNormLayer(self.network, name='bn1')
        self.network = tf.nn.elu(self.network.outputs)

        # Add redundant layer with redundant name
        self.network = tl.layers.InputLayer(self.network, name='input2')

        # 2nd fc + bn + elu
        self.network = tl.layers.DenseLayer(self.network, name='fc2')
        self.network = tl.layers.BatchNormLayer(self.network, name='bn2')
        self.network = tf.nn.elu(self.network.outputs)

ph = tf.placeholder(tf.float32, [None, 1000])
net = Net(ph)

The idea is from here.
Is there any other method to use TensorLayer more directly?
Does TensorLayer provide any other way to be more flexible,
or is adding such an InputLayer the only solution?

@zsdonghao
Member

Hi, you can put the activation function into BatchNormLayer; this is how I do it.
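
For example, your snippet can be rewritten like this (a minimal untested sketch, assuming the TensorLayer 1.x API, where BatchNormLayer accepts an act argument):

import tensorflow as tf
import tensorlayer as tl

class Net(object):
    def __init__(self, _placeholder):
        # Assign input
        self.network = tl.layers.InputLayer(_placeholder, name='input1')

        # 1st fc + bn + elu: pass the activation to BatchNormLayer via act,
        # so the result stays a Layer object and no InputLayer is needed
        self.network = tl.layers.DenseLayer(self.network, n_units=100, name='fc1')
        self.network = tl.layers.BatchNormLayer(self.network, act=tf.nn.elu, name='bn1')

        # 2nd fc + bn + elu, chained directly
        self.network = tl.layers.DenseLayer(self.network, n_units=100, name='fc2')
        self.network = tl.layers.BatchNormLayer(self.network, act=tf.nn.elu, name='bn2')

ph = tf.placeholder(tf.float32, [None, 1000])
net = Net(ph)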

But I am not sure what you mean by "concatenate the other part of the network"; here is a summary of tricks by others:
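
For instance, one such trick (a rough sketch, assuming tl.layers.LambdaLayer, which wraps an arbitrary TensorFlow function as a layer) lets you apply a native TF op without leaving the Layer API:

import tensorflow as tf
import tensorlayer as tl

ph = tf.placeholder(tf.float32, [None, 1000])
net = tl.layers.InputLayer(ph, name='input')
net = tl.layers.DenseLayer(net, n_units=100, name='fc1')
net = tl.layers.BatchNormLayer(net, name='bn1')
# Wrap the native TF activation so the result is still a Layer
net = tl.layers.LambdaLayer(net, fn=tf.nn.elu, name='elu1')
net = tl.layers.DenseLayer(net, n_units=100, name='fc2')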

Hope it can solve your problem.

@SunnerLi
Author

SunnerLi commented Sep 10, 2017

@zsdonghao I'm sorry, I missed the act argument in BatchNormLayer...
It's my fault.
Thanks for your model script!
And the tricks link is a really good demonstration (or tutorial).
