Recently I have been trying to use TensorLayer and native TensorFlow together, but I ran into a problem. In my design I want to apply batch normalization between the dense layer and the activation layer; moreover, this placement is recommended to reduce the variance.

Adding a BatchNormLayer after a DenseLayer is not a problem in itself, but to reach this goal I have to call a native TensorFlow function, which returns a plain tensor object. If I then want to connect that tensor back to the rest of the network, a redundant InputLayer has to be added. The idea is from here.

Is there any other method to use TensorLayer more directly? Does TensorLayer provide another way to be more flexible, or is adding such an InputLayer the only solution?
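For illustration, here is a minimal sketch of the workaround described above. It is only an assumption of what the pattern looks like (layer names, sizes, and the use of `tf.layers.batch_normalization` are my own placeholders, based on the TensorLayer 1.x / TensorFlow 1.x APIs): the network drops down to the raw tensor via `.outputs`, applies the native batch-norm op, and then re-enters TensorLayer through an extra `InputLayer`.

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
is_train = True

net = tl.layers.InputLayer(x, name='input')
# dense layer without a nonlinearity, since BN should come before the activation
net = tl.layers.DenseLayer(net, n_units=256, act=tf.identity, name='dense1')

# drop down to the plain tensor and call a native TensorFlow batch-norm op
h = tf.layers.batch_normalization(net.outputs, training=is_train, name='bn1')
h = tf.nn.relu(h)

# to keep building with TensorLayer, the tensor has to be wrapped in a
# redundant InputLayer again before the next TensorLayer layer
net = tl.layers.InputLayer(h, name='bn1_wrap')
net = tl.layers.DenseLayer(net, n_units=10, act=tf.identity, name='output')
```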
@zsdonghao I'm sorry, I missed the activation arg in BatchNormLayer... my mistake. Thanks for your model script! And the trick link is a really good demonstration (or tutorial).
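For anyone who lands here later, the pattern that makes the extra InputLayer unnecessary looks roughly like this (a sketch under the same TensorLayer 1.x assumptions as above; names and sizes are placeholders): keep the DenseLayer linear and pass the nonlinearity to BatchNormLayer's `act` argument, so dense → batch norm → activation all stay inside TensorLayer.

```python
import tensorflow as tf
import tensorlayer as tl

x = tf.placeholder(tf.float32, shape=[None, 784], name='x')

net = tl.layers.InputLayer(x, name='input')
# keep the dense layer linear ...
net = tl.layers.DenseLayer(net, n_units=256, act=tf.identity, name='dense1')
# ... and let BatchNormLayer apply the activation after normalization
net = tl.layers.BatchNormLayer(net, act=tf.nn.relu, is_train=True, name='bn1')
net = tl.layers.DenseLayer(net, n_units=10, act=tf.identity, name='output')
```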