Commit
Use tensorflow leaky_relu op for efficiency (#9044)
* Use tensorflow leaky_relu op for efficiency

  The current implementation of leaky_relu is extremely inefficient. In my specific use case it took as much time as the convolution itself and led to tensorflow being much slower than theano. The old, inefficient implementation was a workaround for tensorflow not having a leaky_relu op, but that op was recently added.

* fix max_value clipping

* fix pep8
f699346
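For context, here is a minimal sketch of how a Keras-style backend relu could dispatch to the fused tf.nn.leaky_relu op in the way the commit message describes. The branching, the casting via tf.cast, and the clipping details below are illustrative assumptions, not necessarily the code that was merged.

import tensorflow as tf


def relu(x, alpha=0., max_value=None):
    """Rectified linear unit with optional leaky slope and saturation.

    Sketch only: when there is a negative slope and no saturation
    threshold, the fused tf.nn.leaky_relu op can be used directly.
    """
    if alpha != 0.:
        if max_value is None:
            # Fast path: a single fused op for the leaky slope.
            return tf.nn.leaky_relu(x, alpha)
        # Keep the negative part so the slope survives the clipping below.
        negative_part = tf.nn.relu(-x)
    x = tf.nn.relu(x)
    if max_value is not None:
        # Saturate the positive side at max_value ("fix max_value clipping").
        max_value = tf.cast(max_value, x.dtype.base_dtype)
        zero = tf.cast(0., x.dtype.base_dtype)
        x = tf.clip_by_value(x, zero, max_value)
    if alpha != 0.:
        x -= alpha * negative_part
    return x

The point of the fast path is that tf.nn.leaky_relu is a single op, whereas the older approach composed the negative slope out of several elementwise ops, which is what the commit message calls extremely inefficient.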
I have the same version with almost the same changes but it does not work!
Would you please help me?
def relu(x, alpha=0., max_value=None):
    """Rectified linear unit.