Activation Functions
Activation functions transform the output features of each layer. The current framework supports `None`, `ReLU`, `Sigmoid`, `Softmax`, and `tanH`; each is implemented as a Julia module. To avoid shadowing the Julia built-in function `tanh`, the name is spelled `tanH`.
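
For reference, these activations compute the standard formulas. The sketch below is a minimal, hypothetical illustration in plain Julia; it does not reflect the framework's actual module definitions, and the function names are illustrative.

```julia
# Hypothetical sketch of what each activation computes elementwise;
# the framework's real modules may differ in naming and structure.
relu(x) = max.(0, x)                  # ReLU: zero out negative entries
sigmoid(x) = 1 ./ (1 .+ exp.(-x))     # Sigmoid: squash values into (0, 1)
tanH(x) = tanh.(x)                    # tanH: wraps Julia's built-in tanh

# Softmax over a vector of scores, written in the numerically stable form
function softmax(x)
    e = exp.(x .- maximum(x))         # subtract the max before exponentiating
    return e ./ sum(e)
end
```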
To avoid numerical instability, please use `Softmax_CEL` whenever you want to combine `Softmax` with the loss function `Cross_Entropy_Loss`.
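
The instability comes from computing `softmax` and the log inside the cross-entropy loss separately: exponentiating large scores overflows, and taking the log of a probability that underflows to zero yields `Inf` or `NaN`. A fused softmax-plus-cross-entropy uses the log-sum-exp trick to sidestep both. The sketch below illustrates the idea in plain Julia; `softmax_cel` is a hypothetical stand-in for the technique, not the framework's actual `Softmax_CEL` implementation.

```julia
# Naive approach: exp.(scores) can overflow, making the loss NaN/Inf.
function cross_entropy_naive(scores, label)
    p = exp.(scores) ./ sum(exp.(scores))
    return -log(p[label])
end

# Fused, stable approach: compute -log(softmax(scores)[label]) directly
# via the log-sum-exp trick, never forming the probabilities themselves.
function softmax_cel(scores, label)   # hypothetical name, mirrors Softmax_CEL
    m = maximum(scores)
    logsumexp = m + log(sum(exp.(scores .- m)))  # stable log(sum(exp(scores)))
    return logsumexp - scores[label]             # equals -log softmax(scores)[label]
end
```

For example, with `scores = [1000.0, 0.0]` and `label = 1`, the naive version returns `NaN` because `exp(1000)` overflows, while the fused version returns the correct loss of essentially `0.0`.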
Star the repo if you like it! :-)