Add out layer activation to net module #300

Merged
merged 5 commits into master from net-out-activ on Apr 17, 2019

Conversation

kengz (Owner) commented Apr 17, 2019

Add out layer activation to net module

  • add an out layer activation option to the net modules: MLP, Conv, recurrent
  • use the net spec key out_layer_activation to specify it; defaults to null
  • if specified, out_layer_activation must have the same shape as out_dim, e.g. tanh for a single scalar out_dim, or [tanh, None, ...] for a list out_dim (see the sketch after this list)
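
For reference, here is a minimal sketch of how out_layer_activation could be set in a net spec; the surrounding keys (type, hid_layers, hid_layers_activation) are illustrative assumptions rather than part of this PR.

```python
# Single scalar out_dim: one activation name, or null/None to keep a linear output layer.
mlp_net_spec = {
    "type": "MLPNet",
    "hid_layers": [64, 64],
    "hid_layers_activation": "relu",
    "out_layer_activation": "tanh",
}

# List out_dim (multi-tail net): out_layer_activation must match its shape,
# e.g. tanh on the first tail and a plain linear second tail.
multitail_net_spec = {
    "type": "MLPNet",
    "hid_layers": [64, 64],
    "hid_layers_activation": "relu",
    "out_layer_activation": ["tanh", None],
}
```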

Misc

  • improve util.debug_image, and make the wrapper image preprocessing use transpose (see the sketch below)
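
As a rough sketch of the transpose-based preprocessing this refers to, assuming the usual HWC-to-CHW conversion expected by PyTorch conv layers (the helper name to_chw is hypothetical):

```python
import numpy as np

def to_chw(image):
    """Transpose an HWC image array to CHW for PyTorch conv layers."""
    return np.transpose(image, (2, 0, 1))

frame = np.zeros((84, 84, 3), dtype=np.uint8)  # e.g. an HWC frame from an env wrapper
assert to_chw(frame).shape == (3, 84, 84)
```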

kengz merged commit f8567e3 into master on Apr 17, 2019
kengz deleted the net-out-activ branch on April 17, 2019 at 16:20