Adds the possibility to specify batch_norm, bias, and batch_norm_scale for each layer #217
Conversation
[fix] a bug that would add redundant scale layers
…st_layer_batchnorm argument
Codecov Report

@@           Coverage Diff            @@
##            main     #217     +/-  ##
========================================
+ Coverage    7.35%    7.38%   +0.02%
========================================
  Files          58       58
  Lines        5939     5947       +8
  Branches     1005     1006       +1
========================================
+ Hits          437      439       +2
- Misses       5473     5478       +5
- Partials       29       30       +1
Thanks a lot Max! I've built that branch with the sensorium model and tried it out; it worked as intended.
The branch is out of date with the base branch though. Could you update it? Then I'll go ahead and merge.
Thank you @KonstantinWilleke! Merged the latest changes from main, so this PR should now be ready for merging into main.
To specify batch norm and its properties for each layer individually, this PR changes the attributes
batch_norm
bias
batch_norm_scale
and the corresponding class members to lists. Backwards compatibility is ensured (see the hedged sketch below).
Also includes #215, which should be merged first.
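As a rough illustration of the intended interface, the sketch below shows how bool-or-list arguments could be normalized to per-layer lists while keeping old single-bool calls working. This is not the library's actual implementation; the class and helper names (ConvCore, expand_to_layers) and the way batch_norm_scale is applied here are hypothetical simplifications.

```python
# Minimal sketch (not the actual library code) of per-layer batch_norm / bias /
# batch_norm_scale settings with backwards compatibility for single bools.
from typing import Sequence, Union

import torch.nn as nn

BoolOrList = Union[bool, Sequence[bool]]


def expand_to_layers(value: BoolOrList, num_layers: int) -> list:
    """Expand a single bool to a per-layer list; validate an explicit list."""
    if isinstance(value, bool):
        return [value] * num_layers
    value = list(value)
    if len(value) != num_layers:
        raise ValueError(f"expected {num_layers} entries, got {len(value)}")
    return value


class ConvCore(nn.Module):
    """Toy stacked conv core with per-layer batch norm / bias / scale flags."""

    def __init__(
        self,
        channels: Sequence[int],
        batch_norm: BoolOrList = True,
        bias: BoolOrList = True,
        batch_norm_scale: BoolOrList = True,
    ):
        super().__init__()
        num_layers = len(channels) - 1
        # Backwards compatible: plain bools behave exactly as before.
        batch_norm = expand_to_layers(batch_norm, num_layers)
        bias = expand_to_layers(bias, num_layers)
        batch_norm_scale = expand_to_layers(batch_norm_scale, num_layers)

        layers = []
        for l in range(num_layers):
            layers.append(
                nn.Conv2d(channels[l], channels[l + 1], kernel_size=3, bias=bias[l])
            )
            if batch_norm[l]:
                # Simplification: batch_norm_scale[l] toggles the learnable affine
                # transform of the norm; the real code may use separate scale/bias
                # layers instead.
                layers.append(
                    nn.BatchNorm2d(channels[l + 1], affine=batch_norm_scale[l])
                )
            layers.append(nn.ReLU(inplace=True))
        self.features = nn.Sequential(*layers)

    def forward(self, x):
        return self.features(x)


# Old-style call (single bools) and new per-layer call both work:
core_old = ConvCore([1, 32, 32], batch_norm=True, bias=False)
core_new = ConvCore([1, 32, 32], batch_norm=[True, False], bias=[False, True])
```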