Simplify trainable, functor and Parallel #1862

Conversation
The expected use case is `sum(regularize(l) for l in modules(model))`. So I think, for a breaking change, we can put a note in the docs/NEWS and update the tests, but this will not affect the expected use case.
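For context, a minimal sketch of that use case; the `regularize` function here is hypothetical:

```julia
using Flux

model = Chain(Dense(4, 2), Dense(2, 1))

# Hypothetical regularizer: an L2 penalty on Dense weights, zero elsewhere.
regularize(l::Dense) = sum(abs2, l.weight)
regularize(l) = 0f0

penalty = sum(regularize(l) for l in Flux.modules(model))
```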
Ok. Possibly it should filter out all types owned by Base? Or perhaps that's just more confusing, and staying closer to the literal structure is better.
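A rough sketch of that filtering idea, assuming `parentmodule` is an acceptable way to detect Base-owned types:

```julia
using Flux

model = Chain(Dense(4, 2), Dense(2, 1))

# Keep only entries whose type is defined outside Base/Core, e.g. dropping
# the Tuple of layers that Chain now exposes to fcollect:
user_modules = filter(m -> parentmodule(typeof(m)) ∉ (Base, Core), Flux.modules(model))
```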
@testset "Utils" begin | ||
include("utils.jl") | ||
end | ||
@testset verbose=true "Flux.jl" begin |
This PR also adds an overall testset, so that all tests are run even if one fails near the start.
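A sketch of the resulting structure in `test/runtests.jl`:

```julia
using Test

@testset verbose=true "Flux.jl" begin
    @testset "Utils" begin
        include("utils.jl")
    end
    # ...each remaining test file in its own @testset...
end
```

With the outer testset, a failure in `utils.jl` is recorded but the later testsets still run.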
I think so. And ultimately, only the last three elements above will matter for the expected use case.
Codecov Report

```
@@            Coverage Diff             @@
##           master    #1862      +/-   ##
==========================================
+ Coverage   73.85%   73.94%   +0.09%
==========================================
  Files          28       28
  Lines        1683     1689       +6
==========================================
+ Hits         1243     1249       +6
  Misses        440      440
```

Continue to review the full report at Codecov.
This does a few things to do with Functors / Optimisers:

- Changes `trainable` to return a NamedTuple, which is what Optimisers now wants (see the sketch after this list).
- Removes `trainable` on `Parallel`: since all fields are trainable, the layers are just one wrapper deeper.
- Removes the custom functor for `Chain`, so that it can simply be `@functor Chain`. Fixes "`Chain` forgets names under `fmap`" #1857.
- Same for `Maxout`.
- Changes the `show` code not to use `trainable`. This is most of why all these changes are in one PR.
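A minimal sketch of what the `trainable` change means for a custom layer; `MyLayer` and its fields are hypothetical:

```julia
using Flux  # Flux re-exports @functor from Functors.jl

struct MyLayer
    weight
    bias
    σ        # activation function, not trained
end
@functor MyLayer

# After this PR, trainable returns a NamedTuple naming a subset of the
# fields, rather than a Tuple, which is the form Optimisers.jl expects:
Flux.trainable(m::MyLayer) = (weight = m.weight, bias = m.bias)
```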
The downside of no longer hiding the Tuple inside `Chain` from Functors.jl is that `fcollect`, and hence `Flux.modules`, will see it. ~~This is why the tests fail. I'm not too sure what this is for, and whether this matters.~~ Tests updated to allow this.
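Roughly what that looks like (illustrative, not verbatim REPL output):

```julia
using Flux

ms = Flux.modules(Chain(Dense(2, 3), Dense(3, 2)))
# Before: [Chain(...), Dense(2, 3), Dense(3, 2)]
# After:  [Chain(...), (Dense(2, 3), Dense(3, 2)), Dense(2, 3), Dense(3, 2)]
# i.e. the Tuple of layers is now walked as a node of its own.
```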
The PR also changes `Parallel` to call its `connection` exactly once, always. And to allow, with N layers, either 1 input or exactly N inputs. It used to zip, but (IMO) allowing N-1 inputs just seems like a bug magnet. These can be separated if anyone feels strongly. Fixes #1685, closes #1698.
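A sketch of the resulting behaviour, assuming two layers with matching input sizes:

```julia
using Flux

m = Parallel(+, Dense(2, 3), Dense(2, 3))
x1, x2 = rand(Float32, 2), rand(Float32, 2)

m(x1)      # 1 input: every layer receives x1, then + combines the outputs
m(x1, x2)  # N inputs for N layers: +(m.layers[1](x1), m.layers[2](x2))
# Any other number of inputs is now an error, instead of silently zipping,
# and the connection (+) is called exactly once in both cases.
```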