`Parallel` layer doesn't need to be tied to array input #1673

Flux.jl/src/layers/basic.jl, line 419 at commit 9931730:

This line is needlessly restrictive, and breaks a use case involving a custom type in my ChemistryFeaturization and AtomicGraphNets packages, where the input to an `AGNConv` layer can be a `FeaturizedAtoms` type. I'm not sure if the solution is just to change `AbstractArray` to `Any` or what...
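As a minimal illustration of the problem (the type and function below are hypothetical stand-ins, not the actual Flux or ChemistryFeaturization code):

```julia
# Stand-in for a custom non-array input such as FeaturizedAtoms:
struct FeaturizedAtomsLike
    features::Matrix{Float64}
end

# A forward pass restricted to arrays, the way Parallel's currently is:
restricted_forward(x::AbstractArray) = sum(x)

restricted_forward(rand(3))                          # works
restricted_forward(FeaturizedAtomsLike(rand(2, 2)))  # MethodError: no matching method
```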
Comments
Fix would be to generalise the forward passes, and not have types on the input data formats:

```julia
(m::Parallel)(x) = mapreduce(f -> f(x), m.connection, m.layers)
(m::Parallel)(xs...) = mapreduce((f, x) -> f(x), m.connection, m.layers, xs)
```
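For concreteness, here is a self-contained sketch of those untyped forward passes on a hypothetical `MyParallel` container (a stand-in, not Flux's actual `Parallel`):

```julia
# Hypothetical stand-in for Flux's Parallel, with no type bound on inputs,
# so custom types (e.g. FeaturizedAtoms) flow straight through to each branch.
struct MyParallel{F,T<:Tuple}
    connection::F
    layers::T
end
MyParallel(connection, layers...) = MyParallel(connection, layers)

# One input, passed to every branch:
(m::MyParallel)(x) = mapreduce(f -> f(x), m.connection, m.layers)
# One input per branch ("zip" behavior):
(m::MyParallel)(xs...) = mapreduce((f, x) -> f(x), m.connection, m.layers, xs)

m = MyParallel(+, x -> 2x, x -> x + 1)
m(3)     # 2*3 + (3 + 1) == 10
m(3, 4)  # 2*3 + (4 + 1) == 11
```

With the bound removed, `m` accepts any input its branches can handle, not just arrays.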
@darsnack do you remember why the bound was put in place in the initial PR?
No idea why... seems like a mistake. We should definitely drop them.
If it passes the tests, then it's fine.
It passes. @rkurchin, what in your opinion is the most sensible default for tuple inputs? Should the elements of the tuple be distributed across the branches of the `Parallel` layer?
One thing to note is the current behavior for

```julia
(m::MyLayer)(x) = f(x), g(x)
```

Then that naturally reads as

```julia
(m::MyLayer2)(x) = tuple((f(x), g(x)))
```

This very explicitly says that the output is a tuple, as opposed to

```julia
(m::MyLayer3)(x) = [f(x), g(x)]
```

Unless we change the behavior of …

Didn't mean to derail: @rkurchin, I'd still like to hear your thoughts too. Also, I feel like this multiple-arguments issue should be handled more carefully and separately. We can still merge #1674 now.
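To make the distinction concrete, a quick sketch (assuming the intended call above is `tuple(f(x), g(x))`; as written, `tuple((f(x), g(x)))` would wrap the tuple in another 1-tuple):

```julia
f(x) = 2x
g(x) = x + 1

layer1(x) = f(x), g(x)         # implicit tuple
layer2(x) = tuple(f(x), g(x))  # explicit tuple, identical result
layer3(x) = [f(x), g(x)]       # Vector, a different type entirely

layer1(3) == layer2(3)  # true: both return (6, 4)
layer1(3) == layer3(3)  # false: (6, 4) is not [6, 4]
```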
In my particular use case, what I want is more like the "zip" behavior (argument 1 goes to branch 1, argument 2 goes to branch 2), but I'm not sure if that ought to inform big-picture decisions here because it's just one specific application. I'm inclined to defer to @darsnack etc. on what the defaults ought to be, especially since the thing I'm working on right now doesn't even have the …
Zip-like behavior seems to be the least surprising to me. We should keep with that, and document that mismatched lengths would behave exactly as they do with zip, so users should be careful. Probably not worth a warning in this case.
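For reference, under the mapreduce-based sketch above, mismatched lengths would truncate exactly like `zip` (hypothetical semantics, not current Flux behavior):

```julia
layers = (x -> 2x, x -> x + 1, x -> -x)
xs = (3, 4)  # two inputs for three branches

# A multi-iterator mapreduce zips its iterators, so the third branch
# is silently dropped, just like zip((1, 2, 3), (10, 20)):
mapreduce((f, x) -> f(x), +, layers, xs)  # 2*3 + (4 + 1) == 11
```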