Training on batches of GraphsTuples? #147
Thanks for your message! There are two options here and hopefully at least one of them would work for you:
Hope this helps!
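For context, packing several graphs into one batched `GraphsTuple` works by concatenating the node/edge features and offsetting the sender/receiver indices by the number of nodes that precede each graph (the graph_nets library provides helpers for this in its utils modules). A stdlib-only sketch of the mechanics, where `GraphsTuple` is a stand-in namedtuple with the library's field names:

```python
from collections import namedtuple

# Stand-in for graph_nets' GraphsTuple (same field names), used here
# only to illustrate the batching mechanics with plain Python lists.
GraphsTuple = namedtuple(
    "GraphsTuple",
    ["nodes", "edges", "receivers", "senders", "globals", "n_node", "n_edge"])

def concat_graphs(graphs):
    """Merge several graphs into one batched GraphsTuple.

    Features are concatenated; sender/receiver indices are shifted by the
    number of nodes that come before each graph in the batch.
    """
    nodes, edges, receivers, senders = [], [], [], []
    globs, n_node, n_edge = [], [], []
    offset = 0
    for g in graphs:
        nodes += g.nodes
        edges += g.edges
        receivers += [r + offset for r in g.receivers]
        senders += [s + offset for s in g.senders]
        globs += g.globals
        n_node += g.n_node
        n_edge += g.n_edge
        offset += sum(g.n_node)
    return GraphsTuple(nodes, edges, receivers, senders, globs, n_node, n_edge)

# A 2-node graph and a 3-node graph, each with one edge.
g1 = GraphsTuple([0.1, 0.2], [1.0], [1], [0], [0.0], [2], [1])
g2 = GraphsTuple([0.3, 0.4, 0.5], [2.0], [2], [1], [1.0], [3], [1])
batched = concat_graphs([g1, g2])
print(batched.senders)  # g2's edge indices are shifted by g1's 2 nodes: [0, 3]
print(batched.n_node)   # per-graph node counts survive: [2, 3]
```

Because `n_node` and `n_edge` are kept per graph, the batched tuple can later be split back into its component graphs.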
This worked. Thanks @alvarosg!
Follow-on issue: I am passing batches to the model during training like so:
As a reminder, each batch is an iterable of `GraphsTuple`s. Any ideas on how to solve?
Could you check the type of the object being passed to the model? My guess is that it is no longer a `GraphsTuple` by the time it reaches the model. A simple fix to get the right type back is to do:
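The snippet above is elided in this thread, but the idea is straightforward: generic container code can silently degrade a namedtuple into a plain tuple, and since field order is preserved, rebuilding the proper type is a one-liner. A stdlib-only sketch (the `GraphsTuple` here is a stand-in with graph_nets' field names):

```python
from collections import namedtuple

# Stand-in with the same field names as graph_nets' GraphsTuple.
GraphsTuple = namedtuple(
    "GraphsTuple",
    ["nodes", "edges", "receivers", "senders", "globals", "n_node", "n_edge"])

graph = GraphsTuple([0.1], [], [], [], [0.0], [1], [0])

# Generic tree-flattening code (e.g. some tf.data / tf.function tracing
# paths) can turn a namedtuple into a plain tuple:
degraded = tuple(graph)
print(type(degraded).__name__)   # 'tuple' -- attribute access like .nodes is gone

# Since the field order is preserved, the fix is to rebuild the type:
restored = GraphsTuple(*degraded)
print(type(restored).__name__)   # 'GraphsTuple'
```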
Your hunch was correct! It was being transformed into a plain tuple.
Is that (the fix above) the preferred way to feed a batch of graph sequences during the update step? Wondering if it isn't, since when run …
Checking a bit more, …
Let's say I want to train an LSTM or transformer on sequences of graphs using Sonnet 2/TF2:
I want to represent the graphs in each sequence as one `GraphsTuple`, which means my batches are essentially an iterable of `GraphsTuple`s, each with a variable number of graphs. This is great until it's time to get the input signature and compile the update step. It's unclear to me how to define the tensor spec for this type of input. Is my best route to subclass `collections.namedtuple()`, similar to how you define a `GraphsTuple`, or can you suggest a more elegant solution?

Thanks
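On the input-signature question: recent versions of graph_nets ship a helper, `utils_tf.specs_from_graphs_tuple`, that builds the `tf.TensorSpec` signature for a single `GraphsTuple`; for a custom container of `GraphsTuple`s, a namedtuple mirroring the fields is indeed a reasonable route. A stdlib-only sketch of that shape (the `TensorSpec` here is a hypothetical stand-in for `tf.TensorSpec`, used only to keep the example runnable without TF):

```python
from collections import namedtuple

# Field names as in graph_nets' GraphsTuple.
GRAPHS_TUPLE_FIELDS = (
    "nodes", "edges", "receivers", "senders", "globals", "n_node", "n_edge")

# A spec container with the same structure as GraphsTuple: one spec per field.
GraphsTupleSpec = namedtuple("GraphsTupleSpec", GRAPHS_TUPLE_FIELDS)

# Hypothetical stand-in for tf.TensorSpec, to keep this sketch stdlib-only.
TensorSpec = namedtuple("TensorSpec", ["shape", "dtype"])

# A None leading dimension lets the compiled update step accept a variable
# number of nodes, edges, and graphs per batch (feature sizes are examples).
spec = GraphsTupleSpec(
    nodes=TensorSpec((None, 16), "float32"),   # variable node count
    edges=TensorSpec((None, 8), "float32"),    # variable edge count
    receivers=TensorSpec((None,), "int32"),
    senders=TensorSpec((None,), "int32"),
    globals=TensorSpec((None, 4), "float32"),
    n_node=TensorSpec((None,), "int32"),       # variable number of graphs
    n_edge=TensorSpec((None,), "int32"),
)
print(spec.nodes.shape)
```

Because the spec container has the same tree structure as the data, TF2's `tf.function(input_signature=...)` tracing can match them field by field.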