Ragged Tensors #18414
Report from the other thread: "We might support [...] However, if you're just looking to have a dynamic data dimension, then you can do it just by: [...]
In general it is possible to handle any workflow using rectangular tensors. RaggedTensors are a convenience but not a blocker."
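The code from the quoted reply is not shown above; below is a minimal sketch of what "a dynamic data dimension" presumably means, assuming the usual approach of declaring that dimension as `None` in the input shape (the layer sizes here are illustrative):

```python
import numpy as np
import keras

# Declaring the second dimension as None makes it dynamic: each batch
# may have a different length along that axis.
inputs = keras.Input(shape=(None, 8))
outputs = keras.layers.Dense(4)(inputs)
model = keras.Model(inputs, outputs)

# Each individual batch must still be rectangular, but different
# batches may have different lengths.
print(model(np.random.rand(2, 5, 8)).shape)   # (2, 5, 4)
print(model(np.random.rand(2, 11, 8)).shape)  # (2, 11, 4)
```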
@fchollet Thank you very much, you can close this issue then. Just one last question: my worry is that back then you were not able to feed a Keras model inputs of shape, let's say, [...]. However, I must check whether keras-core still behaves the same. It was very handy to use a ragged tensor as model input to get a fixed batch dimension but an otherwise flexible shape. But yes, I understand that you can use padded or bucketed data, which is fine I guess, but a little overhead even if decomposed in the first layer.
I also need ragged arrays in Keras, or equivalent functionality to feed awkward data. This is critical not only for NLP but also for graph networks. The key pain point is that the fit method comes with a strong assumption that all the input arrays contain data for all the input examples, partitioned by the first axis. Ragged arrays break this assumption. It's true that only TF technically supports ragged tensors. But Keras could and should support a limited implementation of a cross-backend composite array type that stores a ragged (batch, ragged_dim, ...) array as a row_lengths array (batch, ragged_shape) and a values array (value_id, ...). Using this encoding, the GNN libraries (see DGL, jraph, tensorflow_gnn) have been able to reach a workable situation. Without such a composite array type implemented directly at the Keras level, it would be very difficult to get this to work because of the strong shape assumptions Keras makes. Now, perhaps there is already a way to make Keras-level composite arrays? If so, maybe there is a workaround?
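A minimal NumPy sketch of the (row_lengths, values) encoding described above, assuming a single ragged dimension (the names are illustrative, not an existing Keras API):

```python
import numpy as np

# A ragged batch of three rows with different lengths.
ragged = [np.array([1.0, 2.0, 3.0]), np.array([4.0]), np.array([5.0, 6.0])]

# Encode as two rectangular arrays: per-row lengths and flattened values.
row_lengths = np.array([len(r) for r in ragged])  # shape (batch,)
values = np.concatenate(ragged)                   # shape (num_values,)

# The original rows can be recovered by splitting `values` at the
# cumulative row lengths.
rows = np.split(values, np.cumsum(row_lengths)[:-1])
assert all(np.array_equal(a, b) for a, b in zip(rows, ragged))
```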
@swamidass
For JAX, it is easiest because there are no constraints. For performance reasons, you do want to pad to consistent sizes, and Jraph has a simple function to accomplish this. Nonetheless, the input tensors have different leading-dimension sizes regardless. @PatReis, how are you managing loading batches into Keras fit? Or are you just avoiding Keras fit and writing your own training function?
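This is not Jraph's actual API, just a hedged sketch of the padding idea in NumPy (the `pad_batch` helper is hypothetical): pad every example to a fixed size so the batch is rectangular, and keep a boolean mask so downstream code can ignore the padding.

```python
import numpy as np

def pad_batch(arrays, target_len):
    # Zero-pad each example along the first axis up to target_len.
    batch = np.zeros((len(arrays), target_len) + arrays[0].shape[1:],
                     dtype=arrays[0].dtype)
    mask = np.zeros((len(arrays), target_len), dtype=bool)
    for i, a in enumerate(arrays):
        batch[i, :len(a)] = a
        mask[i, :len(a)] = True
    return batch, mask

examples = [np.ones((3, 2)), np.ones((5, 2)), np.ones((1, 2))]
padded, mask = pad_batch(examples, target_len=5)  # padded.shape == (3, 5, 2)
```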
@swamidass https://github.com/aimat-lab/gcnn_keras/blob/master/docs/source/models.ipynb You cannot really use ragged tensors because of ops.convert_to_tensor(), but you can disassemble ragged tensors in the first layer. That, I think, works.
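For reference, a minimal sketch of that disassembly using TensorFlow's public RaggedTensor API (the exact placement inside a first layer in gcnn_keras may differ):

```python
import tensorflow as tf

# Split a tf.RaggedTensor into rectangular components so the rest of
# the network only sees regular tensors.
rt = tf.ragged.constant([[1.0, 2.0, 3.0], [4.0], [5.0, 6.0]])

values = rt.flat_values          # shape (num_values,)
row_lengths = rt.row_lengths()   # shape (batch,)

# Alternatively, densify with zero padding plus a mask.
padded = rt.to_tensor()               # shape (batch, max_len)
mask = tf.sequence_mask(row_lengths)  # shape (batch, max_len)
```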
+1 to restore ragged tensors support |
What is the current status of keras and ragged tensors? |
I would also like to know, especially for ragged tensor support in Conv layers! |
Hello,
thanks for the wonderful work on Keras-Core.
I saw the issues #18420 and #18467 and wanted to ask about the ideas/roadmap for supporting ragged tensors.
Will there be a possibility to pass a `KerasRaggedTensor` to layers and models, of course without support for ragged operations across backends? To become compatible with `tf.keras`? Just to feed a ragged-shaped tensor to a model or layers without having to resort to padding... I know that this is probably a lot of work, so I just wanted to know in order to plan ahead for myself.
Best regards