Embedded inference / ONNX? #790

Answered by avital
bhchiang asked this question in General
Dec 29, 2020 · 1 comment · 1 reply
Hi @bryanhpchiang -- I'm not sure whether folks have tried converting JAX models to ONNX, but I can ask. The current best practice for inference is to use jax2tf and the existing infrastructure for tf.SavedModels. (We should probably add a short guide about that to our documentation.)

Answer selected by bhchiang