When running an NLP TF model, we get the following error:
```
01/13/23 02:46:41.752060: E neuropod/backends/tensorflow/tf_utils.cc:73] [thread 310, process 149] TensorFlow error: {{function_node __inference_signature_wrapper_60293}} {{function_node __inference_signature_wrapper_60293}} {{function_node __inference__wrapped_model_54675}} {{function_node __inference__wrapped_model_54675}} {{function_node __inference_restored_function_body_53812}} {{function_node __inference_restored_function_body_53812}} {{function_node __inference_model_layer_call_fn_2449}} {{function_node __inference_model_layer_call_fn_2449}} Op type not registered 'CaseFoldUTF8' in binary running on phx4-tqp. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) tf.contrib.resampler should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
```
This happens even though the corresponding libraries are in LD_LIBRARY_PATH.
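For context, here is a minimal sketch of how this kind of op would normally be registered in a Python process, assuming the `CaseFoldUTF8` op comes from `tensorflow_text` (the shared-library path below is hypothetical). Having the library in LD_LIBRARY_PATH only makes it visible to the dynamic linker; the op is registered with the TensorFlow runtime only once the library is actually loaded into the process:

```python
import tensorflow as tf

# Option 1: importing tensorflow_text registers its custom ops
# (including CaseFoldUTF8) with the TF runtime as a side effect.
import tensorflow_text  # noqa: F401

# Option 2: load the op's shared object explicitly before the graph
# is imported. The path below is hypothetical.
tf.load_op_library("/path/to/libtensorflow_text_ops.so")
```

The same idea would presumably apply to the C++ Neuropod backend: the custom-op library needs to be loaded into the worker process before the graph is imported, not just be discoverable via LD_LIBRARY_PATH.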