
Have a way to set LD_PRELOAD for tensorflow_text #559

Open
josephykwang opened this issue Jan 13, 2023 · 2 comments

@josephykwang

Feature

Is your feature request related to a problem? Please describe.

Describe the solution you'd like

Describe alternatives you've considered

Additional context

@josephykwang changed the title from "Have a way to set LD_PREL" to "Have a way to set LD_PRELOAD for tensorflow_text" on Jan 13, 2023
@josephykwang
Author

When running an NLP TF model, we get:

```
01/13/23 02:46:41.752060: E neuropod/backends/tensorflow/tf_utils.cc:73] [thread 310, process 149] TensorFlow error: {{function_node __inference_signature_wrapper_60293}} {{function_node __inference_signature_wrapper_60293}} {{function_node __inference__wrapped_model_54675}} {{function_node __inference__wrapped_model_54675}} {{function_node __inference_restored_function_body_53812}} {{function_node __inference_restored_function_body_53812}} {{function_node __inference_model_layer_call_fn_2449}} {{function_node __inference_model_layer_call_fn_2449}} Op type not registered 'CaseFoldUTF8' in binary running on phx4-tqp. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) tf.contrib.resampler should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
```

even though the corresponding libraries are in LD_LIBRARY_PATH.
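
For context on the workaround the title asks about: tensorflow_text registers ops such as CaseFoldUTF8 from shared libraries shipped inside its Python package, and having those libraries on LD_LIBRARY_PATH only affects where the linker searches; it does not cause them to be loaded into the serving process. Below is a minimal sketch of locating those libraries for an LD_PRELOAD-style workaround; the package layout and the serving command are assumptions, not taken from this issue.

```python
# Hypothetical sketch (not from the issue): locate the shared libraries that
# tensorflow_text ships its custom ops in, so they can be preloaded into the
# serving process. The package layout is an assumption and varies by version.
import glob
import os

import tensorflow_text

pkg_dir = os.path.dirname(tensorflow_text.__file__)
for lib in glob.glob(os.path.join(pkg_dir, "**", "*.so"), recursive=True):
    print(lib)

# One of the printed paths could then be preloaded into the serving binary, e.g.:
#   LD_PRELOAD="<one of the printed .so paths>" <serving binary>
```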

@VivekPanyam
Collaborator

Note: this was explored offline, and passing in custom_ops when creating the model (https://neuropod.ai/docs/master/packagers/tensorflow/#custom_ops) instead of relying on LD_LIBRARY_PATH solved the issue for a test model.
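
For anyone hitting the same "Op type not registered" error, here is a minimal sketch of that custom_ops approach, assuming the frozen-GraphDef form of Neuropod's TensorFlow packager from the linked docs. The paths, node names, and specs are placeholders, and the exact arguments should be checked against the Neuropod docs for your version.

```python
# Hypothetical sketch: bundle the tensorflow_text op libraries with the packaged
# model via custom_ops so ops like CaseFoldUTF8 are registered when the model
# loads, instead of relying on LD_LIBRARY_PATH. Paths and specs are placeholders.
import glob
import os

import tensorflow as tf
import tensorflow_text
from neuropod.packagers import create_tensorflow_neuropod

# Collect the shared libraries shipped inside the tensorflow_text package
# (the package layout is an assumption and varies by version).
pkg_dir = os.path.dirname(tensorflow_text.__file__)
text_op_libs = glob.glob(os.path.join(pkg_dir, "**", "*.so"), recursive=True)

# Load a frozen graph (placeholder path) to package.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("./frozen_text_model.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

create_tensorflow_neuropod(
    neuropod_path="./text_model_neuropod",  # placeholder output path
    model_name="text_model",                # placeholder name
    graph_def=graph_def,
    node_name_mapping={
        "text": "serving/input_text:0",     # placeholder node names
        "scores": "serving/scores:0",
    },
    input_spec=[{"name": "text", "dtype": "string", "shape": (None,)}],
    output_spec=[{"name": "scores", "dtype": "float32", "shape": (None, 2)}],
    custom_ops=text_op_libs,                # the tensorflow_text .so files to bundle
)
```

Because the op libraries are bundled with the packaged model and loaded alongside it, the ops resolve at inference time without any LD_PRELOAD or LD_LIBRARY_PATH setup in the serving environment.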
