Can you please help me productionize the model? Here are the difficulties I have faced:
TF Serving with nginx (recommended): Because TensorRec stores the model in a pickle file, I am not sure how to provide the pickle file path as the model. Can you please help me?
Flask app with nginx: It loses the graph reference after a few calls, behaves randomly, and it seems too hard to match what TF Serving offers.
Please help us with sample code.
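Not an official TensorRec recipe, but one pattern that sidesteps both problems: since the model is pickled rather than exported as a SavedModel, you can serve it from a plain Python process that unpickles it exactly once at startup and keeps a single module-level reference for the lifetime of the process. A minimal sketch below; `DummyModel` is a hypothetical stand-in for your unpickled TensorRec model, and `handle_request` stands in for a Flask view function:

```python
import os
import pickle
import tempfile

class DummyModel:
    """Hypothetical stand-in for a pickled TensorRec model."""
    def predict_ranks(self, user_features, item_features):
        # A real model would score items; here we just return indices.
        return list(range(len(item_features)))

# Simulate the saved pickle file that TensorRec produced.
path = os.path.join(tempfile.mkdtemp(), "model.pkl")
with open(path, "wb") as f:
    pickle.dump(DummyModel(), f)

# Load exactly ONCE at process startup; every request reuses MODEL,
# so the underlying graph/session reference is never re-created.
with open(path, "rb") as f:
    MODEL = pickle.load(f)

def handle_request(user_features, item_features):
    # In a Flask app this would be the body of the view function.
    return MODEL.predict_ranks(user_features, item_features)

print(handle_request([1.0], [[0.1], [0.2], [0.3]]))  # → [0, 1, 2]
```

With a real TensorFlow 1.x model you may additionally need to capture the graph at load time (`graph = tf.get_default_graph()`) and wrap each prediction in `with graph.as_default():`, because Flask's worker threads do not share the default graph; losing it mid-request matches the random behavior described above.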
Hi, thanks for pointing out the issue @jaiswalvineet. I am also having trouble with TensorFlow serving of the model. Any help would be appreciated.
Thanks in advance
Regarding question two, I responded to your problem in #120. I hope this fixes your problem. I am also very interested in running TensorRec in production, especially in performance optimizations. Currently, a solution with Flask and TensorRec using the predict_ranks method gives me a latency of 150-300ms, even on larger compute instances on AWS.
The use case is computing recommendations for a single user over roughly 100 items with 12 item features.
If anybody can recommend some performance tuning tips for model serving, it would be awesome :)
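One common tuning approach (a general recommender-serving pattern, not a TensorRec feature): with a catalog of only ~100 items, you can precompute each user's ranked items offline and serve dictionary lookups online, refreshing the cache on a schedule. A sketch below, with random scores standing in for real model output; all names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_USERS, NUM_ITEMS, TOP_K = 1000, 100, 10

# Offline step: run the (expensive) model once for every known user.
# Random scores here stand in for TensorRec's predicted scores.
scores = rng.random((NUM_USERS, NUM_ITEMS))

# Cache each user's items sorted best-first, truncated to the top K.
cache = {
    user: np.argsort(-scores[user])[:TOP_K].tolist()
    for user in range(NUM_USERS)
}

def recommend(user_id):
    # Online step: an O(1) dict lookup replaces the 150-300ms model call.
    return cache[user_id]

print(len(recommend(42)))  # → 10
```

In production the dict would typically live in Redis or a similar store so multiple Flask/nginx workers share it; the trade-off is recommendation freshness, bounded by how often the offline step reruns.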
I did some load testing of TensorRec serving with TensorFlow 1.12 and TensorFlow < 1.12, and the latter was way faster (up to a factor of 10!) than 1.12. Is anything known about why 1.12 is so slow with TensorRec? @jfkirk also noticed that the tests run much slower with this version...