
flask to deploy on line #263

Open
mokundong opened this issue Dec 13, 2018 · 5 comments
Comments

@mokundong

When using Flask to deploy a fine-tuned classification model online, every POST request reloads the model, so each response takes about 4 seconds. Is there any way to avoid reloading?

@aron3312

You can refer to #94: import the predict_online function from run_classifier_predict_online.py. It creates the session and initializes the graph once up front, so the model isn't re-initialized every time you send a request.
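The "load once, predict many" pattern aron3312 describes can be sketched framework-agnostically. Here `_load_model` is a hypothetical stand-in for the real TensorFlow graph/session setup done in run_classifier_predict_online.py; it returns a toy keyword classifier so the sketch is runnable on its own.

```python
# Hypothetical sketch: the expensive setup runs once at import time,
# and the request handler calls only the cheap predict step.
load_count = 0  # counts how many times the expensive setup ran

def _load_model():
    global load_count
    load_count += 1
    # ... in the real code: build the graph, create a tf.Session,
    # and restore the fine-tuned checkpoint here ...
    return lambda text: {"label": "positive" if "good" in text else "negative"}

# Initialize ONCE at module import -- NOT inside the request handler.
_model = _load_model()

def predict(text):
    """Cheap per-request call: reuses the already-initialized model."""
    return _model(text)
```

In a Flask app, the view function would just call `predict`; because `_model` is built when the module is imported (i.e. when the server starts), each POST pays only the inference cost, not the ~4 s model setup.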

@mokundong (Author)

> You can refer to #94: import the predict_online function from run_classifier_predict_online.py. It creates the session and initializes the graph once up front, so the model isn't re-initialized every time you send a request.

thanks!

@hanxiao commented Dec 19, 2018

@mokundong You may refer to bert-as-service for serving and deploying BERT models.

@mokundong (Author)

> @mokundong You may refer to bert-as-service for serving and deploying BERT models.

Does it work for classification tasks?

@anasuna commented Feb 28, 2019

> @mokundong You may refer to bert-as-service for serving and deploying BERT models.
>
> Does it work for classification tasks?

It didn't work for me for question answering.
