Memory leak with paddle2.0.0.beta0 #971
Comments
Could you provide your runtime platform, the script you run, and your configuration?
My machine has 8 cores and 64 GB of RAM, CentOS 7 + Docker. As long as I keep recognizing images continuously, memory keeps rising until it is exhausted and the process exits. Which configuration would you like to see?
Hello, please make sure you are using the Paddle 2.0.0b0 release, and also check that you are on the latest code.
I am also on CentOS 7, using hubserving both inside and outside Docker, with Paddle 2.0.0b0.
For hubserving, this is a known issue. See: PaddlePaddle/PaddleHub#682
This morning I downloaded the latest code from GitHub and redeployed. After testing 60 images, memory usage rose from 2.18 GB to 12.4 GB, and it does not come back down as long as the process stays alive. I am currently using CPU only, no GPU.
Does pdserving have this problem as well? The pdserving deployment documentation seems to have disappeared.
I am using the mobile model chinese_ocr_db_crnn_mobile (CPU version). Calling the model in tests also leaks memory: after roughly a dozen images it reaches about 8 GB of RAM. Is there a fix? (My test code fetches images via a `get_image(url)` helper.)
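To put numbers on the growth reported above, a minimal measurement sketch along these lines can be used. It assumes the PaddleHub `chinese_ocr_db_crnn_mobile` module with its `recognize_text` API, plus `psutil` and `opencv-python`; the image paths and loop count are placeholders, not from the original reports.

```python
# Minimal sketch: measure resident memory during repeated OCR calls.
# Assumes paddlehub (chinese_ocr_db_crnn_mobile), opencv-python, and psutil
# are installed; IMAGE_PATHS is a placeholder list of test images.
import os

import cv2
import paddlehub as hub
import psutil

IMAGE_PATHS = ["test_0.jpg", "test_1.jpg", "test_2.jpg"]  # placeholders

def rss_gb():
    """Resident set size of the current process, in GB."""
    return psutil.Process(os.getpid()).memory_info().rss / 1024 ** 3

ocr = hub.Module(name="chinese_ocr_db_crnn_mobile")
print(f"after init: {rss_gb():.2f} GB")

# Repeat the placeholder images to simulate continuous recognition (~60 calls).
for i, path in enumerate(IMAGE_PATHS * 20):
    image = cv2.imread(path)
    ocr.recognize_text(images=[image], use_gpu=False)
    print(f"after image {i + 1}: {rss_gb():.2f} GB")
```

If the printed RSS climbs monotonically across identical inputs and never settles, that matches the leak pattern described in this thread.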
I hit the same problem: mobile model on CPU, memory reaches 3 GB after the first call and then keeps growing.
From debugging and modifying the code I observed the following:
When the above two conditions are both met, GPU memory no longer grows.
Since you haven't replied for more than 3 months, we have closed this issue/pr. |
The problem hasn't even been solved, and it's already closed?
I ran into this too, on 2.2.1+cu111. Memory keeps climbing until all 64 GB is used up and the process gets killed by the system. Really frustrating.
On CPU, memory usage is a little over 300 MB after initialization, rises to 6 GB after recognizing a single image, is not released after recognition finishes, and keeps climbing from there.
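Until the leak itself is fixed upstream, one generic way to keep a long-running service from exhausting memory is to run recognition in a short-lived worker process, so the operating system reclaims everything when the worker exits. This is only a workaround sketch, not the fix discussed in PaddlePaddle/PaddleHub#682; the module name, `paths` argument, and placeholder image path are assumptions.

```python
# Workaround sketch: isolate inference in a short-lived worker process so the
# memory it allocates is returned to the OS on exit. Not the upstream fix.
import multiprocessing as mp

def ocr_worker(image_paths, queue):
    """Runs in a child process; everything it allocates is freed when it exits."""
    import paddlehub as hub  # import inside the worker to keep the parent light

    ocr = hub.Module(name="chinese_ocr_db_crnn_mobile")  # assumed module
    results = ocr.recognize_text(paths=image_paths, use_gpu=False)
    queue.put(results)

def recognize_in_subprocess(image_paths):
    queue = mp.Queue()
    proc = mp.Process(target=ocr_worker, args=(image_paths, queue))
    proc.start()
    results = queue.get()  # read results before joining to avoid a queue deadlock
    proc.join()
    return results

if __name__ == "__main__":
    print(recognize_in_subprocess(["test_0.jpg"]))  # placeholder image path
```

The trade-off is the per-call cost of reloading the model in each worker, so batching several images per worker process is usually preferable to spawning one process per image.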