
On Jetson Xavier NX with JetPack 5.0.2, using the nv-jetson-cuda11.4-cudnn8.4.1-trt8.4.1-jetpack5.0.2-xavier inference library, trt_int8 fails with an error #419

Open
zhizhongqu opened this issue Jan 18, 2023 · 1 comment
Labels: bug (Something isn't working)

@zhizhongqu

Inference script:
python3 infer_resnet.py --model_file=./resnet50/inference.pdmodel --params_file=./resnet50/inference.pdiparams --run_mode=trt_int8
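For reference, a minimal sketch of what the trt_int8 predictor setup in infer_resnet.py presumably looks like, written against the standard paddle.inference Python API; the batch size, input shape, and engine flags here are assumptions, and the actual demo script may differ:

import numpy as np
import paddle.inference as paddle_infer

# Assumed configuration; the real infer_resnet.py may use different flags.
config = paddle_infer.Config("./resnet50/inference.pdmodel",
                             "./resnet50/inference.pdiparams")
config.enable_use_gpu(256, 0)  # 256 MB initial GPU memory pool, GPU 0
config.enable_tensorrt_engine(
    workspace_size=1 << 30,
    max_batch_size=1,
    min_subgraph_size=3,
    precision_mode=paddle_infer.PrecisionType.Int8,
    use_static=False,
    use_calib_mode=True)  # int8 calibration pass, where the abort below is reported
predictor = paddle_infer.create_predictor(config)

# Run one dummy batch; this is the point at which the int8 calibration
# (RunCalibration in the traceback below) is triggered.
input_handle = predictor.get_input_handle(predictor.get_input_names()[0])
input_handle.reshape([1, 3, 224, 224])
input_handle.copy_from_cpu(np.random.rand(1, 3, 224, 224).astype("float32"))
predictor.run()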

Error output:
I0117 21:55:26.689865 23205 engine.cc:675] ====== engine info ======
terminate called after throwing an instance of 'phi::enforce::EnforceNotMet'
what(): (InvalidArgument) thread local var predictor_id_per_thread must be initialized to >= 0, but now predictor_id_per_thread = -1
[Hint: Expected predictor_id_per_thread > -1, but received predictor_id_per_thread:-1 <= -1:-1.] (at /home/paddle/data/xly/workspace/24116/Paddle/paddle/fluid/inference/tensorrt/engine.h:298)


C++ Traceback (most recent call last):

0 paddle::operators::TensorRTEngineOp::RunCalibration(paddle::framework::Scope const&, phi::Place const&) const::{lambda()#1}::operator()() const
1 paddle::operators::TensorRTEngineOp::PrepareTRTEngine(paddle::framework::Scope const&, paddle::inference::tensorrt::TensorRTEngine*) const
2 paddle::inference::tensorrt::OpConverter::ConvertBlockToTRTEngine(paddle::framework::BlockDesc*, paddle::framework::Scope const&, std::vector<std::string, std::allocator<std::string > > const&, std::unordered_set<std::string, std::hash<std::string >, std::equal_to<std::string >, std::allocator<std::string > > const&, std::vector<std::string, std::allocator<std::string > > const&, paddle::inference::tensorrt::TensorRTEngine*)
3 paddle::inference::tensorrt::TensorRTEngine::FreezeNetwork()
4 paddle::inference::tensorrt::TensorRTEngine::GetEngineInfo()
5 paddle::inference::tensorrt::TensorRTEngine::context()


Error Message Summary:

FatalError: Process abort signal is detected by the operating system.
[TimeInfo: *** Aborted at 1674021326 (unix time) try "date -d @1674021326" if you are using GNU date ***]
[SignalInfo: *** SIGABRT (@0x3e800005a85) received by PID 23173 (TID 0xffff31ccf1e0) from PID 23173 ***]

@vivienfanghuagood vivienfanghuagood self-assigned this Feb 5, 2024
@vivienfanghuagood vivienfanghuagood added the bug Something isn't working label Feb 5, 2024
@vivienfanghuagood
Collaborator

Hi, please update to the latest develop branch and check whether the problem still occurs.
