Merged
4 changes: 0 additions & 4 deletions paddle/fluid/inference/tensorrt/engine.cc
Expand Up @@ -196,11 +196,7 @@ bool TensorRTEngine::Enqueue(nvinfer1::IExecutionContext *context,
if (!with_dynamic_shape()) {
ret = context->enqueue(batch_size, buffers->data(), stream, nullptr);
} else {
#if IS_TRT_VERSION_GE(8500)
Contributor:
> This is the interface that supports TensorRT's enqueueV3; it must not be deleted.

Contributor Author (@co63oc, Oct 22, 2025):
> [image] There is a duplicated enqueueV3 call here @lizexu123

Contributor:
> If the TensorRT version is greater than 8.5 and we are using static shape, then by this logic wouldn't execution fall through to the enqueue interface?

Contributor Author:
> [image] Added a comment to make this easier to read.

ret = context->enqueueV3(stream);
#else
ret = context->enqueueV2(buffers->data(), stream, nullptr);
#endif
}
#endif
return ret;