
Query about speed of inference on semantic segmentation. #19

Open
zwbx opened this issue Mar 16, 2023 · 3 comments
Labels
enhancement New feature or request

Comments


zwbx commented Mar 16, 2023

Hi, thanks for your nice work.
I find inference with the semantic segmentation code a little slow: the speed is less than 1 task/s on one V100.
Is this normal, or is there anything I can do about it?
Thanks :-)

qinenergy added the enhancement (New feature or request) label Aug 24, 2023

daeunni commented Sep 10, 2023

Same here


daeunni commented Nov 25, 2023

@zwbx Hi, did you figure out the main reason?
I think it is because CoTTA runs inference on each image with multi-scale augmentation (scales like [0.5, 1.0, 1.5, ...]), so every image requires several forward passes instead of one.
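To make the cost concrete, here is a minimal sketch (not CoTTA's actual code; the scale list, class count, and helper names are hypothetical) of multi-scale test-time augmentation for segmentation: the model is run once per scale and the resized logits are averaged, so inference time grows roughly linearly with the number of scales.

```python
import numpy as np

SCALES = [0.5, 1.0, 1.5]  # scales mentioned in the thread; illustrative only


def resize_nearest(img, out_h, out_w):
    """Nearest-neighbour resize for a (C, H, W) array."""
    c, h, w = img.shape
    rows = (np.arange(out_h) * h // out_h).clip(0, h - 1)
    cols = (np.arange(out_w) * w // out_w).clip(0, w - 1)
    return img[:, rows[:, None], cols[None, :]]


def dummy_model(img):
    """Stand-in for a segmentation net: returns per-class logits (K, H, W)."""
    k = 3  # hypothetical number of classes
    return np.stack([img.mean(axis=0) * (i + 1) for i in range(k)])


def multiscale_inference(model, img):
    """Run the model at every scale, resize logits back, and average.

    Cost: len(SCALES) forward passes per image instead of one, which is
    why augmented inference is several times slower than plain inference.
    """
    _, h, w = img.shape
    logits_sum = np.zeros((3, h, w))
    for s in SCALES:
        scaled = resize_nearest(img, int(h * s), int(w * s))
        logits = model(scaled)                    # one forward pass per scale
        logits_sum += resize_nearest(logits, h, w)  # back to original size
    return logits_sum / len(SCALES)


img = np.random.rand(3, 8, 8)
avg_logits = multiscale_inference(dummy_model, img)
print(avg_logits.shape)  # (3, 8, 8)
```

If speed matters more than the augmentation's robustness benefit, shrinking the scale list (e.g. to a single scale of 1.0) should reduce inference time roughly proportionally.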


zwbx commented Nov 26, 2023 via email
