
inference time #1

Open
mamineayari opened this issue Apr 7, 2017 · 2 comments

@mamineayari

Hi,
Thanks for sharing this code.
Since I don't have MATLAB to play with it, I'd like to know the processing time (inference time) it would take for a test image.

Thanks in advance,

@Liusifei
Owner

Liusifei commented Apr 7, 2017

It is 159 ms on CPU and 67 ms on a Titan X GPU.

Sorry, I made a mistake: that was the time for one mini-batch, which contains 40 images.
When I test single-image inference with pycaffe, it is around 2.75 ms on a Titan (Pascal). Super fast.
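
For anyone who wants to reproduce this kind of measurement, here is a minimal pycaffe timing sketch. It is only an illustration: the `deploy.prototxt` path is a placeholder for whatever deploy definition the repo provides, and the caffemodel name is the one discussed later in this thread.

```python
import time

import numpy as np
import caffe

# Hypothetical paths: 'deploy.prototxt' is a placeholder; the caffemodel
# name is the one mentioned elsewhere in this thread.
DEPLOY_PROTOTXT = 'deploy.prototxt'
CAFFEMODEL = 'face_parsing_v1_iter_20800.caffemodel'

caffe.set_mode_gpu()
net = caffe.Net(DEPLOY_PROTOTXT, CAFFEMODEL, caffe.TEST)

# Force a batch size of 1 so we time a single image, not a mini-batch.
_, c, h, w = net.blobs['data'].data.shape
net.blobs['data'].reshape(1, c, h, w)
net.reshape()

# Random input is enough for timing purposes.
net.blobs['data'].data[...] = np.random.rand(1, c, h, w).astype(np.float32)

# Warm-up pass so GPU initialization is not included in the measurement.
net.forward()

runs = 100
start = time.time()
for _ in range(runs):
    net.forward()
elapsed_ms = (time.time() - start) / runs * 1000
print('average forward time per image: %.2f ms' % elapsed_ms)
```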

@mamineayari
Author

Thanks,

The model file "face_parsing_v1_iter_20800.caffemodel" is missing. Is it possible to share it?
