
[Bug] image segmentation very slow #79

Closed
leonardoFu opened this issue Apr 11, 2023 · 8 comments · Fixed by #545
Labels
bug Something isn't working

Comments

@leonardoFu

I'm trying to use transformers.js with Meta AI's SAM; it takes over half a minute to segment a 540p picture.

How to reproduce

Expected behavior

Maybe faster?

Logs/screenshots
[screenshot of logs]

Environment

  • Transformers.js version: 1.4.2
  • Browser (if applicable): Edge 112.0.1722.34
  • Operating system (if applicable): MacOS 13.2.1
  • Other:

Additional context

get warning from console:
`onnxruntime-node` was not found. Using `onnxruntime-web` as a fallback. We recommend installing `onnxruntime-node` as it generally improves performance (up to 5X).

Source Code

import { pipeline } from '@xenova/transformers';

(async function main() {
  performance.mark('segmentation');
  const segmenter = await pipeline('image-segmentation', 'facebook/detr-resnet-50-panoptic');
  const outputs = await segmenter('https://cdn.pixabay.com/photo/2017/02/20/18/03/cat-2083492__340.jpg');
  performance.mark('segmentation-end');
  console.log(performance.measure('segmentation-elapse', 'segmentation', 'segmentation-end'));

  console.log(outputs);

  const cat = outputs.find(out => out.label === 'cat');
  const image = new ImageData(cat.mask.width, cat.mask.height);
  const canvas = document.createElement('canvas');
  canvas.width = cat.mask.width;
  canvas.height = cat.mask.height;
  const ctx = canvas.getContext('2d');
  // Paint mask pixels (value 255) in tomato red; leave the rest transparent.
  cat.mask.data.forEach((d, index) => {
    const offset = index * 4;
    if (d === 255) {
      image.data.set([255, 99, 71, 255], offset);
    }
  });
  ctx.putImageData(image, 0, 0);
  document.documentElement.append(canvas);
})();

@leonardoFu leonardoFu added the bug Something isn't working label Apr 11, 2023
@kungfooman
Contributor

Usually the first thing I do with performance issues is to look at the exact functions: https://developer.chrome.com/docs/devtools/performance/

Can you post what you see there? It samples everything and gives you a rather clear picture of where it could be optimized.

@xenova
Collaborator

xenova commented Apr 11, 2023

Right, so the main issue is that the model is currently running on the CPU (instead of the GPU). This is because:

  1. onnxruntime-web does not yet support the WebGPU backend (follow the PR here)
  2. When WebGPU is supported, only Chrome currently has it enabled by default (with other browsers coming soon)

TLDR: wait a week or two until WebGPU is supported! 🤣
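In the meantime, WebGPU availability can be feature-detected from script. A minimal sketch (my own, not from this thread), assuming only the standard `navigator.gpu` entry point:

```javascript
// Hedged sketch: detect whether this runtime exposes WebGPU before
// assuming the model can run on the GPU. `navigator.gpu` is the
// standard entry point and is undefined where WebGPU is unavailable.
function hasWebGPU() {
  return typeof navigator !== 'undefined' && navigator.gpu !== undefined;
}

console.log(hasWebGPU() ? 'WebGPU available' : 'falling back to CPU/WASM');
```

In browsers without WebGPU enabled (and in Node), this returns false and the CPU/WASM path described above is what actually runs.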

@xenova
Collaborator

xenova commented Apr 11, 2023

One thing I do find strange, however, is that it thinks you're running in Node. Somehow, the `process` global variable is defined. I tried in Edge, but I did not encounter that issue. Perhaps it's because you're running Edge on macOS?

@leonardoFu
Author

Usually the first thing I do with performance issues is to look at the exact functions: https://developer.chrome.com/docs/devtools/performance/

Can you post what you see there? It samples everything and you get a rather clear picture where it could be optimized.

[screenshot of performance profile]

@leonardoFu
Author

One thing I do find strange, however, is that it thinks you're running in Node? Somehow, the process global variable is defined. I tried in Edge, but I did not encounter that issue. Perhaps it's because you're running Edge with MacOS?

I don't know where `process` is defined (it's not from user code); maybe it comes from the Parcel bundler. But checking for `process` may not be a reliable way to detect the runtime environment. How about checking `process.versions.node` instead?

In this case, `process` is defined like:
[screenshot of the `process` object]
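A minimal sketch of that stricter check (the function name is my own, not from transformers.js):

```javascript
// Hedged sketch: `process.versions.node` is only set by a real Node.js
// runtime, whereas bundlers like Parcel often shim a bare `process`
// object, so this check is less likely to misfire in the browser.
function isRunningInNode() {
  return typeof process !== 'undefined' &&
    process.versions != null &&
    typeof process.versions.node === 'string';
}
```

Under a bundler shim that defines `process` without `versions.node`, this returns false, so the library would correctly fall through to the web backend.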

@kungfooman
Contributor

Bundlers can introduce all kinds of issues; that's why I changed my personal projects to use ES6 import / import-map syntax, which is supported by modern browsers. For older browsers you can still build single-file bundles, so at least the bundling process isn't interfering with your developer experience (a simple Ctrl+R vs. running bundle/watch commands).

@xenova
Collaborator

xenova commented Apr 12, 2023

Bundlers can introduce all kinds of issues, that's why I changed my personal projects to use ES6 import / import-maps syntax, which is supported by modern browsers. And for older browsers you can still build single-file-bundles, so at least the bundle process is not messing with you developer experience (a simple ctrl+r vs. calling starting bundle/watch commands).

Do you suggest the project migrate to ES6 syntax? I've been wanting to do it for a while to be honest.

@kungfooman
Contributor

Do you suggest the project migrate to ES6 syntax? I've been wanting to do it for a while to be honest.

I feel a bit uncertain, because onnxruntime only ships with ES5 files e.g. here:

https://github.com/microsoft/onnxruntime/blob/main/js/web/lib/onnxjs/backends/webgl/utils.ts

I found some ES6 files in a JSFiddle here, but I don't know how to make use of them yet:

await import("https://cdn.jsdelivr.net/npm/[email protected]/+esm")

[screenshot]

I didn't read/test too much so far due to lack of time... usually you can just import ES5 files from ES6, but ONNX may be a bit more challenging/sophisticated (on account of its different backends, like WASM), so simple assumptions can be wrong.
