[Feature request] Deno Support #78
This seems to be a problem with onnxruntime itself: microsoft/onnxruntime#10913. The person who opened the issue did seem to find a workaround, so maybe that will help too? @josephrocca is quite active in the community, and since he opened that issue, he might have some more insight. |
So @nestarz created a Deno |
That's good to know - thanks! Have you tried overriding the dependency in your project? |
Not sure how. For example, this is my script file:

```js
import * as t from 'npm:@xenova/transformers'
const pipeline = t.default.pipeline
let pipe = await pipeline('sentiment-analysis')
let out = await pipe('I love transformers!')
```

This gets a new error, |
Maybe this might help: https://www.reddit.com/r/Deno/comments/x9q9vp/how_do_you_patch_a_module_like_patchpackage_in/ ? I haven't used Deno before, so, I probably won't be able to provide much more help (other than googling or "chatgpt-ing" 🤣 ) |
Have you been able to get it working in Deno yet? If not, feel free to try with the latest version 2.0.0-alpha.0. This may fix it. |
Hey, I just tried it using both npm: and esm.sh, and both gave an error.
npm:
|
It looks like sharp doesn't support Deno at the moment |
@omar2205 Have you tried the steps outlined here: lovell/sharp#2583 (comment) ? This is an apparent workaround until denoland/deno#16164 is made available. |
Thanks for the ping @xenova, I followed that and it worked; however, I'm met with:

```js
import { pipeline } from 'npm:@xenova/transformers';
/* warning (repeated 11 times)
napi_add_finalizer is not yet supported.
*/
let pipe = await pipeline('sentiment-analysis');
/* Error
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
Uncaught TypeError: Invalid URL: 'Xenova/distilbert-base-uncased-finetuned-sst-2-english/tokenizer.json'
*/
```
|
Okay great! That error is probably Deno-specific, because the check for whether something is a valid URL is wrapped in a try-catch block (and works in other environments). While I look into it, you can get around it temporarily by disabling local models:

```js
import { pipeline, env } from 'npm:@xenova/transformers';
env.allowLocalModels = false;
let pipe = await pipeline('sentiment-analysis');
```
|
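To make the failure mode above concrete, here is a minimal sketch (not the library's actual source) of the kind of try/catch URL-validity check being described. A bare model id like `Xenova/...` is not a valid absolute URL, so in most runtimes the check should simply return false rather than throw.

```javascript
// Sketch of a URL-validity check wrapped in try/catch: the URL
// constructor throws on strings that are not absolute URLs, and the
// catch converts that into a plain boolean.
function isValidUrl(string) {
  try {
    new URL(string);
    return true;
  } catch {
    return false;
  }
}

console.log(isValidUrl('https://huggingface.co/Xenova'));                 // true
console.log(isValidUrl('Xenova/distilbert-base-uncased/tokenizer.json')); // false
```

If a runtime throws something unexpected from a different layer (e.g. its caching API) rather than from the URL constructor, the error can surface outside this try/catch, which matches the behaviour reported in this thread.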
I tried to do that before, but it fails:
|
Hmm, it must be failing elsewhere then. Are you able to send the full stack trace? |
Here you go
|
Thanks! It looks like Deno has its own caching API, which only accepts HTTP URLs. I'll see what I can do. |
In the meantime, can you try disabling the cache system? You can do this by setting: |
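The specific flag was elided above; as a sketch, here is what disabling the cache could look like, assuming the `env.useBrowserCache` setting (that flag name is taken from a later working example in this thread, not from official documentation):

```javascript
import { pipeline, env } from 'npm:@xenova/transformers';

// Skip the browser-style Cache API lookup, which in Deno only accepts
// HTTP(S) URLs. The flag name here is an assumption based on a later
// comment in this thread.
env.useBrowserCache = false;

let pipe = await pipeline('sentiment-analysis');
```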
That fixed it. But we got another error 😅
|
|
Yeah looks like it. Is it possible to use this https://github.com/nestarz/onnx_runtime? |
If you'd like, you can fork this repo and replace the onnxruntime import with that? |
On second thought, I don't know how that would even work. It's written for Deno. |
They seem to provide an import syntax in their README:

```js
import * as ort from "https://deno.land/x/onnx_runtime/mod.ts";
```

You can edit the ONNX imports in https://github.com/xenova/transformers.js/blob/main/src/backends/onnx.js |
Maybe I'm missing something, but: Wouldn't it make more sense to use the browser version of onnxruntime for Deno? Deno is aimed at being strongly web-compatible, so running transformers.js/onnxruntime with node.js emulation (i.e. importing with |
Trying jsdelivr gives |
I set up a codespace and got it working:

```js
import { pipeline, env } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers';
// import { pipeline, env } from '@xenova/transformers';
env.useBrowserCache = false;
env.allowLocalModels = false;
let pipe = await pipeline('text-classification')
let output = await pipe('I love Transformers.js')
console.log(output)
```

Console output:

```
$ cat main.js | deno run --allow-all --unstable -
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
[ { label: "POSITIVE", score: 0.99961256980896 } ]
```

The only drawback with this approach is that you might experience a performance hit, since it's running with the WASM backend (and not native CPU). I'll look into why the tensor types are giving errors when running the native version. |
I got When I tried it in a deno deploy, I got hit with the memory limit. |
I don't quite know where Transformers.js would be throwing a DOMException 👀 Is it maybe an issue with your environment (since it works in that replit)?
Do you know what the memory limit is for deno deploy? Some sources suggest either 256MB or 512MB. The model you're testing with isn't that large. |
I believe it's 512 MB. I managed to get a QA bot going, deployed to Deno Deploy here, which was the demo I was going for. I don't know if you want to close this issue or wait for onnx to support Deno. |
Cool! I'll try to look into the previous issue and see if there's a workaround, since it might take a while for them to support Deno properly. I think it will also be good to create a Deno tutorial/example project, so I'll close the issue after I make that. |
@xenova I gave your codespace example a try locally with Deno 1.37.0 and I get the following:

```
No model specified. Using default model: "Xenova/distilbert-base-uncased-finetuned-sst-2-english".
error: Uncaught (in promise) NotSupported: Classic workers are not supported.
    at createWorker (ext:runtime/11_workers.js:41:14)
    at new Worker (ext:runtime/11_workers.js:110:16)
    at Object.yc (https://cdn.jsdelivr.net/npm/@xenova/transformers:34:7988)
    at Object.Cc (https://cdn.jsdelivr.net/npm/@xenova/transformers:34:8046)
    at de (https://cdn.jsdelivr.net/npm/@xenova/transformers:34:5797)
    at Ae (https://cdn.jsdelivr.net/npm/@xenova/transformers:34:9771)
    at <anonymous> (https://cdn.jsdelivr.net/npm/@xenova/[email protected]/dist/ort-wasm-simd-threaded.wasm:1:8324360)
    at <anonymous> (https://cdn.jsdelivr.net/npm/@xenova/[email protected]/dist/ort-wasm-simd-threaded.wasm:1:999116)
    at <anonymous> (https://cdn.jsdelivr.net/npm/@xenova/[email protected]/dist/ort-wasm-simd-threaded.wasm:1:1913327)
    at <anonymous> (https://cdn.jsdelivr.net/npm/@xenova/[email protected]/dist/ort-wasm-simd-threaded.wasm:1:4813118)
```

Any quick ideas? I'll happily dig in the meantime :) |
Thanks @xenova, I've got it running now with the following code:

```js
import { pipeline, env } from "https://cdn.jsdelivr.net/npm/@xenova/transformers";

env.backends.onnx.wasm.numThreads = 1;

const classifier = await pipeline("text-classification", "Xenova/distilbert-base-uncased-finetuned-sst-2-english");
const input = ["I love Transformers.js", "I hate Transformers.js", "I not sure about Transformers.js"];
const result = await classifier(input);
console.log(result);
```

then running in 1.37.1 with:

```
$ deno run -A main.ts
```
|
I am running into |
@devoutdrawai can you provide the code snippet you are using? Also, if you import the unminified version of the script (https://cdn.jsdelivr.net/npm/@xenova/[email protected]/dist/transformers.js), you'll see a more helpful error message. Regardless, I believe you'll run into issues, since Deno doesn't yet support OffscreenCanvas (denoland/deno#19533) |
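Since the failure mode above comes down to an OffscreenCanvas gap in Deno, a small runtime feature check can give a clearer error up front. This is a generic sketch, not part of the library:

```javascript
// Detect OffscreenCanvas support before attempting image pipelines,
// so the failure is explicit rather than buried in a library stack trace.
const hasOffscreenCanvas = typeof OffscreenCanvas !== 'undefined';

if (!hasOffscreenCanvas) {
  console.log('OffscreenCanvas is unavailable; image tasks will likely fail in this runtime.');
}
```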
@xenova Yeah I got |
Update from the sharp.js team about installation with Deno: lovell/sharp#3750 (comment) 🥳
It will be a good idea to revisit proper Deno support (instead of using the web/WASM version).
@xenova, I think that's a good idea too, to avoid paying the sometimes steep WASM price unless strictly necessary |
It seems v0.33 is now publicly available! |
Has this been working since transformers.js 2.9, where sharp was bumped to v0.33? |
transformers.js v2.13.1

Not sure if this should be a new issue, but it is also Deno-related. I've been using remote models without issue in a Supabase edge function (Deno environment); now I'm experimenting with using the same model locally and hitting errors related to the local filesystem. I'm getting an Invalid URL error.

I've had a pick through the source and don't think this should be happening if the 'fs' package is available, which AFAIK it should be in Deno. So I tried adding `env.FS = true` and got a different error, again apparently related to fs (or the lack thereof).

Folder structure is: embed.ts
|
@DuncanLHS Could you test if |
@xenova Yes, but before I do, I've just spotted that any Node packages available in Deno need to use the 'node:' specifier. |
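The 'node:' specifier point above can be verified with a one-liner; this is a generic illustration (not code from the thread), and the same form works in modern Node as well as Deno:

```javascript
// Deno resolves Node built-ins only through the explicit 'node:' scheme;
// a bare `import ... from 'fs'` is not resolved the same way there.
import { existsSync } from 'node:fs';

console.log(existsSync('.')); // the current directory always exists
```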
Also note that filesystem access is disabled when loading the library from a CDN, so you would need to wait until the library functions correctly with Deno. |
OK, I'll move the functionality into node for now. |
The alpha is working in Deno: https://huggingface.co/posts/Xenova/681836693682285

Working example:

```
DENO_FUTURE=1 deno add npm:@huggingface/[email protected]
```

```ts
// main.ts
import { pipeline } from "@huggingface/transformers";

async function main() {
  const generateEmbeddings = await pipeline("feature-extraction");
  const embeddings = await generateEmbeddings("Hello, World!");
  console.log(embeddings);
}

if (import.meta.main) {
  main();
}
```

```
DENO_FUTURE=1 deno run -RWN --allow-ffi --deny-env main.ts
```
Unfortunately only able to use |
Name of the feature
Support running in the Deno runtime.

Additional context
I tried all the tricks (`esm.sh`, `unpkg`, `esm.run`, and `npm:@xenova/transformers`), but nothing worked.

Using the unpkg import I get
Using esm.sh