Support React Native #118

Open
wants to merge 145 commits into base: main

Commits (145)
1aefcb6
Support for ReactNative
hans00 May 17, 2023
38acc1e
Correct module import
hans00 May 17, 2023
0d7faf0
Update package-lock
hans00 May 17, 2023
c1da2c8
fix errors
hans00 May 17, 2023
bb550f1
fix error
hans00 May 17, 2023
35d8f18
Merge branch 'xenova:main' into main
hans00 May 17, 2023
83cd1e6
Prevent fallback WASM on RN
hans00 May 17, 2023
f64e61b
Let native side load binary file, instead load as buffer
hans00 May 17, 2023
85d75ef
Fix 0 size tensor
hans00 May 17, 2023
392dc67
Correct version
hans00 May 17, 2023
c9cada9
Constantly fetch arraybuffer
hans00 May 17, 2023
20f128f
Replace `Uint8Array` as `Buffer` on RN
hans00 May 17, 2023
891a0a4
Revert version
hans00 May 17, 2023
fd302ae
Merge native codes
hans00 May 19, 2023
37e89cf
Fix test
hans00 May 19, 2023
9ee7504
Disable `image-encode` and `image-decode` for web
hans00 May 19, 2023
91ecc47
Sync code
hans00 May 19, 2023
17bc3e8
Use `RNFS.downloadFile` to avoid OOM on large model file.
hans00 May 19, 2023
58c2468
Correct download progress order
hans00 May 19, 2023
52be4db
Fix bug
hans00 May 19, 2023
e0683b3
Merge branch 'xenova:main' into main
hans00 May 19, 2023
362a0c7
Fix error
hans00 May 19, 2023
c633a2d
Use `interpolate_data` instead `resize-image-data`
hans00 May 19, 2023
798edec
Correct params
hans00 May 19, 2023
8c66b31
Support `react-native-gcanvas` to improve image parse performance
hans00 May 19, 2023
874340c
Correct algorithm impl
hans00 May 20, 2023
e9c2872
Use `interpolate_data` to resize
hans00 May 20, 2023
7fad900
Reuse code
hans00 May 20, 2023
327fb59
`ImageData` not need wrap with `Uint8ClampedArray`
hans00 May 20, 2023
ac85425
Add switch to enable use GCanvas
hans00 May 20, 2023
7b9232f
Support gCanvas on `resize`, `crop` and `pad`
hans00 May 21, 2023
47711c8
Force use latest `jpeg-js`
hans00 May 21, 2023
c2f01bd
Use `XRegExp` to support unicode on RN
hans00 May 21, 2023
185c58f
Revert "Use `XRegExp` to support unicode on RN"
hans00 May 21, 2023
e86687d
Add missing var
hans00 May 21, 2023
b7521c5
Set `useGCanvas` default true, instead env check
hans00 May 21, 2023
1730bea
Log full tensor on browser
hans00 May 21, 2023
ff64922
Merge branch 'xenova:main' into main
hans00 May 22, 2023
0e54440
Update package-lock.json
hans00 May 22, 2023
3f32b71
Use dynamic import to select backend
hans00 May 25, 2023
fd0fa22
Add `onnxruntime-react-native` in optional
hans00 May 25, 2023
3edb592
Set browser ignore
hans00 May 25, 2023
4f6b788
Fix runtime setup
hans00 May 25, 2023
0ac4b4e
Default export `onnxruntime-common`
hans00 May 25, 2023
0a260c5
Add missing var
hans00 May 25, 2023
7b2667b
Fix `browser` field for `react-native`
hans00 May 25, 2023
28e8f8d
Fix import
hans00 May 25, 2023
8069844
Fix build
hans00 May 25, 2023
a5682dd
Allow fallback `web` on node-like environment
hans00 May 25, 2023
77ac38e
Merge branch 'main' into merge
hans00 May 30, 2023
57e4bc8
Merge branch 'main' into merge
hans00 Jun 2, 2023
eee34a4
Merge branch 'xenova:main' into merge
hans00 Jun 30, 2023
e09f2d9
Correct variable
hans00 Jul 4, 2023
9e7d629
Fix error on load model
hans00 Jul 4, 2023
311cb9c
Merge branch 'main' into merge
hans00 Jul 12, 2023
6f85b28
Merge branch 'main' into merge
hans00 Jul 29, 2023
f27121d
Merge branch 'main' into merge
hans00 Aug 6, 2023
973f33a
Correct package.json
hans00 Aug 7, 2023
d70abf9
Export `isReady` promise
hans00 Aug 7, 2023
44f0668
Merge branch 'main' into merge
hans00 Aug 8, 2023
dff8dbb
Use `react-native` field to replace module
hans00 Aug 15, 2023
7d49523
Merge branch 'xenova:main' into merge
hans00 Aug 18, 2023
4b63d92
Add missing optionalDeps
hans00 Aug 22, 2023
2050020
Merge branch 'main' into merge
hans00 Aug 23, 2023
d6d42dc
Fix config file never cache
hans00 Aug 23, 2023
8aa2993
Merge branch 'xenova:main' into merge
hans00 Sep 8, 2023
f98160f
Fix missing var
hans00 Sep 8, 2023
214ba14
Avoid use preserve word
hans00 Sep 11, 2023
7fe660b
Support `OffscreenCanvas` polyfill without `document`
hans00 Oct 3, 2023
b95a836
Merge branch 'main' into merge
hans00 Oct 3, 2023
d14ec8e
Fix `type` is missing
hans00 Oct 12, 2023
d49b0d0
Merge branch 'xenova:main' into merge
hans00 Oct 24, 2023
2fdf3e6
Opt-out wasm for RN and wait runtime loaded
hans00 Oct 30, 2023
ee89374
Merge branch 'main' into merge
hans00 Nov 10, 2023
95526e0
Merge branch 'xenova:main' into merge
hans00 Nov 16, 2023
2d54b61
Merge branch 'main' into merge
hans00 Dec 13, 2023
73b9131
Merge branch 'main' into merge
hans00 Dec 24, 2023
24726fa
Update `package-lock.json`
hans00 Dec 24, 2023
880348a
Fix error
hans00 Dec 24, 2023
1210102
Fix fetch binary for TTS
hans00 Dec 24, 2023
27e8434
Fix error when file cached
hans00 Dec 24, 2023
b16ad4c
Merge branch 'xenova:main' into merge
hans00 Dec 31, 2023
c5bbba5
Merge branch 'main' into merge
hans00 Jan 3, 2024
ad60a9b
Merge branch 'main' into merge
hans00 Jan 14, 2024
b42d06c
break trying load if backend error
hans00 Jan 17, 2024
2e15702
Fix web support
hans00 Jan 17, 2024
a94b45a
Bump onnxruntime
hans00 Jan 17, 2024
4a8a5bb
Remove `isReady` check
hans00 Jan 17, 2024
af1d6f6
Only continue on `Unsupported model type`
hans00 Jan 17, 2024
8a234c5
Continue on file 404 error
hans00 Jan 17, 2024
566ec32
Fix error on node
hans00 Jan 17, 2024
6e3affa
Exclude `wasm` for RN
hans00 Jan 18, 2024
79979e8
Merge branch 'xenova:main' into merge
hans00 Feb 1, 2024
c7aa59f
Merge branch 'main' into merge
hans00 Feb 13, 2024
3592613
Update `package-lock.json`
hans00 Feb 13, 2024
685b718
Support decode wav
hans00 Feb 17, 2024
2a35254
add `node-wav`
hans00 Feb 23, 2024
718c3d7
Merge branch 'main' into merge
hans00 Feb 23, 2024
755f5ba
Fix `package.json`
hans00 Feb 24, 2024
95faa44
Bump onnxruntime
hans00 Feb 24, 2024
c0dda59
Update `package-lock.json`
hans00 Feb 24, 2024
7fb93d9
Fix typo
hans00 Feb 24, 2024
d4d6ebf
Support Tensor of `[email protected]`
hans00 Feb 24, 2024
5a0d504
Correct algorithm implement
hans00 Feb 24, 2024
8cd25a4
Correct resample factor behavior
hans00 Feb 24, 2024
b597fa4
Rename `useGCanvas` to `useRNCanvas`
hans00 Feb 24, 2024
f7222ec
Better naming, change `useRNCanvas` to `rnUseCanvas`
hans00 Feb 24, 2024
5a23e2f
Fix `Tensor` index getter
hans00 Feb 25, 2024
a089ef0
Fix not work with spread transform
hans00 Feb 25, 2024
306b208
Add dispose to release memory
hans00 Feb 25, 2024
431983b
Support `crop` for ReactNative
hans00 Feb 25, 2024
060a2e5
Change to `image-codecs`
hans00 Feb 25, 2024
7ad4ef2
Setup `image-codecs`
hans00 Feb 25, 2024
b4a4bce
Correct URI format
hans00 Feb 25, 2024
9b7faf9
Fix bug
hans00 Feb 25, 2024
d697e77
Fix error process
hans00 Feb 28, 2024
173cc09
Correct error process
hans00 Feb 28, 2024
0d31927
Merge branch 'main' into merge
hans00 Mar 6, 2024
eb56058
Update package-lock.json
hans00 Mar 6, 2024
1a2bc57
Merge branch 'main' into merge
hans00 Mar 8, 2024
3dd8855
Disable WASM configure for React Native
hans00 Mar 18, 2024
b41e23c
Move `readFile` into `FileResponse`
hans00 Mar 26, 2024
840fa5c
Support load model from local path for RN & Node.js
hans00 Mar 26, 2024
9e15f15
Merge branch 'main' into merge
hans00 Mar 26, 2024
d81b459
Fix missing function
hans00 Mar 27, 2024
333c547
Update package-lock
hans00 Mar 27, 2024
0a29022
Fix missing var
hans00 Mar 27, 2024
dc05218
Use `fetch` for non-RN env
hans00 Mar 27, 2024
243f7fd
Cleanup invalid doc
hans00 Mar 27, 2024
bad3882
Add `session_options`
hans00 Apr 3, 2024
4098dd0
Fix `session_options` not work
hans00 Apr 3, 2024
800a476
Use same option for path load
hans00 Apr 3, 2024
c7fdd36
Fix `session_options`
hans00 Apr 3, 2024
c9bed9c
Correct logic
hans00 Apr 3, 2024
f835060
Fix model load
hans00 Apr 3, 2024
fbd7136
Return file path if not RN
hans00 Apr 4, 2024
6da74ff
Prevent readFile when get model path
hans00 Apr 8, 2024
3a3fa29
Merge branch 'main' into merge
hans00 Apr 20, 2024
22cbfe9
Update package-lock
hans00 Apr 20, 2024
c986a8c
Merge branch 'main' into merge
hans00 May 13, 2024
8df376b
Fix syntax
hans00 May 13, 2024
9f30f1b
Merge branch 'main' into merge
hans00 Jun 10, 2024
9109355
Fix model file download
hans00 Jun 18, 2024
918950e
Merge branch 'main' into merge
hans00 Jul 22, 2024
9015d03
Merge branch 'main' into merge
hans00 Oct 20, 2024
18 changes: 18 additions & 0 deletions package.json
@@ -62,10 +62,18 @@
"homepage": "https://github.com/huggingface/transformers.js#readme",
"dependencies": {
"@huggingface/jinja": "^0.3.0",
"image-codecs": "^0.1.0",
"onnxruntime-node": "1.19.2",
"onnxruntime-web": "1.20.0-dev.20241016-2b8fc5529b",
"onnxruntime-react-native": "1.19.2",
"sharp": "^0.33.5"
},
"optionalDependencies": {
"path-browserify": "^1.0.1"
},
"peerDependencies": {
"react-native-fs": "*"
},
"devDependencies": {
"@types/jest": "^29.5.1",
"@webgpu/types": "^0.1.44",
@@ -87,8 +95,18 @@
"README.md",
"LICENSE"
],
"react-native": {
"fs": "react-native-fs",
"onnxruntime-web": false,
"onnxruntime-node": "onnxruntime-react-native",
"image-codecs": "image-codecs",
"sharp": false,
"path": "path-browserify",
"stream/web": false
},
"browser": {
"fs": false,
"image-codecs": false,
"path": false,
"url": false,
"sharp": false,
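
The new `react-native` field works like the existing `browser` field: bundlers that honor it (such as Metro) swap or stub modules per platform, so `fs` becomes `react-native-fs`, `onnxruntime-node` becomes `onnxruntime-react-native`, and `sharp` and `onnxruntime-web` never reach the RN bundle. A minimal resolver sketch, assuming a simplified lookup rather than a real bundler's algorithm:

// resolve-rn.js — illustrative only; actual bundler resolution is more involved.
const pkg = require('./package.json');

function resolveForReactNative(specifier) {
    const mapping = pkg['react-native'] ?? {};
    if (!(specifier in mapping)) return specifier;   // not remapped
    const target = mapping[specifier];
    return target === false ? null : target;         // `false` means "stub with an empty module"
}

console.log(resolveForReactNative('onnxruntime-node')); // 'onnxruntime-react-native'
console.log(resolveForReactNative('fs'));               // 'react-native-fs'
console.log(resolveForReactNative('sharp'));            // null — excluded on RN
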
12 changes: 9 additions & 3 deletions src/backends/onnx.js
@@ -65,6 +65,12 @@ if (ORT_SYMBOL in globalThis) {
// If the JS runtime exposes their own ONNX runtime, use it
ONNX = globalThis[ORT_SYMBOL];

} else if (apis.IS_REACT_NATIVE_ENV) {
ONNX = ONNX_NODE.default ?? ONNX_NODE;

supportedDevices.push('xnnpack', 'cpu');
defaultDevices = ['xnnpack', 'cpu'];

} else if (apis.IS_NODE_ENV) {
ONNX = ONNX_NODE.default ?? ONNX_NODE;

@@ -146,19 +152,19 @@ let wasmInitPromise = null;

/**
* Create an ONNX inference session.
* @param {Uint8Array} buffer The ONNX model buffer.
* @param {Uint8Array|string} bufferOrPath The ONNX model buffer, or a path to the model file.
* @param {import('onnxruntime-common').InferenceSession.SessionOptions} session_options ONNX inference session options.
* @param {Object} session_config ONNX inference session configuration.
* @returns {Promise<import('onnxruntime-common').InferenceSession & { config: Object}>} The ONNX inference session.
*/
export async function createInferenceSession(buffer, session_options, session_config) {
export async function createInferenceSession(bufferOrPath, session_options, session_config) {
if (wasmInitPromise) {
// A previous session has already initialized the WASM runtime
// so we wait for it to resolve before creating this new session.
await wasmInitPromise;
}

const sessionPromise = InferenceSession.create(buffer, session_options);
const sessionPromise = InferenceSession.create(bufferOrPath, session_options);
wasmInitPromise ??= sessionPromise;
const session = await sessionPromise;
session.config = session_config;
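
On React Native the backend resolves to `onnxruntime-react-native` with execution providers `xnnpack` and `cpu`, and `createInferenceSession` now also accepts a file path so the native runtime can open the model itself. A hedged usage sketch — the `device` value and the model name are illustrative assumptions, not taken from this diff:

import { pipeline } from '@huggingface/transformers';

// In a React Native app, the default devices become ['xnnpack', 'cpu']
// (per the IS_REACT_NATIVE_ENV branch added above).
const classifier = await pipeline(
    'text-classification',
    'Xenova/distilbert-base-uncased-finetuned-sst-2-english',
    { device: 'xnnpack' } // assumed to be accepted now that it is in supportedDevices
);
const result = await classifier('Transformers.js running on React Native!');
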
18 changes: 14 additions & 4 deletions src/env.js
@@ -40,6 +40,8 @@ const IS_NODE_ENV = IS_PROCESS_AVAILABLE && process?.release?.name === 'node';
const IS_FS_AVAILABLE = !isEmpty(fs);
const IS_PATH_AVAILABLE = !isEmpty(path);

const IS_REACT_NATIVE_ENV = typeof navigator !== 'undefined' && navigator.product === 'ReactNative';

/**
* A read-only object containing information about the APIs available in the current environment.
*/
@@ -50,6 +52,9 @@ export const apis = Object.freeze({
/** Whether we are running in a web worker environment */
IS_WEBWORKER_ENV,

/** Whether we are running in a React Native environment */
IS_REACT_NATIVE_ENV,

/** Whether the Cache API is available */
IS_WEB_CACHE_AVAILABLE,

@@ -73,9 +78,12 @@ });
});

const RUNNING_LOCALLY = IS_FS_AVAILABLE && IS_PATH_AVAILABLE;
const dirname__ = RUNNING_LOCALLY
? path.dirname(path.dirname(url.fileURLToPath(import.meta.url)))
: './';
let dirname__ = './';
if (IS_REACT_NATIVE_ENV) {
dirname__ = fs.DocumentDirectoryPath;
} else if (RUNNING_LOCALLY) {
dirname__ = path.dirname(path.dirname(url.fileURLToPath(import.meta.url)));
}

// Only used for environments with access to file system
const DEFAULT_CACHE_DIR = RUNNING_LOCALLY
@@ -117,7 +125,7 @@ export const env = {
/////////////////// Backends settings ///////////////////
// NOTE: These will be populated later by the backends themselves.
backends: {
// onnxruntime-web/onnxruntime-node
// onnxruntime-web/onnxruntime-node/onnxruntime-react-native
onnx: {},
},

@@ -130,6 +138,8 @@ export const env = {
localModelPath: localModelPath,
useFS: IS_FS_AVAILABLE,

rnUseCanvas: true,

/////////////////// Cache settings ///////////////////
useBrowserCache: IS_WEB_CACHE_AVAILABLE,

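
React Native is detected via `navigator.product === 'ReactNative'`, and the default model cache moves to the app's document directory (`fs.DocumentDirectoryPath` from react-native-fs). The settings remain overridable; the snippet below is a hypothetical consumer-side configuration, not part of the PR:

import { env } from '@huggingface/transformers';

// Relocate the cache (hypothetical path); on RN the default already lives
// under the app's document directory.
env.cacheDir = `${env.cacheDir}/transformers-cache`;

// `rnUseCanvas` (added above) presumably toggles the react-native-gcanvas
// image-processing path; disable it if that native module is not installed.
env.rnUseCanvas = false;
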
158 changes: 104 additions & 54 deletions src/models.js
@@ -68,6 +68,7 @@ import {

import {
getModelFile,
getModelPath,
getModelJSON,
} from './utils/hub.js';

@@ -117,6 +118,8 @@ import { apis } from './env.js';
import { WhisperGenerationConfig } from './models/whisper/generation_whisper.js';
import { whisper_language_to_code } from './models/whisper/common_whisper.js';

const IS_BROWSER = typeof navigator !== 'undefined' && navigator.product !== 'ReactNative';

//////////////////////////////////////////////////
// Model types: used internally
const MODEL_TYPES = {
@@ -146,7 +149,7 @@ const MODEL_CLASS_TO_NAME_MAPPING = new Map();
* @param {string} pretrained_model_name_or_path The path to the directory containing the model file.
* @param {string} fileName The name of the model file.
* @param {import('./utils/hub.js').PretrainedModelOptions} options Additional options for loading the model.
* @returns {Promise<{buffer: Uint8Array, session_options: Object, session_config: Object}>} A Promise that resolves to the data needed to create an InferenceSession object.
* @returns {Promise<{bufferOrPath: Uint8Array|string, session_options: Object, session_config: Object}>} A Promise that resolves to the data needed to create an InferenceSession object.
* @private
*/
async function getSession(pretrained_model_name_or_path, fileName, options) {
@@ -223,63 +226,103 @@ async function getSession(pretrained_model_name_or_path, fileName, options) {
);
}

const bufferPromise = getModelFile(pretrained_model_name_or_path, modelFileName, true, options);
let bufferOrPath;

// handle onnx external data files
const use_external_data_format = options.use_external_data_format ?? custom_config.use_external_data_format;
/** @type {Promise<{path: string, data: Uint8Array}>[]} */
let externalDataPromises = [];
if (use_external_data_format && (
use_external_data_format === true ||
(
typeof use_external_data_format === 'object' &&
use_external_data_format.hasOwnProperty(fileName) &&
use_external_data_format[fileName] === true
)
)) {
if (apis.IS_NODE_ENV) {
throw new Error('External data format is not yet supported in Node.js');
if (apis.IS_NODE_ENV || apis.IS_REACT_NATIVE_ENV) {

const pathPromise = getModelPath(pretrained_model_name_or_path, modelFileName, true, options);
// handle onnx external data files
const use_external_data_format = options.use_external_data_format ?? custom_config.use_external_data_format;
/** @type {Promise<{path: string, data: string}>[]} */
let externalDataPromises = [];
if (use_external_data_format && (
use_external_data_format === true ||
(
typeof use_external_data_format === 'object' &&
use_external_data_format.hasOwnProperty(fileName) &&
use_external_data_format[fileName] === true
)
)) {
const path = `${fileName}${suffix}.onnx_data`;
const fullPath = `${options.subfolder ?? ''}/${path}`;
externalDataPromises.push(new Promise(async (resolve, reject) => {
const data = await getModelPath(pretrained_model_name_or_path, fullPath, true, options);
resolve({ path, data })
}));

} else if (session_options.externalData !== undefined) {
externalDataPromises = session_options.externalData.map(async (ext) => {
// if the external data is a string, fetch the file and replace the string with its content
if (typeof ext.data === "string") {
const ext_buffer = await getModelPath(pretrained_model_name_or_path, ext.data, true, options);
return { ...ext, data: ext_buffer };
}
return ext;
});
}
const path = `${fileName}${suffix}.onnx_data`;
const fullPath = `${options.subfolder ?? ''}/${path}`;
externalDataPromises.push(new Promise(async (resolve, reject) => {
const data = await getModelFile(pretrained_model_name_or_path, fullPath, true, options);
resolve({ path, data })
}));

} else if (session_options.externalData !== undefined) {
externalDataPromises = session_options.externalData.map(async (ext) => {
// if the external data is a string, fetch the file and replace the string with its content
if (typeof ext.data === "string") {
const ext_buffer = await getModelFile(pretrained_model_name_or_path, ext.data, true, options);
return { ...ext, data: ext_buffer };
}
return ext;
});
}

if (externalDataPromises.length > 0) {
session_options.externalData = await Promise.all(externalDataPromises);
}
if (externalDataPromises.length > 0) {
session_options.externalData = await Promise.all(externalDataPromises);
}

if (selectedDevice === 'webgpu') {
const shapes = getKeyValueShapes(options.config, {
prefix: 'present',
});
if (Object.keys(shapes).length > 0 && !isONNXProxy()) {
// Only set preferredOutputLocation if shapes are present and we aren't proxying ONNX
/** @type {Record<string, import('onnxruntime-common').Tensor.DataLocation>} */
const preferredOutputLocation = {};
for (const key in shapes) {
preferredOutputLocation[key] = 'gpu-buffer';
bufferOrPath = await pathPromise;
} else {

const bufferPromise = getModelFile(pretrained_model_name_or_path, modelFileName, true, options);

// handle onnx external data files
const use_external_data_format = options.use_external_data_format ?? custom_config.use_external_data_format;
/** @type {Promise<{path: string, data: Uint8Array}>[]} */
let externalDataPromises = [];
if (use_external_data_format && (
use_external_data_format === true ||
(
typeof use_external_data_format === 'object' &&
use_external_data_format.hasOwnProperty(fileName) &&
use_external_data_format[fileName] === true
)
)) {
const path = `${fileName}${suffix}.onnx_data`;
const fullPath = `${options.subfolder ?? ''}/${path}`;
externalDataPromises.push(new Promise(async (resolve, reject) => {
const data = await getModelFile(pretrained_model_name_or_path, fullPath, true, options);
resolve({ path, data })
}));

} else if (session_options.externalData !== undefined) {
externalDataPromises = session_options.externalData.map(async (ext) => {
// if the external data is a string, fetch the file and replace the string with its content
if (typeof ext.data === "string") {
const ext_buffer = await getModelFile(pretrained_model_name_or_path, ext.data, true, options);
return { ...ext, data: ext_buffer };
}
return ext;
});
}

if (externalDataPromises.length > 0) {
session_options.externalData = await Promise.all(externalDataPromises);
}

if (selectedDevice === 'webgpu') {
const shapes = getKeyValueShapes(options.config, {
prefix: 'present',
});
if (Object.keys(shapes).length > 0 && !isONNXProxy()) {
// Only set preferredOutputLocation if shapes are present and we aren't proxying ONNX
/** @type {Record<string, import('onnxruntime-common').Tensor.DataLocation>} */
const preferredOutputLocation = {};
for (const key in shapes) {
preferredOutputLocation[key] = 'gpu-buffer';
}
session_options.preferredOutputLocation = preferredOutputLocation;
}
session_options.preferredOutputLocation = preferredOutputLocation;
}
}

const buffer = await bufferPromise;
bufferOrPath = await bufferPromise;
}

return { buffer, session_options, session_config };
return { bufferOrPath, session_options, session_config };
}

/**
@@ -294,8 +337,8 @@ async function getSession(pretrained_model_name_or_path, fileName, options) {
async function constructSessions(pretrained_model_name_or_path, names, options) {
return Object.fromEntries(await Promise.all(
Object.keys(names).map(async (name) => {
const { buffer, session_options, session_config } = await getSession(pretrained_model_name_or_path, names[name], options);
const session = await createInferenceSession(buffer, session_options, session_config);
const { bufferOrPath, session_options, session_config } = await getSession(pretrained_model_name_or_path, names[name], options);
const session = await createInferenceSession(bufferOrPath, session_options, session_config);
return [name, session];
})
));
@@ -342,7 +385,7 @@ function validateInputs(session, inputs) {
missingInputs.push(inputName);
continue;
}
// NOTE: When `env.wasm.proxy is true` the tensor is moved across the Worker
// NOTE: When `onnx_env.wasm.proxy` is true, the tensor is moved across the Worker
// boundary, transferring ownership to the worker and invalidating the tensor.
// So, in this case, we simply sacrifice a clone for it.
checkedInputs[inputName] = isONNXProxy() ? tensor.clone() : tensor;
@@ -386,7 +429,14 @@ async function sessionRun(session, inputs) {
} catch (e) {
// This usually occurs when the inputs are of the wrong type.
console.error(`An error occurred during model execution: "${e}".`);
console.error('Inputs given to model:', checkedInputs);
// Do not log the full tensor data, as it may crash React Native
console.error(
'Inputs given to model:',
IS_BROWSER ? checkedInputs : Object.fromEntries(
Object.entries(checkedInputs)
.map(([key, tensor]) => [key, { dims: tensor.dims, type: tensor.type }])
)
);
throw e;
}
}
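
The core change in `getSession` is that on Node.js and React Native the model is no longer pulled into a `Uint8Array`: `getModelPath` returns an on-disk location and ONNX Runtime opens the file natively, which avoids holding very large model buffers in JS memory. A condensed sketch of the branching, with the imported helpers assumed to behave as shown in the diff:

import { apis } from './env.js';
import { getModelFile, getModelPath } from './utils/hub.js';
import { createInferenceSession } from './backends/onnx.js';

// Illustration only — the real getSession also handles external data files,
// WebGPU output locations, and per-device session options.
async function loadModel(repo, fileName, options) {
    const onDevice = apis.IS_NODE_ENV || apis.IS_REACT_NATIVE_ENV;
    const bufferOrPath = onDevice
        ? await getModelPath(repo, fileName, true, options)   // string path on disk
        : await getModelFile(repo, fileName, true, options);  // Uint8Array in memory
    // InferenceSession.create accepts either form, so one call site serves both.
    return createInferenceSession(bufferOrPath, options.session_options ?? {}, {});
}
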
7 changes: 5 additions & 2 deletions src/pipelines.js
@@ -73,6 +73,9 @@ import {
topk,
} from './utils/tensor.js';
import { RawImage } from './utils/image.js';
import {
fetchBinary
} from './utils/hub.js';


/**
Expand Down Expand Up @@ -2719,7 +2722,7 @@ export class TextToAudioPipeline extends (/** @type {new (options: TextToAudioPi
if (typeof speaker_embeddings === 'string' || speaker_embeddings instanceof URL) {
// Load from URL with fetch
speaker_embeddings = new Float32Array(
await (await fetch(speaker_embeddings)).arrayBuffer()
await (await fetchBinary(speaker_embeddings)).arrayBuffer()
);
}

@@ -3339,4 +3342,4 @@ async function loadItems(mapping, model, pretrainedOptions) {
}

return result;
}
}
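
Speaker embeddings for text-to-speech now go through `fetchBinary` from `./utils/hub.js` rather than a bare `fetch`, presumably so remote URLs, local paths, and React Native file URIs all use the same binary loader as model files. The pattern, shown standalone with `fetchBinary` assumed to return a Response-like object:

import { fetchBinary } from './utils/hub.js';

// Sketch, not the PR's code: load speaker embeddings as a Float32Array
// through the shared binary loader.
async function loadSpeakerEmbeddings(urlOrPath) {
    const response = await fetchBinary(urlOrPath);
    return new Float32Array(await response.arrayBuffer());
}
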
4 changes: 3 additions & 1 deletion src/tokenizers.js
@@ -38,6 +38,8 @@ import {
import { max, min, round } from './utils/maths.js';
import { Tensor } from './utils/tensor.js';

import { env } from './env.js';

import {
PriorityQueue,
TokenLattice,
@@ -2084,7 +2086,7 @@ class ByteLevelDecoder extends Decoder {
*/
convert_tokens_to_string(tokens) {
const text = tokens.join('');
const byteArray = new Uint8Array([...text].map(c => this.byte_decoder[c]));
const byteArray = Uint8Array.from([...text].map(c => this.byte_decoder[c]));
const decoded_text = this.text_decoder.decode(byteArray);
return decoded_text;
}
Expand Down