base 64 image - backend #105
Answered by vladmandic
juancruzsosagab asked this question in Q&A
Hi @vladmandic or someone!
Answered by vladmandic, May 4, 2022
Replies: 1 comment 1 reply
something like this?

```js
const tf = require('@tensorflow/tfjs-node');

function b64toTensor(b64string) {
  const img = Buffer.from(b64string, 'base64');
  const hwc = tf.node.decodeJpeg(img); // decode jpeg data (or use different methods for png, gif, etc.) to a tensor in hwc format
  const nhwc = hwc.expandDims(0); // expand hwc to nhwc so faceapi can use it directly
  tf.dispose(hwc);
  return nhwc; // don't forget to dispose the tensor once faceapi is done
}
```

alternatively, you can use node-canvas with the same `img` variable and then pass the `canvas` object to faceapi, for example:

```js
const canvas = require('canvas');

async function b64toCanvas(b64string) {
  const img = Buffer.from(b64string, 'base64');
  const input = await canvas.loadImage(img); // decode the buffer into an image
  const output = canvas.createCanvas(input.width, input.height);
  const ctx = output.getContext('2d');
  ctx.drawImage(input, 0, 0, input.width, input.height);
  return output;
}
```
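As a small aside (not part of the original answer): `tf.node.decodeJpeg` will throw on non-JPEG bytes, so it can help to sanity-check the decoded buffer first. A minimal Node-stdlib sketch, with hypothetical helper names `b64toBuffer` and `looksLikeJpeg`:

```javascript
// Decode a base64 string to a Buffer and check for the JPEG SOI marker
// (0xFF 0xD8) before handing the bytes to tf.node.decodeJpeg.
// Helper names here are illustrative, not from the original answer.
function b64toBuffer(b64string) {
  // Strip an optional data-URL prefix such as "data:image/jpeg;base64,"
  const raw = b64string.replace(/^data:image\/\w+;base64,/, '');
  return Buffer.from(raw, 'base64');
}

function looksLikeJpeg(buf) {
  // JPEG streams always begin with the SOI marker 0xFF 0xD8
  return buf.length >= 2 && buf[0] === 0xff && buf[1] === 0xd8;
}
```

This also covers the common case where the frontend sends a data URL rather than the bare base64 payload.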
Answer selected by vladmandic