Bug Report
Prerequisites
- [x] Can you reproduce the problem?
- [x] Are you running the latest version?
- [x] Are you reporting to the correct repository?
- [x] Did you perform a cursory search?
Yes. I found a gist that suggested reading the entire file into memory for each chunk splice :)
Description
Attempting to pass a Stream into `LargeFileUploadTask` as file content fails with:
`this.file.content.slice is not a function`
```javascript
const fileObject = {
  size,
  content: fileStream,
  name: targetFileName,
};
const uploadTask = await new LargeFileUploadTask(
  graphClient,
  fileObject,
  uploadSession
);
const response = await uploadTask.upload();
```
Steps to Reproduce
- Take the example https://github.com/microsoftgraph/msgraph-sdk-javascript/blob/HEAD/docs/tasks/LargeFileUploadTask.md
- Use `var readStream = fs.createReadStream(filename);` to get a file stream and pass that in as `content`.
Expected behavior: Stream flows and gets accumulated into chunks that get uploaded.
Actual behavior: Error stemming from the assumption that `content` is a Buffer.
Additional Context
Streams allow a container limited to 100 MB of RAM to upload a multi-GB file successfully; `LargeFileUploadTask` is where this breaks. Users of the SDK can't afford to load the entire file into RAM if they're building a multi-tenant web app.
I'm willing to contribute a solution if someone from the maintainers can help me out. (I have little experience using TypeScript, and typing a stream is difficult.)
Usage Information
SDK Version - [SDK version you are using]
- [x] Node (Check, if using Node version of SDK)
Node Version - 10+ (I used multiple)