
Increase file upload limit to 100 MB #77117

Merged
balloob merged 15 commits into dev from feat/increase_max_size_upload
Nov 30, 2022

Conversation

@marvin-w (Contributor) commented Aug 21, 2022

Proposed change

As discussed with balloob, this PR increases the file upload limit of the new file upload API from 10 MB to 100 MB. It also switches to a streaming upload: the data is read in 1 MB chunks (when available) and each chunk is written to the file.
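As a rough illustration of the chunked streaming described above (a stdlib sketch with hypothetical names, not the actual PR code, which reads from the multipart request body rather than a plain file object):

```python
import asyncio

ONE_MEGABYTE = 1024 * 1024


async def stream_to_file(source, target, chunk_size=ONE_MEGABYTE):
    """Copy an upload to disk in chunks instead of buffering it all in RAM.

    `source.read` and `target.write` are blocking file-like calls here,
    so both run in the default executor to keep the event loop free.
    """
    loop = asyncio.get_running_loop()
    written = 0
    while True:
        chunk = await loop.run_in_executor(None, source.read, chunk_size)
        if not chunk:  # EOF
            break
        await loop.run_in_executor(None, target.write, chunk)
        written += len(chunk)
    return written
```

With a 100 MB limit this keeps peak memory near one chunk (about 1 MB) rather than the size of the whole upload.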

Things that we should check as well:

  • Are there any web servers (such as nginx) in front of HA in any of the available installation methods that might need client_max_body_size adjusted?
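For reference, a reverse proxy in front of Home Assistant would need something along these lines (illustrative nginx snippet; the exact server block depends on the installation):

```nginx
server {
    # ...
    location / {
        proxy_pass http://homeassistant:8123;
        # Allow uploads up to the new Home Assistant limit.
        client_max_body_size 100M;
    }
}
```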

Type of change

  • Bugfix (non-breaking change which fixes an issue)
  • Dependency upgrade
  • New integration (thank you!)
  • New feature (which adds functionality to an existing integration)
  • Deprecation (breaking change to happen in the future)
  • Breaking change (fix/feature causing existing functionality to break)
  • Code quality improvements to existing code or addition of tests

Additional information

  • This PR fixes or closes issue: fixes #
  • This PR is related to issue:
  • Link to documentation pull request:

Checklist

  • The code change is tested and works locally.
  • Local tests pass. Your PR cannot be merged unless tests pass
  • There is no commented out code in this PR.
  • I have followed the development checklist
  • The code has been formatted using Black (black --fast homeassistant tests)
  • Tests have been added to verify that the new code works.

If user exposed functionality or configuration variables are added/changed:

If the code communicates with devices, web services, or third-party tools:

  • The manifest file has all fields filled out correctly.
    Updated and included derived files by running: python3 -m script.hassfest.
  • New or updated dependencies have been added to requirements_all.txt.
    Updated by running python3 -m script.gen_requirements_all.
  • For the updated dependencies - a link to the changelog, or at minimum a diff between library versions is added to the PR description.
  • Untested files have been added to .coveragerc.

The integration reached or maintains the following Integration Quality Scale:

  • No score or internal
  • 🥈 Silver
  • 🥇 Gold
  • 🏆 Platinum

To help with the load of incoming pull requests:

@probot-home-assistant

Hey there @home-assistant/core, mind taking a look at this pull request as it has been labeled with an integration (file_upload) you are listed as a code owner for? Thanks!
(message by CodeOwnersMention)

@marvin-w marvin-w force-pushed the feat/increase_max_size_upload branch from dc96865 to 1facaed on August 31, 2022 21:22
@pvizeli (Member) commented Sep 1, 2022

Another way to avoid jumping into a thread 100 times is to use a non-blocking deque (from collections): write each chunk into the queue and let the thread write it out, so there would be only one context switch.

or:
https://github.com/aio-libs/janus

@marvin-w (Contributor, Author) commented Sep 1, 2022

Another way to avoid jumping into a thread 100 times is to use a non-blocking deque (from collections): write each chunk into the queue and let the thread write it out, so there would be only one context switch.

or: https://github.com/aio-libs/janus

The point of a streaming file upload is to avoid writing the whole content to RAM first. Yes, we could do this, but it would increase RAM usage for larger files, and then we might as well revert the whole streaming approach.

@pvizeli (Member) commented Sep 1, 2022

Correct. With Janus you stream the data into the thread, so you don't have a context switch and a wait for the thread after each chunk. You can also limit the buffer, so the async side waits until the thread has processed the buffered data.
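The pattern described here (one long-lived writer thread fed through a bounded queue, which is what janus provides) can be approximated with the stdlib alone; the names below are illustrative, not from the PR:

```python
import asyncio
import queue
import threading


def _writer(chunk_queue, target):
    """Single worker thread: drain the queue and write until the sentinel."""
    while True:
        chunk = chunk_queue.get()
        if chunk is None:
            break
        target.write(chunk)


async def stream_via_queue(chunks, target, max_buffered=4):
    """Feed chunks to one writer thread through a bounded queue.

    The bounded queue applies backpressure: the blocking put (run in the
    executor so the event loop stays responsive) waits while the buffer
    is full, so memory use is capped at max_buffered chunks.
    """
    chunk_queue = queue.Queue(maxsize=max_buffered)
    writer = threading.Thread(target=_writer, args=(chunk_queue, target))
    writer.start()
    loop = asyncio.get_running_loop()
    try:
        for chunk in chunks:
            await loop.run_in_executor(None, chunk_queue.put, chunk)
    finally:
        await loop.run_in_executor(None, chunk_queue.put, None)  # sentinel
        await loop.run_in_executor(None, writer.join)
```

Unlike hopping into the executor per chunk, the writer thread here is created once and kept busy for the whole upload, which is the "only one context switch" point being made.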

@marvin-w marvin-w force-pushed the feat/increase_max_size_upload branch from 0c3e795 to 669b8e2 on October 14, 2022 12:04
@marvin-w marvin-w added and then removed the second-opinion-wanted label on Oct 14, 2022
@balloob (Member) commented Oct 14, 2022

@marvin-w you can still achieve that with janus by calling join() on the Janus queue when it gets too full before adding more files.

https://github.com/aio-libs/janus/blob/master/janus/__init__.py#L602

@marvin-w marvin-w force-pushed the feat/increase_max_size_upload branch from 2138171 to 3f139e5 on October 14, 2022 19:07
@marvin-w (Contributor, Author) commented

@marvin-w you can still achieve that with janus by calling join() on the Janus queue when it gets too full before adding more files.

https://github.com/aio-libs/janus/blob/master/janus/__init__.py#L602

I gave it a shot, and it is working. However, I'm not 100% sure it is correct like that; the future handling in particular looks kind of dirty. Open to suggestions :).

@balloob balloob merged commit 1908fea into dev Nov 30, 2022
@balloob balloob deleted the feat/increase_max_size_upload branch November 30, 2022 01:46
@github-actions github-actions bot locked and limited conversation to collaborators Dec 1, 2022

5 participants