Many queued files can make Zeronet stop responding? #2161
Unfortunately it's a limitation in the browser: browsers only allow 6-8 concurrent connections per domain. So if you have that many pending file downloads, the browser will not start loading new requests.
@HelloZeroNet What's the best way to load a lot of images then? Should I:
?
Yeah that should work for many files.
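To illustrate the batching approach discussed above, here is a minimal sketch of a zite-side loader that keeps the number of in-flight requests under the browser's ~6 per-domain cap. The helper name `runBatched` and the batch size are illustrative, not part of ZeroNet's API:

```javascript
// Run async tasks with at most `limit` in flight at once, by processing
// them in sequential batches. Each task is a function returning a Promise
// (e.g. () => fetch(url)), so nothing starts before its batch is reached.
async function runBatched(tasks, limit = 4) {
  const results = [];
  for (let i = 0; i < tasks.length; i += limit) {
    const batch = tasks.slice(i, i + limit);
    // Wait for the whole batch to settle before starting the next one,
    // so no more than `limit` connections are ever open to the domain.
    results.push(...await Promise.all(batch.map((task) => task())));
  }
  return results;
}
```

A zite could then call, for example, `runBatched(imageUrls.map((u) => () => fetch(u)), 4)` instead of firing all image requests at once.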
Step 1: Please describe your environment
Step 2: Describe the problem:
"I made the mistake of trying to seed all small and big files on a board that apparently is mostly 0 peers. I've been stuck "updating 14000 files" for a week now and it's very unresponsive. How do I cancel trying to sync all those files?" source: millchan
And some time ago I had a similar impression: with too many (1000+) files queued and actively being downloaded (or at least attempted), possibly on multiple zites simultaneously, ZeroNet seemed to stop responding to other requests (such as loading a webpage). Can this be true? If so, can you improve this so it does not affect site loading and operation?
PS: maybe the cause of the lag/unresponsiveness is not the queued files but the way images are loaded on the site I linked above. ZeroNet stops loading sites, and when I browse my open tabs I sometimes see that the zite linked above has a page where many images did not load (instead of an image thumbnail there is a spinner indicating loading); when I close that tab, ZeroNet starts responding again. I always had the impression it has to do with some zite function that is written wrongly and hangs. Or the function should have some timeout.
If the problem is the browser limit, I have the following in the Firefox "about:config" page:
network.http.max-persistent-connections-per-proxy - 32
network.http.max-persistent-connections-per-server - 6
network.http.max-urgent-start-excessive-connections-per-host - 3
so I have appended a 0 to the end of each value in the hope of removing this lag. Also:
dom.workers.maxPerDomain - 512
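For reference, the same overrides can be made persistent via a `user.js` file in the Firefox profile directory. This is a sketch using the connection-limit values above with a trailing 0 appended, as described; these are real Firefox preferences, but raising them is not guaranteed to fix the lag, since ZeroNet's own request handling could also be the bottleneck:

```javascript
// user.js in the Firefox profile folder -- applied at startup.
// Original values 32, 6, 3 with a 0 appended, as tried above.
user_pref("network.http.max-persistent-connections-per-proxy", 320);
user_pref("network.http.max-persistent-connections-per-server", 60);
user_pref("network.http.max-urgent-start-excessive-connections-per-host", 30);
```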