This repository has been archived by the owner on May 31, 2021. It is now read-only.

Running Flood with >8k torrents locks rTorrent in IO loop #320

Open
jfurrow opened this issue Apr 23, 2017 · 11 comments

@jfurrow
Member

jfurrow commented Apr 23, 2017

A user wrote:

I tried it with about 8k, the response of /api/client/torrents was about 7 MB. It also seems to have put itself in a loop of checking the status of every file (a very long operation for me), which completely locked up rtorrent until I killed the server.
Edit: Just to be clear, I was never even able to reach the overview page. 'Flood Settings' had yet to complete when I gave up and killed the server.
Edit 2: Even with the server gone rtorrent is still locked up :(

@kannibalox

loop of checking the status of every file

By this I mean that, looking at the strace, it was polling every file with stat, similar to a save_session call. This is synchronous and takes a long time for me (more than a minute), long enough to drop peer connections. Watching the strace, I would see a scan finish, a short window where it was responsive, then the next scan would kick off. rTorrent never crashed; it just never got out of the IO deadlock long enough to do anything useful. Luckily the XMLRPC tool I use hangs until it's able to send the command, so I issued a quit and played the waiting game.

Also, I did confirm my install worked fine for an instance with a sane number of torrents.
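For reference, here is a rough sketch (not Flood's code) of sending one blocking XML-RPC call to rTorrent over its SCGI socket from Node, the same kind of call my tool issues. The socket path is a placeholder, and system.client_version is just a harmless probe to check whether rTorrent is responsive again; substitute whatever command you actually need to send.

const net = require('net');

// Build a minimal XML-RPC method call body.
const body =
  '<?xml version="1.0"?>' +
  '<methodCall><methodName>system.client_version</methodName><params/></methodCall>';

// SCGI framing: a netstring of NUL-separated headers, followed by the XML body.
const headers = `CONTENT_LENGTH\x00${Buffer.byteLength(body)}\x00SCGI\x001\x00`;
const request = `${Buffer.byteLength(headers)}:${headers},${body}`;

// Connect to rTorrent's local SCGI socket (path is a placeholder) and print the reply.
const socket = net.connect('/path/to/rtorrent.sock', () => socket.write(request));
socket.on('data', (chunk) => process.stdout.write(chunk.toString()));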

@jfurrow
Member Author

jfurrow commented Apr 24, 2017

Thanks for the additional information! I'll look into this.

@jfurrow jfurrow changed the title Running Flood with 8k torrents crashes rTorrent Running Flood with >8k torrents locks rTorrent in IO loop Apr 24, 2017
@jfurrow jfurrow added the bug label Apr 24, 2017
@kannibalox

kannibalox commented Apr 28, 2017

Is there any way to trace the XMLRPC calls that are sent to rtorrent? I'm not too familiar with nodejs, but I'd be happy to perform any troubleshooting, even if it means applying a patch file or something.

@jfurrow
Member Author

jfurrow commented May 3, 2017

@kannibalox Sorry for the delay, I totally forgot to reply to you.

You can see the XMLRPC calls that are sent to rTorrent by checking out the contents of this file: https://github.com/jfurrow/flood/blob/master/server/util/scgi.js. The XML is generated in that file and stored in the variable xml.

So you could write console.log(xml); to see it in the Node server's output, or, if the XML is particularly large, you can append the value to a file and inspect its contents with a text editor. This snippet should work for that:

const fs = require('fs');
// Pass a callback so write errors surface instead of throwing on newer Node versions.
fs.appendFile('/path/to/file', xml, (err) => { if (err) console.error(err); });

Specifically, line 39 of that file is where the request is sent to rTorrent.
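If a lot of requests end up in the same file, a small helper like this (a rough sketch; the log path is just a placeholder, not part of Flood) prefixes each dump with a timestamp and byte size so the large multicalls stand out:

const fs = require('fs');

// Append each outgoing request with a timestamp and size so individual
// calls can be told apart in the dump file.
function dumpRequest(xml) {
  const header = `\n--- ${new Date().toISOString()} (${Buffer.byteLength(xml)} bytes) ---\n`;
  fs.appendFile('/tmp/flood-xmlrpc.log', header + xml, (err) => {
    if (err) console.error('Failed to write XML-RPC dump:', err);
  });
}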

@jfurrow
Member Author

jfurrow commented May 3, 2017

@kannibalox It just dawned on me that this bug is probably caused by requesting d.free_diskspace= for every torrent... I'm going to feel really dumb if this is the case.

If you wouldn't mind testing this for me, I'd be super grateful. Try commenting out two lines in this file: https://github.com/jfurrow/flood/blob/master/shared/constants/torrentGeneralPropsMap.js (yes, this file is messy AF; I'm working on cleaning it up right now): lines 49 and 120.

You'll need to kill and restart the Flood server for your changes to take effect.

There might be other properties that cause I/O here also...
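For context, here is a rough sketch of the shape of the per-poll multicall (the view name and property list are placeholders, not the exact request Flood builds). Every accessor in the list runs once per torrent inside rTorrent's single synchronous thread, so a property that touches the filesystem multiplies into thousands of disk calls at 8k torrents.

// Hypothetical example of the per-poll multicall; older rTorrent builds use
// d.multicall with the view as the first argument instead of d.multicall2.
const methodCall = {
  methodName: 'd.multicall2',
  params: ['', 'main', 'd.hash=', 'd.name=', 'd.size_bytes=', 'd.free_diskspace='],
};
console.log(methodCall);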

@romancin

I made the change you proposed and saw the same behavior.

@romancin

romancin commented Oct 8, 2017

Hello, any news on this?

@KyleSanderson

Indeed, I'm fed up with ruTorrent being a pile of...

Things needed from even just reading the issue tracker:
Scheduled removals (#371).
Move data actually working (#581).
Scalability without locking up rTorrent for minutes on end.

@kannibalox

#581 and #371 could probably be done given unlimited time from the volunteer who maintains this project, but "scalability without locking up rTorrent for minutes on end" is incredibly hard even with unlimited time.

@mikl0s

mikl0s commented Mar 5, 2019

Any news on this - almost 2 years now :)

@kannibalox

Any news on this - almost 2 years now :)

It's an extremely hard problem to solve; feel free to give it a try yourself. I'm more than willing to lock up my instances in pursuit of a fix if need be.
