bundler being unresponsive after first userOp #181
@stefanodecillis Which version are you using here? Could you please try … and send me the output? Remember to replace the …
Thanks for the help @zsluedem! Also, I'm using the bundler in unsafe mode.
Hello @stefanodecillis, thanks for reporting that. Does this happen on every run or just randomly?
Hi @Vid201, 100% of the time the bundler stops answering after handling one user operation; I need to restart it to make it work with another one. As for the curl call, that was the first time I called it and it also never worked: it kept waiting for a response. My guess is that it could be a problem with the Docker image. Do you ever deploy the bundler remotely with Docker? If so, did you ever experience the same?
@stefanodecillis Which Docker image are you using? Could you docker inspect your container and share more info about the version?
Sorry for the delayed response @zsluedem! Here you can see the inspection:
I'm actually using the default Dockerfile via the Makefile. The only change I made was running the binaries with the args. I will also add that, when it gets stuck, a GET request on the root path (which should return the message "Used HTTP Method is not allowed. POST or OPTIONS is required") also loads indefinitely and eventually times out. The image I'm using: rust:1.65-slim
Found the reason why the bundler is not responsive: DashMap is causing the deadlock. These two lines … Related resource: …
@stefanodecillis A fix is already merged. Would you like to give it a try? Feel free to reopen this if you have any problems.
@zsluedem thanks for your help! It actually works :) If you send 3-4 user operations in a short time, it will still get stuck and you will see the same behavior described in this issue. However, thanks to your fix, it now gets unstuck by itself within a few seconds and you can continue sending user operations.

Given that, I'm pointing out that the core problem could still exist, causing several other issues. For instance, while the bundler is unresponsive, you cannot send user operations to the mempool since the entire stack is frozen. In that case, even with many bundlers connected to the same altpool, we would just shift the problem.

Now, this is just my guess and I should actually read the code: could it be that, while bundling, it waits on block creation without spawning a task with tokio? I'm wondering if that is the same situation that led to the deadlock you solved, since in both cases the RPC (HTTP) stack was blocked.
@stefanodecillis Thanks for your report! You are really helping here!
Is the stall between user operations very noticeable, like 1-2 seconds or more?
Are you running the silius components separately? Like running …
I don't think bundling is the reason, based on your description above. I think it is more likely that the mempool has a data-race problem when it receives several user operations at the same time. But I will take some time to dig in and find out what the real problem is.
@zsluedem I see your point!
Besides this change, I'm using the Dockerfile in the root folder (and also exposed the port for RPC). To give you the real scenario: a colleague sent 3-4 transactions in a short timeframe and then the bundler became unresponsive. When he told me, I had time to watch the logs and try to ping it myself, and it was not replying. That's why I think some job/handler on the same thread that handles the RPC methods gets stuck.
Hi,
I hope this problem is only on my side!
I deployed the Docker image and it becomes unresponsive after handling one or two userOps. The log trace stops without any error, even though the application does not halt.
The behavior is the same on different machines.
A typical UserOp I sent to the bundler is:
{"method":"eth_sendUserOperation","params":[{"sender":"0x5481E5F531702E8fb0bCB7c98cBdb22e814F6Acd","nonce":"0x21","initCode":"0x","callData":"0x8d44ad620000000000000000000000002b87f2390e4aef1fc961027982804650fadfaf4000000000000000000000000000000000000000000000000000000000000000a0b08d656e32fc45f4c913e44936538785704ee65d06c3a86ef0b44238cca4b1db0000000000000000000000000000000000000000000000000000000064cccf6900000000000000000000000000000000000000000000000000000000000001e00000000000000000000000000000000000000000000000000000000000000104a5efb2350000000000000000000000005481e5f531702e8fb0bcb7c98cbdb22e814f6acd0000000000000000000000000000000000000000000000000000000000000040000000000000000000000000000000000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000200000000000000000000000007a8a552e1305a631e2e1b44ba67b5d45a1a30497000000000000000000000000000000000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000600000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000","callGasLimit":"0x0222c0","verificationGasLimit":"0x0249f0","preVerificationGas":"0xea60","maxFeePerGas":"0x3b9aca10","maxPriorityFeePerGas":"0x3b9aca00","paymasterAndData":"0x","signature":"0xbcaf5133b0ad85ada7295b966071b624df0f69782ac6e0729197ffdcd079220a010e04e85ef22b045c5e1b2f0eb863667af297e3c91dd5e91a7ab4710a9a67ca1b"},"0x5FF137D4b0FDCD49DcA30c7CF57E578a026d2789"],"id":46,"jsonrpc":"2.0"}
The bundler correctly handles the userOp and sends the transaction on chain. After the first one, it stops without any error log and becomes unresponsive.
Did you ever experience the same?
The only changes I made to the Dockerfile are: …
Thanks in advance for the help!