This repository was archived by the owner on Jan 22, 2025. It is now read-only.

solana-web3.js: Automatically batch requests made via JSON RPC #23627

Closed
epicfaace opened this issue Mar 13, 2022 · 12 comments
Labels
good first issue Good for newcomers

Comments

@epicfaace
Contributor

Problem

Many of the existing methods for interacting with JSON RPC don't support batching.

Proposed Solution

It would be nice to have an option that automatically batches requests made via JSON RPC. We could, for example, batch together all requests issued within a 100ms window.

@epicfaace epicfaace changed the title Automatically batch requests made via JSON RPC solana-web3.js: Automatically batch requests made via JSON RPC Mar 13, 2022
@t-nelson t-nelson added the good first issue Good for newcomers label Mar 14, 2022
@steveluscher
Contributor

I gave this a good think, and wrote down my thoughts here: https://twitter.com/steveluscher/status/1532443998488363008

I actually think that JSON-RPC 2.0 batching is a huge footgun, and might actually lead to worse application performance overall. Read the thread above and let me know what you think!

@epicfaace
Contributor Author

@steveluscher Interesting thread. I agree with it, but only in cases where the number of requests you're trying to batch is small enough. If you have many requests running at the same time, though, the benefits overcome the drawbacks and batching comes out way ahead.

Let me explain my use case: I had an app that would make 20-30 requests, all at the same time. Each request was simple and just fetched data, but running that many at once made things very slow, particularly because Google Chrome allows a maximum of 6 concurrent requests per host name. The best way to solve this was to batch.

Does this sound like a case for "yes, absolutely, we need that"?
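For reference, coalescing those 20-30 concurrent calls would mean sending a single JSON-RPC 2.0 batch payload, which is just a JSON array of request objects in one HTTP POST. A minimal sketch (the method name, account keys, and variable names here are illustrative, not tied to any specific app):

```typescript
// Illustrative sketch: with JSON-RPC 2.0 batching, many concurrent calls
// collapse into a single HTTP request whose body is one JSON array.
const addresses = ["addr1", "addr2", "addr3"]; // placeholder account keys

const batch = addresses.map((address, i) => ({
  jsonrpc: "2.0",
  id: i + 1, // unique id lets the caller match responses to requests
  method: "getAccountInfo",
  params: [address],
}));

// One POST body, consuming one connection slot instead of many.
const body = JSON.stringify(batch);
```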

@steveluscher
Contributor

Ah yes, I remember the 6 request limit. The thing with that limit is that it's not real.

https://codesandbox.io/s/sweet-surf-qu60i3?file=/src/index.js

[Attachment: Screen.Recording.2022-06-02.at.1.33.42.PM.mov]

I'd be interested to replicate the slowness you observed, to see if there isn't some other cause. If you have a repro that you can share with me, let's dig into it!

@epicfaace
Contributor Author

@steveluscher this intrigued me, so I investigated further, and aha, it is real!

https://codesandbox.io/s/friendly-dream-oercmt?file=/src/index.js


For your demo, the requests were too fast for humans to actually notice the 6-request concurrent limit. My reproduction, which uses 5-second requests, makes the limit clearer.

@steveluscher
Contributor

We're both right!


In your example, deelay.me serves files over the ancient HTTP/1.1. In my example, jsonplaceholder.com serves them over HTTP/3.

Now please, please, please, please, please, please, please, please, please, please, please, please, please, please tell me that the validator RPC isn't HTTP/1.1…

@steveluscher
Contributor

> Now please, please, please, please, please, please, please, please, please, please, please, please, please, please tell me that the validator RPC isn't HTTP/1.1…

Shit.

@steveluscher
Contributor

#25796 👀

@steveluscher
Contributor

Thanks for helping me discover that the Solana-hosted RPC nodes are running in HTTP/1.1 mode. Changing that is going to have a huge impact on the ecosystem overall.

With respect to this GitHub issue though, I don't think we're going to pull this into the core library. The good news, though, is that you can implement this as a custom fetch function!

Supply your own fetch function here:

const connection = new Connection(URL, {customFetch: myBatchingFetchFn});

The implementation of myBatchingFetchFn basically:

  • intercepts requests
  • coalesces requests received within a time window
  • makes requests in a single batch
  • fans the responses out
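The four steps above can be sketched roughly as follows. This is a minimal illustration, not web3.js API: `makeBatchingFetch`, `Transport`, and the time-window value are all assumptions, and the underlying transport is injected so the sketch stands on its own.

```typescript
// Hypothetical sketch of a batching fetch function: intercept requests,
// coalesce those arriving within a time window into one JSON-RPC 2.0
// batch, send them in a single round trip, then fan the responses out.
type Transport = (batchBody: string) => Promise<string>;

function makeBatchingFetch(transport: Transport, windowMs = 100) {
  let pending: Array<{
    request: any;
    resolve: (response: any) => void;
    reject: (err: unknown) => void;
  }> = [];
  let timer: ReturnType<typeof setTimeout> | null = null;

  async function flush() {
    const batch = pending;
    pending = [];
    timer = null;
    try {
      // One HTTP round trip carries every coalesced request.
      const raw = await transport(JSON.stringify(batch.map((p) => p.request)));
      const responses: any[] = JSON.parse(raw);
      const byId = new Map(responses.map((r) => [r.id, r]));
      // Fan each response back out to the caller that issued it.
      for (const p of batch) p.resolve(byId.get(p.request.id));
    } catch (err) {
      for (const p of batch) p.reject(err);
    }
  }

  return (request: any): Promise<any> =>
    new Promise((resolve, reject) => {
      pending.push({ request, resolve, reject });
      // First request in a window starts the coalescing timer.
      if (timer === null) timer = setTimeout(flush, windowMs);
    });
}
```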

@epicfaace
Contributor Author

@steveluscher amazing! What a find.

@devDesu

devDesu commented Oct 6, 2023

> Thanks for helping me discover that the Solana-hosted RPC nodes are running in HTTP/1.1 mode. Changing that is going to have a huge impact on the ecosystem overall.
>
> With respect to this GitHub issue though, I don't think we're going to pull this into the core library. The good news, though, is that you can implement this as a custom fetch function!
>
> Supply your own fetch function here:
>
> const connection = new Connection(URL, {customFetch: myBatchingFetchFn});
>
> The implementation of myBatchingFetchFn basically:
>
>   • intercepts requests
>   • coalesces requests received within a time window
>   • makes requests in a single batch
>   • fans the responses out

@steveluscher could you please share the fn implementation?
Btw, how do you manage the RPC rate limit when using that fn? In my project I have to wait manually between RPC requests to prevent rate-limit errors.
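A minimal version of the manual-wait approach described above might look like the following. The names `throttled` and `sleep` and the fixed-delay strategy are illustrative assumptions, not anything from web3.js or this thread's code:

```typescript
// Hypothetical sketch: serialize RPC calls with a fixed delay between
// them, so the request rate stays under a provider's rate limit.
const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

async function throttled<T>(
  tasks: Array<() => Promise<T>>,
  delayMs: number
): Promise<T[]> {
  const results: T[] = [];
  for (const task of tasks) {
    results.push(await task()); // run one request at a time
    await sleep(delayMs); // wait before the next to avoid rate-limit errors
  }
  return results;
}
```

A token-bucket limiter would allow short bursts while keeping the same average rate, but the fixed delay above is the simplest form of the workaround described in the comment.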

@steveluscher
Contributor

I have nothing to share @devDesu – I would have to write it from scratch! If you need such a thing, I'm sure that you can find a community of builders to collaborate with you on it.

@huybuidac

Maybe too late, but you can use connection.getMultipleAccounts.
Or, if you still want to fetch each account independently while automatically collecting requests over a short window into a batch, you can use solana-batch-requests.
