
Add send_future_batch to ComponentLink #742

Closed
hgzimmerman opened this issue Nov 14, 2019 · 7 comments

@hgzimmerman
Member

Description

I'm submitting a feature request

Similar to send_back_batch, which takes a closure that returns a Vec<Msg> and re-renders if any of those messages produces true as its ShouldRender value, an async variant might be a useful complement to send_future.

One application is joining a series of fetch requests together and re-rendering only once all of them finish.

I currently have code in my project that looks like this:

    fn mounted(&mut self) -> bool {
        // This will always cause 2 updates to the component.
        let fetch = fetch_to_state_msg(&GetPublicBuckets, Msg::FetchedPublicBuckets);
        self.link.send_future(fetch);

        let fetch = fetch_to_state_msg(&GetParticipatingBuckets, Msg::FetchedUserBuckets);
        self.link.send_future(fetch);

        false
    }

What I would really like is:

    fn mounted(&mut self) -> bool {
        // This will always cause one update to the component,
        // but all futures need to complete first.
        let fetch = futures::future::join_all(vec![
            fetch_to_state_msg(&GetPublicBuckets, Msg::FetchedPublicBuckets),
            fetch_to_state_msg(&GetParticipatingBuckets, Msg::FetchedUserBuckets),
        ]);

        self.link.send_future_batch(fetch);

        false
    }

This would allow everything to be loaded and displayed at once initially, while still letting each Msg be reused individually to update parts of the component independently once it enters its normal update loop.


What might be tangentially worth exploring is a "thunkable" future: you register multiple futures with it, and after a defined delay following the first resolution, it batches any other futures that have resolved up to that point and sends them to the component to be applied at once, repeating until all registered futures are accounted for.

This could be used to manage fetching 8 things at once, where the first 6 resolve quickly and can be rendered together, but the remaining 2 take a while to finish. A mechanism like this would let a developer using Yew minimize re-renders without having to think about expected fetch timings, in exchange for 50-100ms of latency spent waiting for other futures to resolve within that window.
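For illustration only, here is a rough sketch of how such a batching window could work. Nothing like this exists in Yew; it assumes the futures 0.3 combinators and gloo-timers (with its futures feature) for a wasm-friendly delay, and batch_within_window, window_ms, and deliver are made-up names:

    use std::future::Future;
    use futures::future::{self, Either};
    use futures::stream::{FuturesUnordered, StreamExt};
    use gloo_timers::future::TimeoutFuture;

    /// Resolve `tasks` and hand their outputs to `deliver` in batches: once the
    /// first task of a batch finishes, anything else that finishes within
    /// `window_ms` is grouped with it, until every task is accounted for.
    async fn batch_within_window<F, T>(
        tasks: Vec<F>,
        window_ms: u32,
        mut deliver: impl FnMut(Vec<T>),
    ) where
        F: Future<Output = T>,
    {
        let mut pending: FuturesUnordered<F> = tasks.into_iter().collect();
        while let Some(first) = pending.next().await {
            let mut batch = vec![first];
            // Grace window: keep draining whatever resolves before the timeout.
            let mut deadline = Box::pin(TimeoutFuture::new(window_ms));
            loop {
                match future::select(pending.next(), deadline.as_mut()).await {
                    Either::Left((Some(item), _)) => batch.push(item),
                    // Window elapsed, or every remaining future has resolved.
                    _ => break,
                }
            }
            deliver(batch);
        }
    }

The deliver callback would then forward each batch through a cloned ComponentLink (or the send_future_batch proposed here).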

@jstarry
Member

jstarry commented Nov 16, 2019

Thanks @hgzimmerman! This is actually making me realize that we didn't need to add send_back_future to the main framework. It's just a nice wrapper around the existing ComponentLink API. Now that ComponentLink implements Clone, we can just move the link into the future.

What do you think about adding send_future and send_future_batch to a middleware-type crate? Maybe add them to yewtil?

@hgzimmerman
Member Author

I think that's reasonable.

@hgzimmerman
Member Author

I think once much of the current Readme's content is moved into external documentation, a section could be added listing crates related to Yew, where yewtil, the router, various templates, and other supported ecosystem crates people come up with could be featured.

I'll try to migrate the current futures functionality to yewtil in a couple of days, at which point I'll open a PR removing it from this project.

@hgzimmerman
Member Author

Actually, I think a send_self_batch method is needed on ComponentLink in order to implement send_future_batch externally, because access to the underlying scope isn't provided.
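As a rough sketch of what that external helper could look like, assuming ComponentLink gains the send_self_batch method described above, that it is Clone, and that spawn_local from wasm-bindgen-futures drives the future (the function name and bounds here are illustrative, not an actual yewtil API):

    use std::future::Future;
    use wasm_bindgen_futures::spawn_local;
    use yew::prelude::{Component, ComponentLink};

    /// Spawn `future`; when it completes, deliver all of the resulting messages
    /// in a single batch so the component only re-renders once.
    fn send_future_batch<COMP, F>(link: ComponentLink<COMP>, future: F)
    where
        COMP: Component,
        F: Future<Output = Vec<COMP::Message>> + 'static,
    {
        spawn_local(async move {
            let mut link = link;
            let messages = future.await;
            // send_self_batch is the hypothetical batch variant discussed above.
            link.send_self_batch(messages);
        });
    }

Without something like send_self_batch, the wrapper has no way to deliver multiple messages in one go through the public ComponentLink surface.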

@jstarry
Member

jstarry commented Nov 18, 2019 via email

@jstarry
Member

jstarry commented Nov 28, 2019

@hgzimmerman can this be closed?

@jstarry added the feature-request and won't fix labels Nov 28, 2019
@hgzimmerman
Member Author

Yes, this can be closed.
