1 second delay when starting wasmer run #3962

Closed
Michael-F-Bryan opened this issue Jun 6, 2023 · 1 comment · Fixed by #3983
Labels: 🎉 enhancement (New feature!), priority-medium (Medium priority issue)

@Michael-F-Bryan (Contributor)

Motivation

Every time we run wasmer run, there seems to be a 500-1000ms delay before the WASI instance starts.

I believe this is because wasmer_wasix::runtime::resolver::WapmSource queries WAPM on every invocation, but I'll double-check the logs.

Proposed solution

The best way to avoid hitting the WAPM backend on every startup is to cache the results of queries made in the past. We could cache either the raw GraphQL responses or the higher-level answer to "which versions satisfy some/package@^1.0?".
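As a rough illustration of the second option, something like the sketch below could sit in front of the registry lookup and only go to the network on a miss or when an entry has expired. This is a minimal, hypothetical example; the type and method names are invented and not tied to Wasmer's actual WapmSource API, and a real implementation would persist entries to disk rather than keep them in memory.

```rust
use std::collections::HashMap;
use std::time::{Duration, SystemTime};

/// Hypothetical cache entry: the raw response plus when it was fetched.
struct CachedResponse {
    fetched_at: SystemTime,
    body: String,
}

/// Toy in-memory cache keyed by the query string. A real implementation
/// would likely persist entries under Wasmer's cache directory so they
/// survive across `wasmer run` invocations.
struct QueryCache {
    ttl: Duration,
    entries: HashMap<String, CachedResponse>,
}

impl QueryCache {
    fn new(ttl: Duration) -> Self {
        QueryCache { ttl, entries: HashMap::new() }
    }

    /// Return the cached response body if it is still fresh.
    fn get(&self, query: &str) -> Option<&str> {
        let entry = self.entries.get(query)?;
        let age = entry.fetched_at.elapsed().ok()?;
        (age < self.ttl).then(|| entry.body.as_str())
    }

    /// Store a freshly fetched response.
    fn insert(&mut self, query: String, body: String) {
        self.entries.insert(
            query,
            CachedResponse { fetched_at: SystemTime::now(), body },
        );
    }
}
```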

Additional context

This delay is causing a fair amount of flakiness in Wasmer's CI because the run_test_caching_works_for_packages_with_versions test relies on measuring a command's run time to infer whether we hit the cache or not.

Michael-F-Bryan added the 🎉 enhancement (New feature!) label on Jun 6, 2023
@theduke (Contributor) commented Jun 6, 2023

I'd cache the result of package resolution itself, i.e. a lockfile.

We need a somewhat stable format for this anyway, so might as well define that now.
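For illustration only, a lockfile along these lines could record which concrete version each requested specifier resolved to. The field names and layout here are invented for the sketch, not an agreed format for Wasmer:

```rust
use std::collections::BTreeMap;

use serde::{Deserialize, Serialize};

/// Hypothetical lockfile contents: maps a requested specifier
/// (e.g. "some/package@^1.0") to the exact package it resolved to.
#[derive(Debug, Serialize, Deserialize)]
struct Lockfile {
    /// Bumped whenever the on-disk format changes incompatibly.
    format_version: u32,
    packages: BTreeMap<String, LockedPackage>,
}

#[derive(Debug, Serialize, Deserialize)]
struct LockedPackage {
    /// Exact version the constraint resolved to, e.g. "1.2.3".
    version: String,
    /// Where the package contents were downloaded from.
    url: String,
    /// Checksum so a stale or corrupted entry can be detected.
    sha256: String,
}
```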
