Persistent Cache #1309
Comments
@yyx990803 Recently I found that dependency pre-bundling has been implemented through esbuild in the optimizer directory. Is this problem solved?
We started to migrate our monorepo to vite. Our biggest frontend project has around 2800 modules on load. Suggestion:
The idea of a persistent cache sounds great. It's been a while since the issue was created. What's the current status?
I ran into this problem again recently while migrating an old project from webpack to vite, and I wrote down the optimization steps I used. They don't fundamentally solve the problem, but they do speed things up quite a bit: https://carljin.com/vite-resolve-request-files-a-ton.html
Is there any discussion open for this issue? I hope to apply a service worker, as @AlonMiz mentioned, to do some pre-tasks with a more heuristic pre-bundling discovery algorithm and persist the results.
@AlonMiz, curious if you've improved your load performance in the interim -- we've got a project on a similar scale facing the same sorts of issues. Would love to compare notes!
Unfortunately, we've halted the vite migration from webpack. We couldn't find an intermediate solution for this. Developers preferred one slow load and fast refreshes over a fast first load and long refreshes.
Thanks for letting me know. Did HMR/Fast Refresh work for you at all? Curious because I'm currently betting big on optimizing my own app for HMR.
@davidwallacejackson yes, it did pretty well in most cases; some issues in edge cases, but nothing significant.
Hi guys, I found a couple of steps to optimize browser loading performance. They don't solve the problem at the root cause, but they speed things up a lot, if you want to quickly feel the difference after optimization. I recently migrated the aPaaS project from webpack to vite and optimized it according to these steps; here is the speed comparison: (screenshot)
The details of the optimization steps and the reasoning behind them: https://carljin.com/vite-resolve-request-files-a-ton.html
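One kind of tweak often described for this symptom is telling Vite explicitly which heavy dependencies to pre-bundle, so the browser fetches one optimized chunk instead of hundreds of internal modules. A minimal sketch, assuming `lodash-es` and `ant-design-vue` as example dependencies (placeholders, not necessarily what the linked post recommends):

```ts
// vite.config.ts — pre-bundle heavy dependencies up front so their internals
// are served as a single optimized file instead of hundreds of requests.
import { defineConfig } from 'vite'

export default defineConfig({
  optimizeDeps: {
    // Example package names only; list the dependencies your app actually imports.
    include: ['lodash-es', 'ant-design-vue'],
  },
})
```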
I successfully migrated a large project from webpack to vite. The biggest problem I faced is that some packages do not support ESM; you have to write compatibility code or re-fork and release. For example, ant-design-vue needed a vite plugin to solve the ESM problem in their code. It took me about 1 week to do the migration, and now I am very satisfied, because the HMR speed of vite is so dope; once you experience it, you can't forget it.
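A minimal sketch of what such a compatibility plugin can look like, assuming a hypothetical CJS-only specifier `some-cjs-only-entry` and a local ESM shim (both names are made up for illustration; the real fix depends on which import ant-design-vue trips on):

```ts
// esm-compat-plugin.ts — redirect a CJS-only import to a local ESM shim.
import { fileURLToPath } from 'node:url'
import type { Plugin } from 'vite'

export function esmCompatPlugin(): Plugin {
  return {
    name: 'esm-compat',
    enforce: 'pre',
    resolveId(source) {
      // 'some-cjs-only-entry' and the shim path are hypothetical placeholders.
      if (source === 'some-cjs-only-entry') {
        return fileURLToPath(new URL('./shims/some-cjs-only-entry.mjs', import.meta.url))
      }
      return null
    },
  }
}
```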
If anyone is interested, I opened this repo to play with this (POC), recently pushed and still not working: https://github.com/userquin/vite-dev-sw The service worker is being registered with the initial gist from @davidwallacejackson. EDIT: right now it also works with F5 and Ctrl + F5, so we can now play with the sw.
We can also inject some resources, such as precache entries, from the plugin. Right now we read them from a file; if we have the transform cache on disk, we can inject it into the sw.
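A rough sketch of the service-worker side of that idea, assuming the plugin injects the list of cached transform URLs into a `__PRECACHE_MANIFEST__` placeholder (the name and shape are made up; the actual vite-dev-sw repo may do this differently):

```ts
// dev-sw.ts — precache transformed module URLs and serve them cache-first.
// Compile with the "WebWorker" lib so ServiceWorkerGlobalScope is available.
declare const self: ServiceWorkerGlobalScope
declare const __PRECACHE_MANIFEST__: string[] // injected by the dev plugin (hypothetical)

const CACHE_NAME = 'vite-dev-transforms'

self.addEventListener('install', (event) => {
  // Warm the cache with every transformed module we already know about.
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(__PRECACHE_MANIFEST__)),
  )
})

self.addEventListener('fetch', (event) => {
  // Answer from the cache when possible; fall through to the dev server otherwise.
  event.respondWith(
    caches.match(event.request).then((hit) => hit ?? fetch(event.request)),
  )
})
```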
The slow initial page loads are still the main issue we have that impacts vite adoption. On some pages of our app there are 1900 module requests, and especially on Linux laptops it can take up to 30 seconds to do the initial load, even if no files were actually changed and it was just vite that was started again. For this reason we have kept webpack around, and some devs prefer using webpack, especially when they work mostly on the (Python) backend of the app, since then bundling only happens once and afterwards all page loads are fast. Any way to speed this up, whether it's using a persistent file cache, service workers, or something else, would be great.
If you're using React you can get a speed-up by using https://github.com/ArnaudBarre/vite-plugin-swc-react-refresh
Has anyone considered using websockets to get around the HTTP connection concurrency limits?
Always use HTTP/2.
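For context on the HTTP/2 suggestion: Vite's dev server can speak HTTP/2 once TLS is enabled (it stays on HTTP/1.1 over TLS when `server.proxy` is configured). One low-effort way to get a local certificate is the basic-ssl plugin; a sketch, assuming `@vitejs/plugin-basic-ssl` is installed:

```ts
// vite.config.ts
import { defineConfig } from 'vite'
import basicSsl from '@vitejs/plugin-basic-ssl'

export default defineConfig({
  // A self-signed certificate enables TLS, which lets the dev server use HTTP/2
  // and lifts the ~6-connections-per-origin limit of HTTP/1.1.
  plugins: [basicSsl()],
})
```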
@carl-jin less than 0.3s?
It's not just large projects that load slowly. My project has only two pages, a login page and a home page, with no more than 10 components, yet the first screen somehow takes 45 seconds to load. I'm stunned...
Hi everyone. Also, HTTP/1 is not the limitation: https://github.com/ArnaudBarre/esm-dev-server-limit
With RSC, an increasing number of npm packages will need transpilation. If an RSC app has 20 npm packages that need transpilation, then it seems to me that we'll need a cache, e.g. a global cache.
Let's see how this RSC move plays out in the ecosystem, but I'm confident that either someone will make a fast transformer for it, or a small cache on top of the RSC transformer will be good enough and easy to implement (or maybe the RSC architecture will not play nicely with Vite, but that's another issue).
👍 I agree. Although a global cache could be a pretty cool addition nevertheless: a global cache guarantees that Vite's dev speed stays instantaneous (library sizes increase while on-demand user code size doesn't). On a tangent: a global cache would allow us to aggressively transform npm packages, e.g. to remove CJS-ESM compatibility problems. But, I agree, let's see how things turn out.
For the RSC case, is the cache problem something like this? https://github.com/Shopify/hydrogen-v1/blob/v1.x-2022-07/packages/hydrogen/src/framework/plugins/vite-plugin-hydrogen-client-components-cache.ts
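The linked Hydrogen plugin is essentially an on-disk cache keyed by file contents that short-circuits an expensive transform. A stripped-down sketch of that pattern (the cache directory and the `expensiveTransform` stand-in are hypothetical):

```ts
// cached-transform-plugin.ts — cache expensive per-module transforms on disk.
import { createHash } from 'node:crypto'
import { promises as fs } from 'node:fs'
import path from 'node:path'
import type { Plugin } from 'vite'

const CACHE_DIR = path.resolve('node_modules/.cache/transform-cache')

// Stand-in for whatever costly per-module work (e.g. an RSC transform) is cached.
const expensiveTransform = async (code: string, _id: string): Promise<string> => code

export function cachedTransformPlugin(): Plugin {
  return {
    name: 'cached-transform',
    async transform(code, id) {
      if (!id.includes('node_modules')) return null // only cache dependency code
      // Key on both the id and the file contents so edits invalidate the entry.
      const key = createHash('sha256').update(id).update(code).digest('hex')
      const cacheFile = path.join(CACHE_DIR, `${key}.js`)
      try {
        return await fs.readFile(cacheFile, 'utf-8') // cache hit
      } catch {
        const result = await expensiveTransform(code, id)
        await fs.mkdir(CACHE_DIR, { recursive: true })
        await fs.writeFile(cacheFile, result)
        return result
      }
    },
  }
}
```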
That's unlikely to be a vite problem.
No need to defend it; the official site has already acknowledged that Vite's performance is currently poor and that they are working on optimizations. As soon as you use SCSS it gets laggy. You can try configuring a custom Element Plus theme: once SCSS is configured it becomes noticeably slow.
The lack of a persistent cache and annoying unnecessary rebuilds are my main frustrations with Vite in comparison with Parcel 2 (which has its own flaws too, but speed and unnecessary rebuilds are not among them).
Using Less or Sass in a project does significantly hurt bundler performance; Vite currently transforms each component individually, so as the number of components grows, performance is bound to degrade. But with two pages and no more than 10 components, it's very likely that something is misconfigured and can be optimized. I recently worked on two product projects, one using Less (Ant Design Vue) and one using Scss (Element Plus). The Less one took 45-50 seconds for both dev startup and the production build; the Scss one took about 14 seconds to start the dev server and nearly 50 seconds to build. I got the Less project's dev startup down to around 5s and its build down to 18s. The main changes were:
#1207 (comment)
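One frequent culprit in setups like these (a guess; the linked comment has the actual details) is injecting a heavy theme file into every component's style block via `additionalData`, which makes each transform recompile the whole theme. Keeping the injected file down to variables and mixins looks roughly like this (`src/styles/variables.scss` is an example path):

```ts
// vite.config.ts
import { defineConfig } from 'vite'

export default defineConfig({
  css: {
    preprocessorOptions: {
      scss: {
        // Inject only lightweight variables/mixins; anything here that emits
        // actual CSS is recompiled for every single component style block.
        additionalData: `@use "src/styles/variables.scss" as *;`,
      },
    },
  },
})
```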
Vite's current biggest performance bottleneck is the first page load with a large number of modules. This is mostly caused by congestion at the browser network layer, since every module results in a full HTTP GET request on a fresh server start.
To improve this, we can dump the in-memory transform cache to disk on server close, and "hydrate" the module graph on the next server start (the hydration should only fill in the cache for files with a still-fresh `mtimeMs`). The cache can be dumped as one file, so it should be relatively fast to deal with. This allows the server to directly send 304s even after a restart.
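A minimal sketch of that mechanism, assuming a simplified entry shape (`file`, `mtimeMs`, `code`, `map` are illustrative fields, not Vite's actual internal module-graph structure):

```ts
// persistent-transform-cache.ts — dump/hydrate a transform cache keyed by mtime.
import { promises as fs } from 'node:fs'
import path from 'node:path'

interface CacheEntry {
  file: string     // absolute path of the source file
  mtimeMs: number  // mtime recorded when the transform was cached
  code: string     // transformed code
  map?: string     // serialized source map, if any
}

const CACHE_FILE = path.resolve('node_modules/.vite/transform-cache.json')

// On server close: dump the whole in-memory transform cache as a single file.
export async function dumpCache(entries: CacheEntry[]): Promise<void> {
  await fs.mkdir(path.dirname(CACHE_FILE), { recursive: true })
  await fs.writeFile(CACHE_FILE, JSON.stringify(entries))
}

// On next server start: hydrate only entries whose source file mtime is unchanged.
export async function hydrateCache(): Promise<Map<string, CacheEntry>> {
  const fresh = new Map<string, CacheEntry>()
  let entries: CacheEntry[]
  try {
    entries = JSON.parse(await fs.readFile(CACHE_FILE, 'utf-8'))
  } catch {
    return fresh // no cache yet (or unreadable) -> start cold
  }
  await Promise.all(
    entries.map(async (entry) => {
      try {
        const stat = await fs.stat(entry.file)
        if (stat.mtimeMs === entry.mtimeMs) fresh.set(entry.file, entry)
      } catch {
        // source file was deleted -> drop the entry
      }
    }),
  )
  return fresh
}
```

Because the whole cache lives in a single file, the dump on close and the hydration on start amount to one read/write plus cheap `stat` calls per entry, and any still-fresh module can be answered with a 304 right after the restart.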