Improve performance #392
Allows browsers to cache (and reuse) assets that can't change. This sets a conservative limit of 1 minute, which is enough for reuse when clicking around the site but not for repeat visits. This gives a chance for problems to be found and fixed, as clearing browser caches is hard at best. A more realistic max-age would be 1 year (the maximum value that is safe to use). This recognises files named like 'index.337f472b.js', i.e. files with a hash of their content in their name. As a result, a change of content produces a new file, leading to a fresh download. Refs #392
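The header logic can be sketched as follows, assuming an Express-style static server; the helper name and the 8-hex-digit hash pattern are illustrative, not the project's actual code:

```javascript
// Content-hashed filenames (e.g. 'index.337f472b.js') are immutable: a change
// of content produces a new name, so they can be cached for a year. Everything
// else gets the conservative 1-minute TTL.
const HASHED = /\.[0-9a-f]{8}\./;

function cacheControlFor(filename) {
  return HASHED.test(filename)
    ? 'public, max-age=31536000, immutable' // 1 year, safe for hashed assets
    : 'public, max-age=60';                 // 1 minute, conservative default
}
```

A server would then set `res.set('Cache-Control', cacheControlFor(path))` when serving each asset.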
Re-ran case 1 after enabling HTTP caching. 'Cache static content' drops from an A to an F, and 'Effective use of CDN' from a pass to a fail. This is expected as the current TTL is very short, but it also shows that WebPageTest doesn't pick up when caching should be enabled and isn't (it definitely should not have been an A initially).
Binary assets, such as JPEGs, aren't particularly compatible with Git as they inflate the repo size whenever they change (since they can't be diffed). This moves them to Git LFS ahead of making changes. It would be possible to remove them from the repo entirely (i.e. remove them from previous commits), which would make a `git clone` a lot faster, but since that involves rewriting history I've not gone down that path. Refs #392
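The tracking rules live in `.gitattributes`; this is a sketch of what `git lfs track` generates, with the image extensions being assumptions about what the repo holds:

```
*.jpg filter=lfs diff=lfs merge=lfs -text
*.jpeg filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
```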
Transform images to an appropriate size and, in some cases, format. This reduces the page weight of the homepage significantly. They won't look quite as sharp on retina-like displays, but given the overall performance issues this shouldn't be a concern. Refs #392
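The sizing step can be illustrated with a small helper (not the project's code): cap an image's width while preserving its aspect ratio, before handing the dimensions to whatever resizer is in use.

```javascript
// Scale (width, height) down so width does not exceed maxWidth, keeping the
// aspect ratio. Images already within the limit are left untouched.
function fitToWidth(width, height, maxWidth) {
  if (width <= maxWidth) return { width, height };
  const scale = maxWidth / width;
  return { width: maxWidth, height: Math.round(height * scale) };
}
```

For example, the 2644×1388 image mentioned below would come out at 661×347 if capped at a quarter of its original width.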
The staging workflow is failing with an unhelpful error message. I think the problem might be that assets are now stored in Git LFS rather than the Git repo directly, and so it will need to check them out separately. This does also show that the staging workflow should not be rebuilding the image. Refs #392, 0cbb848
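If the workflow uses `actions/checkout`, LFS objects are not fetched by default and need an explicit opt-in; a sketch of the likely fix (the step's placement in the workflow is an assumption):

```yaml
# Fetch Git LFS objects alongside the checkout, so LFS-tracked
# assets are real files rather than pointer stubs.
- uses: actions/checkout@v2
  with:
    lfs: true
```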
Re-ran case 1 after improving the images. 'Compress Images' goes from a C to an A, and the page weight has dropped from around 5 MB to 1.9 MB.
Persona avatars are loaded, but the component expects them to be a data URI string rather than a Buffer object. As a result, the browser tries to load `/[object%20Object]`. Images inlined in the API are one of the leading performance problems. Rather than try to fix the types so that avatars show consistently, this hides them completely where they don't work. As far as I'm aware, there have been no complaints that they're not displayed, and this will help make some of the critical pages a bit less slow. Refs #392, #398
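The mismatch can be reproduced in a few lines (illustrative, not the project's code): over JSON, a Node Buffer serialises to `{ type: 'Buffer', data: [...] }`, and interpolating that plain object into an image `src` yields `[object Object]`.

```javascript
// Simulate a Buffer round-tripped through a JSON API response.
const avatar = JSON.parse(JSON.stringify(Buffer.from([0xff, 0xd8])));
const badSrc = `${avatar}`; // '[object Object]' — what the browser requests

// One possible fix, had the types been corrected instead: rebuild the
// Buffer and emit a proper data URI. The mime type is an assumption.
function toDataUri(serialised, mimeType) {
  const buffer = Buffer.from(serialised.data);
  return `data:${mimeType};base64,${buffer.toString('base64')}`;
}
```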
Images inlined in the API are one of the leading performance problems. The community banner images can be heavy (one of the three is 1.8MB), which is the leading cause of the slowdown on the listing page. This commit prevents the banner images from being loaded for the listing page. As they're decorative rather than part of the core content, it's far safer to remove them than enter the rabbit hole of trying to fix them. Refs #392, 440b3e3
When preprints are part of a list, data for all the communities is loaded to show a review filter. Unfortunately, rather than requesting just the names of the communities it needs to display, it loads all the data for all communities (including each community's full member list, which is 599 people for one of them). The filters don't share this information, so the browser has to make a separate API request for each preprint shown. All review lists already have a by-community filter, and others are already filtered somehow (e.g. for an individual community). So, this filter seems redundant. This change removes the filter, except on the preprint page (where it is loaded only once). Refs #392
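Had the filter been kept, the alternative would be to shrink the payload to just what it renders. A hypothetical projection (the field names are assumptions):

```javascript
// Keep only the identifier and name of each community; drop member lists
// and everything else the filter never displays.
function namesOnly(communities) {
  return communities.map(({ uuid, name }) => ({ uuid, name }));
}
```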
While not a single-page application, the site has been rendered in the browser using a single, bloated JavaScript file. Introducing caching has helped, but any change to the code can still cause the file to change and so be re-downloaded. This change splits the file into smaller chunks that are loaded when needed. While this doesn't address the fundamental problem of rendering in the browser, it should improve the situation a bit: 1. Less JavaScript is loaded, so it'll be quicker and use fewer device resources (memory, data plan etc.). 2. Changes to the code have less of an impact, as only parts have to be re-downloaded. Again, this makes it quicker and uses fewer resources. Refs #392
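The load-when-needed behaviour boils down to deferring and memoising a module load. A minimal sketch, where the loader argument stands in for a dynamic `import()` of a split chunk (the chunk path in the usage note is hypothetical):

```javascript
// Wrap a loader so the chunk is fetched only on first use, and the
// resulting promise is cached so repeat calls never re-fetch it.
function lazy(loader) {
  let cached;
  return () => cached || (cached = loader());
}
```

Usage would look like `const loadEditor = lazy(() => import('./editor.js'))`, called only when the editor is actually opened.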
Reference tests:
From those, the main problems are:
Size of the JS: even on a fast connection it’s ~3.6 seconds before anything is displayed (https://www.webpagetest.org/video/compare.php?tests=210721_AiDcJC_ed69d396a91d993156e59cd530aeba32-r%3A1-c%3A0-s%3A1%2C210721_AiDcJC_ed69d396a91d993156e59cd530aeba32-r%3A2-c%3A0-s%3A1%2C210721_AiDcJC_ed69d396a91d993156e59cd530aeba32-r%3A3-c%3A0-s%3A1&thumbSize=200&ival=100&end=visual)
Size of the images: once the JS has downloaded and executed, it’s then time to download 3.7MB worth of images (https://www.webpagetest.org/result/210721_AiDcP3_dd9fffea321e334c1d8b949fa048c98b/1/performance_optimization/#image_compression_step1). They’re not optimised, nor served at an appropriate size (the mobile device does not need a 2644x1388 image).
No browser caching. The 2nd and 3rd WPT runs should be faster as the browser shouldn't need to re-download everything, but they're not. (This makes the cache-busting filenames that are in place (e.g. the random characters in https://prereview.org/sc-morning-post.f7e3a52a.jpg) pointless.)