prologue uses more memory per request for static files #106
Comments
Is this also the case when you compile with `-d:usestd`? Compiling with that flag will use the standard library's asynchttpserver.
@jivank I don't remember if I tried a release build, but yes, I tested with `-d:usestd`.
@ITwrx Please check again with the latest prologue and Nim 1.4.2. I am serving a 7 GB file and I seem to be getting around 10 MB of memory usage now.
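For anyone reproducing this, a release build with the standard-library server can be produced along these lines (a sketch; `app.nim` is a placeholder file name):

```
nim c -r -d:release -d:usestd app.nim
```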
@jivank thanks for the heads-up. A couple of things:
Thanks Nim and prologue devs!
If serving a static image via an img tag, or especially a video via an HTML5 video tag, prologue uses more and more memory per request. With a video in the page, you can quickly be using 1 GB of RAM.
This also happens with other Nim web frameworks, so it is likely something to do with the underlying HTTP server implementations.
However, this is easily mitigated by using nginx in front of prologue as a reverse proxy, and specifically, by explicitly serving the static files with nginx via a location block, as demonstrated below.
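A minimal sketch of such a config (the server_name and paths are hypothetical; it assumes prologue is listening on 127.0.0.1:8080):

```nginx
server {
    listen 80;
    server_name myapp.local;

    # Serve static files directly from nginx so these requests
    # never reach prologue.
    location /static/ {
        alias /var/www/myapp/static/;
    }

    # Proxy everything else to the prologue app.
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```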
If testing on localhost, you will want to add a line to /etc/hosts for the server_name you are using in your nginx config, and use an environment variable for the host name in your app's URLs, so that all your static resources use that domain/server name instead of localhost:8080; otherwise, those requests would bypass nginx and be served by prologue. A sketch of both pieces follows.
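The names here are hypothetical: `myapp.local` matches the server_name in the config above, and `APP_BASE_URL` is an invented variable name.

```
# /etc/hosts entry for local testing: point the nginx server_name at localhost
127.0.0.1   myapp.local
```

```nim
# Hypothetical sketch: read the public base URL from an environment variable
# so generated asset URLs go through nginx instead of localhost:8080.
import std/os

let baseUrl = getEnv("APP_BASE_URL", "http://myapp.local")
let videoUrl = baseUrl & "/static/video.mp4"  # e.g. the src of an HTML5 video tag
echo videoUrl
```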
Please note that the above nginx config is a minimal example and should not be considered complete for production use, as you will need an http block with various settings for performance and security, and likely other lines in this server block. The example above is just to demonstrate the part related to this workaround.