increase postgres max_connections above 100 connections #2740
Conversation
I think the better approach would be to take the value from an env variable and default to 100
I made the changes you suggested.
@erfantkerfan I'd add the default value (100) as well.
I don't think this is necessary. How many active connections does your Postgres currently have?
Done. Changed to the default 100.
I'm getting this error:
I'm getting the exact error telling me I have used up all my Postgres connections. I'm also running the latest self-hosted Sentry version, and this change fixed the problem for us during periods of very high Sentry request load.
@erfantkerfan you still did not add it to the `.env` file.
Please paste the error, the command you used, and your active connection count. I still think this is an XY problem: could you state what you mean by high load? How many requests per second?
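For reference, the numbers being asked for here can be checked from inside the stack. A sketch, assuming the self-hosted Sentry docker-compose setup with a service named `postgres` and the default `postgres` user (both are assumptions; adjust to your setup):

```shell
# Current connection limit
docker compose exec postgres psql -U postgres -c "SHOW max_connections;"

# Currently open connections, grouped by state
docker compose exec postgres psql -U postgres -c \
  "SELECT state, count(*) FROM pg_stat_activity GROUP BY state;"
```

`pg_stat_activity` includes idle connections, which is often where a "too many connections" problem actually hides.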
Let me recap: I normally see about 50 req/s, but during high-traffic hours it goes above 120 req/s, and I was sometimes getting 500 status codes from Sentry. After reading the Sentry logs I found that Sentry was having trouble connecting to Postgres, and in the Postgres logs I saw this error message:
No, I think the default of 100 will be sufficient for most people, and others who run into this problem should find this variable easily (since it's present in docker-compose.yml). But if you insist on adding it to the sample env file, that's okay.
Postgres is not good at handling lots of connections; that's why so many PostgreSQL connection poolers exist and are used in production. We're also handling about 800 rps without having to tweak the connection limit, which is why I don't think your "high load" is sufficient reason to merge this PR.
Ok, aside from whether we merge this or not, it has to be added to the `.env` file.
@erfantkerfan - I want it in the `.env` file.
Yes, but if we say that, then I'd argue this repo should also bake in these kinds of things, such as pgbouncer, or at least provide easy ways to plug them in when needed. Right now we have none of that, so I think this patch is a move in the right direction.
This discrepancy is worth exploring, but I'm not sure "we can handle this without changing the setting" is a great argument against adding the flexibility to change the setting easily when people need it.
Correct.
My argument is that we should make sure the problem really is PostgreSQL's max connections before adding a configuration option for it. For example, OP may have created too many workers, thereby opening too many connections to PostgreSQL. In my experience, when you expose these limit settings, many users just raise them blindly, which makes their instances perform worse. See the hundreds of tutorials that disable kernel open-file limits for no real reason; right now I see this PR going the same way.
Ah, that's a fair point. @erfantkerfan what else did you change from the default configuration we provided? |
That's a fair point, but in the case of Postgres I have to disagree; most of the time it fails at handling a large number of connections.
I added it to the .env file with the default limit of 100 as well, in the hope that this PR looks okay now.
In non-peak hours:
In peak hours:
In the past couple of days I have not seen connections go above 100 (it's a situation that only happens about once or twice a month).
Please be precise: are these the output of the command I pasted, or something else?
How do you know this?
Can you elaborate on this sentence? Is this PR about the number of connections or the "actual load on the Postgres"? Also, what does
And please do not mention me in every single comment; I'm watching this PR, thanks.
Yes sir.
How do I know I have not exceeded 100 connections? Because in that case I get log lines in the container output which I can grep for. How do I know this happens once or twice a month? Because I get 500 errors from Sentry about that often, and after investigating I see this max_connections error in both the Sentry container logs and the Postgres logs.
From what I understand from the docs and some blogs, this value changes the combined number of connections, which is:
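To illustrate that accounting with hypothetical numbers (not taken from this thread): `max_connections` is a combined total, and `superuser_reserved_connections` (default 3 in Postgres) is held back from it, so the slots left for ordinary clients are the limit minus the reserve minus the connections already open:

```shell
# Hypothetical example numbers for illustration only
MAX_CONNECTIONS=100       # the setting this PR makes configurable
SUPERUSER_RESERVED=3      # Postgres default for superuser_reserved_connections
ACTIVE=87                 # connections currently open

# Slots still available to ordinary (non-superuser) clients
echo $(( MAX_CONNECTIONS - SUPERUSER_RESERVED - ACTIVE ))   # prints 10
```

This is why a spiky workload can hit "too many connections" even when the average count looks comfortably below the limit.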
I still cannot see why this is the only instance that needs this fix; I explained my arguments against merging this PR in #2740 (comment), and I don't see anything in OP's later comments that changes my mind. But I'm not a maintainer of this project; @hubertdeng123, please do as you see fit. I've made my arguments.
Just reading up here, and I see valid points from both sides. FWIW, I don't think adding more connections is the best way to handle hitting the connection limit, since it can cause resource spikes, but I also don't see the harm in adding this as a configurable option and surfacing it in the .env file. If this works as a solution for OP, it may be useful to others as well. Before we merge this, I think it would be helpful to add a comment warning users of additional CPU/RAM usage if the Postgres connection maximum is raised. Could you do that @erfantkerfan?
@hubertdeng123 |
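The requested warning comment could look roughly like this in the env file (a sketch; the variable name `POSTGRES_MAX_CONNECTIONS` is an assumption, not necessarily what the PR uses):

```shell
# Raising this above 100 lets Postgres accept more concurrent connections, but each
# connection costs additional CPU and RAM. Only raise it if you have confirmed the
# connection limit is actually being hit; consider a pooler such as pgbouncer instead.
POSTGRES_MAX_CONNECTIONS=100
```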
Co-authored-by: Amin Vakil <[email protected]>
In my case, the default 100 connections in Postgres is hitting its limit and needs to be set to a higher value.
Since other people might need to change this, appending it to the Postgres flags seems only logical.
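Appending to the Postgres flags might look roughly like this in docker-compose.yml (a sketch; the service name, image tag, and `POSTGRES_MAX_CONNECTIONS` variable name are assumptions, and compose's `${VAR:-default}` interpolation supplies the default of 100):

```yaml
postgres:
  image: postgres:14
  command: ["postgres", "-c", "max_connections=${POSTGRES_MAX_CONNECTIONS:-100}"]
```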
Legal Boilerplate
Look, I get it. The entity doing business as "Sentry" was incorporated in the State of Delaware in 2015 as Functional Software, Inc. and is gonna need some rights from me in order to utilize my contributions in this here PR. So here's the deal: I retain all rights, title and interest in and to my contributions, and by keeping this boilerplate intact I confirm that Sentry can use, modify, copy, and redistribute my contributions, under Sentry's choice of terms.