Blur Background setting causing significant GPU usage above certain settings #271
Comments
Originally the background blur was performed on the CPU, but now it is performed on the GPU, so an increase in GPU load is expected.
I realise that, but this blur in particular seems to be more resource-intensive (even though it's being done on the GPU now) than it was when it was done on the CPU, for around the same amount of blur.
The current blur implementation is a naïve one, and we should implement a more efficient algorithm such as the ones in the following article:
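For context, here is a minimal NumPy sketch (hypothetical, not the plugin's actual shader code) of what a naïve one-pass blur does: every output pixel reads the full (2r+1)×(2r+1) neighborhood, so per-pixel cost grows quadratically with the blur radius, which would explain GPU load climbing sharply with the blur factor.

```python
import numpy as np

def naive_box_blur(img, radius):
    """One-pass 2D box blur: each output pixel averages the whole
    (2r+1) x (2r+1) neighborhood, so cost is O(r^2) per pixel."""
    h, w = img.shape
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    # Accumulate every one of the k*k shifted copies of the image.
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)
```

A GPU fragment shader doing the equivalent samples the texture k² times per pixel, which is why the cost balloons at higher blur factors.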
We could add an option to choose GPU or CPU blur... to fit everyone's needs.
I've read this article in the past while researching efficient blurs. It all comes down to kernel separability: with a two-pass approach (instead of the current one-pass) we can do a box or even Gaussian blur pretty fast...
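A sketch of the two-pass idea (again NumPy, hypothetical, not the plugin's code): because a box (or Gaussian) kernel is separable, blurring rows and then columns produces the same result as the full 2D convolution while cutting per-pixel cost from O(r²) to O(r).

```python
import numpy as np

def separable_box_blur(img, radius):
    """Two-pass box blur exploiting kernel separability:
    a horizontal 1D pass followed by a vertical 1D pass,
    costing O(r) per pixel instead of O(r^2)."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k

    def blur_1d(line):
        # Edge-pad so the output has the same length as the input.
        return np.convolve(np.pad(line, radius, mode="edge"),
                           kernel, mode="valid")

    tmp = np.apply_along_axis(blur_1d, 1, img)   # horizontal pass
    return np.apply_along_axis(blur_1d, 0, tmp)  # vertical pass
```

On the GPU this maps to two render passes of k texture samples each (2k total versus k² for the one-pass version), which is where the speedup comes from.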
Thanks, curious to try.
I 100% agree with this. For people with weak GPUs (i.e. integrated graphics), there should be an option to choose which device does the background blur (similar to the Inference Device setting).
@umireon, can you please provide Windows testing binaries?
@umireon Works great (even on integrated graphics)!
Describe the bug
When the Blur Background Factor setting is set high enough, the background blur effect will use a very significant amount of GPU resources (this is more apparent on integrated graphics).
To Reproduce
Steps to reproduce the behavior:
1. Add the Background Removal filter to any source (if you haven't already).
2. Set the Blur Background Factor to a high setting (settings as low as 6 max out my integrated graphics, while my discrete graphics can handle up to around 12 before lagging significantly).
Expected behavior
The background blur does not use this much GPU time.
Screenshots
Background blur factor was set to 50 (before) and 12 (after). These settings give about the same amount of blur when switching versions, from my testing.
Log and Crash Report
2023-04-24 07-53-54.txt
2023-04-24 07-55-51.txt
Desktop (please complete the following information):
Additional context
Commit 39adf39 and earlier do not seem to exhibit this behaviour, which suggests that the newly-introduced GPU blending in the latest release is actually performing worse than before.