
Add EGL stream support #245

Merged
merged 1 commit into Cloudef:master on Mar 31, 2017

Conversation

@vially (Contributor) commented Mar 30, 2017

Add support for EGL streams used by the proprietary NVIDIA driver: #138

The current implementation selects the EGL stream buffer API (instead of the default GBM one) when the `WLC_USE_EGLDEVICE` environment variable is set to a non-empty value (e.g. `export WLC_USE_EGLDEVICE=1` before running).

I've only tested it by running the example compositor on a single monitor setup (Archlinux using latest NVIDIA drivers) and it seems to work.

Disclaimer: I don't know C, so most probably the code contains lots of bugs so please review very carefully (especially around memory allocations/deallocations).

Resolved review threads (outdated): src/platform/backend/drm.c (3), src/platform/context/egl.c (2)
One of the review threads on `src/platform/context/egl.c` centered on this fallback:

```c
PFNEGLGETPLATFORMDISPLAYEXTPROC getPlatformDisplayEXT =
   (void*)eglGetProcAddress("eglGetPlatformDisplayEXT");
if (getPlatformDisplayEXT)
   return getPlatformDisplayEXT(type, native, NULL);
return context->api.eglGetPlatformDisplayEXT(type, native, NULL);
}
```
@Cloudef (Owner)

@jwrdegoede Do these changes break anything introduced in #233?

(Contributor)

No, I don't think so; all the checks are still there, just moved to a different place.

@Cloudef (Owner) commented Mar 30, 2017

Overall looks good. Maybe add documentation in README as well.

@Timidger (Contributor) commented Mar 30, 2017

Tried testing this on my machine with both Way Cooler and Sway; got the same error from wlc:

```
Failed to activate vt1 for restoration
Failed to switch back to vt1
```

I made sure I installed the drivers properly, because I can use GNOME + Xorg with this driver loaded. Using an NVIDIA GTX 1070 card, not that it probably matters.

EDIT: Same effect with the example compositor. More of the error message that might be relevant:

```
Failed to get drm resources
drmModeGetResources failed
```

@4e554c4c

@Timidger Do you have NVIDIA's DRM KMS module enabled? It is unstable and disabled by default, but needed for EGL streams. Use `modprobe -r nvidia-drm; modprobe nvidia-drm modeset=1` if you don't.
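To keep the modeset setting across reboots, the usual approach is a modprobe configuration file; the path and filename below are conventional, not mandated:

```
# /etc/modprobe.d/nvidia-drm.conf
options nvidia-drm modeset=1
```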

@vially force-pushed the eglstreams branch 2 times, most recently from 4826d36 to 2a0eca0 on March 30, 2017 20:30
@vially (Contributor, Author) commented Mar 30, 2017

@Cloudef I've addressed your comments and pushed the changes.

I noticed a very strange thing though. It seems I can still run the example using the GBM buffer API instead of EGL streams on the NVIDIA proprietary driver, but with very poor performance. Any idea how this might be possible, given that the proprietary NVIDIA driver has never advertised support for the GBM API?

@Timidger (Contributor)
@4e554c4c Thanks for the hint, I did have that disabled... but now it seems I have a broken setup, since it keeps segfaulting, and after checking the core dump it seems it's still trying to use nouveau. Oh well, it's about time I switched distros on this machine anyway :P. Thanks for all your help.

@Cloudef (Owner) commented Mar 30, 2017

@vially I'm not sure. Maybe it falls back to some software implementation. Do you happen to also have a GBM-compatible GPU in your PC?

@vially (Contributor, Author) commented Mar 30, 2017

@Cloudef Yes, I do have an Intel CPU with integrated graphics. However, /dev/dri lists only one card, and the monitor is connected to the NVIDIA HDMI port, so I'm not sure how that would work. Strange...

@Timidger I've tried running the example on my main work PC, which is a multi-monitor setup driven by a GTX 970, and it seems to crash as well. I'll look more closely into it to see what the problem might be (although it works just fine on my other PC, which has a single monitor and a GTX 750 card). I remember from some early test runs that all outputs were assigned the same CRTC id, but I'm not sure if that's still the case with the latest code. I'll have to try and see.

EDIT: There was indeed a bug in the code which caused all outputs to be assigned the same CRTC. I've fixed it by using a more robust approach when selecting the output CRTC, and now the example code runs on my multi-monitor setup too.

Update: I've been able to run sway successfully without any additional changes.

@vially force-pushed the eglstreams branch 2 times, most recently from 76b7d9b to fe743e8 on March 31, 2017 04:34
@Cloudef (Owner) commented Mar 31, 2017

Let's merge and see what happens.

@Cloudef Cloudef merged commit 1364e92 into Cloudef:master Mar 31, 2017
@ghost commented Apr 5, 2017

@vially are you happy that you've now contributed to the segmentation of the entire Linux desktop? By merging this no problem has been solved, but you've worsened an existing one. Because NVIDIA couldn't admit they weren't the first and didn't want to use an existing, nice API created by Intel, they NIH'd this one, and now it falls onto projects to support both? This is absolute crap, and instead of yielding to NVIDIA you should have sent them an angry email like this one to support GBM. Or better yet a PR, though don't count on it being merged because they'd rather NIH it.
Now instead of only 1 compositor implementing this piece of crap we have 2, and that's a lot more than 0.

@vially (Contributor, Author) commented Apr 5, 2017

@atomnuker I'm sorry you feel this way. I don't like the current situation either, with two competing buffer APIs. But I'm also trying to be pragmatic, and I thought it might be useful to have wlc working on the proprietary NVIDIA driver until the NVIDIA and Wayland developers settle on a single buffer API.

I also happen to own an NVIDIA card which is not supported by the nouveau driver, so I really didn't have any other option if I wanted to use Wayland.

@vially vially deleted the eglstreams branch April 5, 2017 12:07
@ddevault ddevault mentioned this pull request Apr 18, 2017
@leigh123linux
@vially

Are you using glvnd-enabled Mesa? If so, /usr/share/glvnd/egl_vendor.d/50_mesa.json overrides /usr/share/glvnd/egl_vendor.d/10_nvidia.json.

Renaming 10_nvidia.json to 90_nvidia.json should correct it.

> I noticed a very strange thing though. It seems I can still run the example using the GBM buffer API instead of EGL streams on the NVIDIA proprietary driver but it has very poor performance. Any idea how this might be possible given that the proprietary NVIDIA driver never advertised that they support the GBM API?
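For context, the files in `egl_vendor.d` are small glvnd ICD manifests that point the EGL loader at a vendor library; the NVIDIA one looks roughly like this (contents shown as a typical example, not copied from the poster's system):

```json
{
    "file_format_version": "1.0.0",
    "ICD": {
        "library_path": "libEGL_nvidia.so.0"
    }
}
```

Renaming the file changes only its position in the loader's sorted scan order, not its contents.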

@vially (Contributor, Author) commented May 1, 2017

@leigh123linux Thanks, I wasn't aware of that. I'll experiment with that to see if it makes any difference.

@Silur commented Apr 21, 2018

I have the same issue on 4.16.3-1-ARCH. The modeset=1 option is enabled in the DKMS module, yet it works flawlessly only on every second boot. Is something not being released on shutdown?
