
Can't send/receive stream on Oculus Quest 2 #128

Closed
LawrenceSelly opened this issue Sep 24, 2021 · 8 comments
Labels: help wanted (Extra attention is needed), question (Further information is requested)

Comments

LawrenceSelly commented Sep 24, 2021

I'm working in Unity 2020.3.19f1 with the NDI 5 release of this package.
I'm building for an Oculus Quest 2 and have followed the Oculus setup guide to do that.
Deployed projects run fine.

However, I cannot get any NDI stream into or out of the Quest. I only get the name of the stream, followed by a null (or black) texture.

To view the stream from the Quest, I'm using NDI Tools Studio Monitor.
To view the incoming stream, which I'm sending with NDI Test Cards, I'm using the NDI Receiver component, set to Renderer mode, attached to a cube prefab.
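For reference, the receiver setup described above can also be wired up from a script. This is a hedged sketch, not the package's documented usage: it assumes KlakNDI's Klak.Ndi.NdiReceiver component exposes ndiName, targetRenderer, and targetMaterialProperty fields, and the source name and material property here are placeholders — verify against the package source.

```csharp
// Sketch: an NDI Receiver in Renderer mode pushing the received
// frame onto a cube's material. Field names are assumptions based
// on the KlakNDI package; check Runtime/Component/NdiReceiver.cs.
using UnityEngine;
using Klak.Ndi;

public class CubeNdiReceiver : MonoBehaviour
{
    // Placeholder NDI source name; use the name shown by your sender.
    [SerializeField] string _sourceName = "MYPC (Test Pattern)";

    void Start()
    {
        var receiver = gameObject.AddComponent<NdiReceiver>();
        receiver.ndiName = _sourceName;
        receiver.targetRenderer = GetComponent<Renderer>(); // the cube
        receiver.targetMaterialProperty = "_BaseMap";       // URP Lit main texture
    }
}
```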

I'm not sure what files you may require to debug, but I'm happy to send over anything.

@keijiro keijiro self-assigned this Sep 25, 2021
@keijiro keijiro added the question Further information is requested label Sep 25, 2021

keijiro commented Sep 25, 2021

Which type of computer do you use as a host? Windows PC or Mac? If you're using Windows, did you turn off the Windows Firewall?

Unfortunately, I don't have any VR device, so I can't test it on my side.

LawrenceSelly commented:

A Windows PC to build; I use NDI all the time.
With the Quest 2, however, I'm using it standalone, without a host.

Setting VR devices aside: for Android devices, are there permissions required to make NDI work? Or perhaps the Android Network Discovery API?


keijiro commented Sep 27, 2021

> Setting VR devices aside: for Android devices, are there permissions required to make NDI work? Or perhaps the Android Network Discovery API?

That's enabled by AndroidHelper.cs:
https://github.com/keijiro/KlakNDI/blob/main/jp.keijiro.klak.ndi/Runtime/Internal/AndroidHelper.cs#L18
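A hedged sketch of what that helper does: NDI source discovery relies on mDNS multicast, which Android filters out unless the app holds a WifiManager multicast lock. The class and method names below follow the standard Android/Unity APIs; the actual helper in KlakNDI may differ in detail.

```csharp
// Sketch: acquiring an Android multicast lock from Unity so mDNS
// (NDI discovery) packets reach the app. Android-only; the lock is
// held in a static field so it isn't garbage-collected and released.
using UnityEngine;

static class MulticastLockExample
{
    static AndroidJavaObject _lock;

    public static void Acquire()
    {
        using var player = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
        using var activity = player.GetStatic<AndroidJavaObject>("currentActivity");
        using var wifi = activity.Call<AndroidJavaObject>("getSystemService", "wifi");
        _lock = wifi.Call<AndroidJavaObject>("createMulticastLock", "NDI");
        _lock.Call("acquire");
    }
}
```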

And it worked as expected on my Pixel 5, so I think it should work on other Android smartphones and tablets too. I'm not sure whether there's a difference between them and the Quest.

@keijiro keijiro changed the title No Texture [Send or Receive] on Android Arm64. Only stream name comes through. Can't send/receive stream on Oculus Quest 2 Oct 2, 2021
@keijiro keijiro added the help wanted Extra attention is needed label Nov 1, 2021

keijiro commented Nov 1, 2021

Help wanted: any information about KlakNDI on Quest 2 is helpful. Did it fail or succeed? Any error messages? Please post here.

kawaharas commented:

I succeeded in sending and receiving an NDI stream between a PC and an Oculus Quest 2.

My hardware and software environment is as follows.

[PC (LAN)]
Razer Blade 15 (2018) Advanced Model (Geforce GTX 1070 Max-Q)
I connected my PC to a wired LAN via a USB-LAN converter and tested with the following two converters:
LENTION C-C68
QNAP QNA-UC5G1T

[Android (WiFi)]
Pixel3
Oculus Quest 2

[Software]
Unity 2021.2.1f1
KlakNDI 2.0.2
Oculus Integration 33.0
XR Plugin Management 4.2.0
Oculus XR Plugin 1.10.0

The procedure to send an NDI stream from the Oculus Quest 2 to the PC is as follows.

Oculus Quest 2 -> PC

  1. Create a new scene as URP.
  2. Add an empty GameObject to the scene and attach the NDISender script to it.
  3. Add an OVRCameraRig prefab to the scene.
  4. Add a Main Camera (as a 2nd camera) under [OVRCameraRig]-[Tracking Space]-[Center Eye Anchor].
  5. Create a new Render Texture and set it to [Output]-[Output Texture] of the camera added in step 4. (The render texture size was tested at 1920×1080.)
  6. Set the NDI Sender's [Capture Method] to “Texture” and [Source Texture] to the Render Texture created in step 5.
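Steps 4-6 can also be done from a script instead of the inspector. A minimal sketch, assuming KlakNDI's Klak.Ndi.NdiSender component with captureMethod/sourceTexture properties (check the package source for the exact API):

```csharp
// Sketch of steps 4-6 in code: render the eye camera into an
// off-screen 1920x1080 texture and hand it to the NDI Sender.
using UnityEngine;
using Klak.Ndi;

public class QuestNdiSetup : MonoBehaviour
{
    [SerializeField] Camera _eyeCamera; // the camera under CenterEyeAnchor
    [SerializeField] NdiSender _sender; // the NDI Sender on an empty object

    void Start()
    {
        // Step 5: off-screen render target at the tested size.
        var rt = new RenderTexture(1920, 1080, 24);
        _eyeCamera.targetTexture = rt;

        // Step 6: capture that texture instead of the screen.
        _sender.captureMethod = CaptureMethod.Texture;
        _sender.sourceTexture = rt;
    }
}
```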

Build settings for Android devices in [Player Settings]-[Other Settings] are as follows.

[Rendering]-[Color Space]: Linear
Auto Graphics API: OFF
Graphics APIs: Vulkan (I removed OpenGLES3)
[Configuration]-[Scripting Backend]: IL2CPP
Target Architectures: ARM64
Static Batching: ON
Graphics Jobs: OFF
Texture compression format: ASTC
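The settings above can be applied with a Unity editor script (placed under an Editor/ folder). The API names are from Unity's scripting reference, but exact fields can vary slightly between Unity versions; static batching has no public scripting setter, so set it in the GUI.

```csharp
// Editor-only helper that applies the Android build settings listed
// above: Linear color space, Vulkan-only, IL2CPP, ARM64, no graphics
// jobs, ASTC texture compression.
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;

public static class QuestBuildSettings
{
    [MenuItem("Tools/Apply Quest 2 NDI Build Settings")]
    public static void Apply()
    {
        PlayerSettings.colorSpace = ColorSpace.Linear;
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
            new[] { GraphicsDeviceType.Vulkan });
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android,
            ScriptingImplementation.IL2CPP);
        PlayerSettings.Android.targetArchitectures = AndroidArchitecture.ARM64;
        PlayerSettings.graphicsJobs = false;
        EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;
    }
}
```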

At runtime, the NDI Name was displayed as “LOCALHOST (NDI Source)” in the NDI Receiver running on PC.

I referred to the following URL for the build settings for Oculus Quest 2. (Sorry, it's written in Japanese)
https://framesynthesis.jp/tech/unity/oculusquest/


keijiro commented Nov 10, 2021

@kawaharas Thanks for the valuable input.

@LawrenceSelly Are you still seeing this issue on your side? I've also received some reports about Quest 2, so I think it basically works, even though it has some issues, especially around resuming ( #134 ).

LawrenceSelly commented:

I was attempting a much larger texture, 8000x1000, which may have been the cause of my issues. So I assume from @kawaharas's testing that it needs to be 1080p or smaller.

I will give it another test later this week and see what happens.
Luckily, the advent of Air Link has enabled us to use PC-based Oculus/Vive apps, which are easier to make and allow us to use Spout to send the texture rather than NDI. So I think that's the route our clients will end up taking.


kawaharas commented Nov 11, 2021

I also tested sending a 7680x3840 video file from the PC to the Quest 2 and the Pixel 3 via a render texture.
Both devices successfully received the video from the PC at the same time.
As for sending from the Quest 2 to the PC, I have only tested up to 1920x1080, as shown in the previous message.

There is some lag, but I have not tested in detail whether it is due to the video size or to the performance of the router or hub.
However, with the same video file, the lag decreased at a render texture size of 3840x1920, and it ran comfortably at 1920x960.
It may be able to handle larger video sizes in a better network environment.
