Description

I have a program that creates and fills `Depth16Unorm` depth textures on the CPU, then reads that data (`textureLoad`) in a shader to render point clouds.
```wgsl
// Depth texture bound as a regular (unfilterable) float texture.
@group(1) @binding(1)
var depth_texture: texture_2d<f32>;

// In the fragment shader: fetch the normalized depth directly.
let norm_linear_depth = textureLoad(depth_texture, texcoords, 0).x;
```
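On the Rust side, the matching bind group layout entry presumably looks something like the following sketch (the binding index matches the WGSL above; the stage visibility is an assumption):

```rust
// Sketch: Depth16Unorm bound as an unfilterable float texture so it can be
// read via textureLoad as texture_2d<f32> in WGSL. Visibility is assumed.
let entry = wgpu::BindGroupLayoutEntry {
    binding: 1,
    visibility: wgpu::ShaderStages::FRAGMENT,
    ty: wgpu::BindingType::Texture {
        sample_type: wgpu::TextureSampleType::Float { filterable: false },
        view_dimension: wgpu::TextureViewDimension::D2,
        multisampled: false,
    },
    count: None,
};
```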
On native, this works without issue, as the spec suggests.
When using the WebGL backend, on the other hand, my shader just... doesn't work. I cannot see any relevant logs in the browser console, and even forcing a constant fragment color output changes nothing, so I assume the shader fails right at the `textureLoad`.
Converting my `u16` data to `f32` on the CPU and then uploading that into an `R32Float` texture fixes the issue, with no other changes required.
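For illustration, the workaround is roughly the following (a sketch, not the exact code from my project; `depth_u16`, `device`, and `size` are assumed to exist):

```rust
// Sketch of the workaround: widen the u16 depth values to normalized f32 on
// the CPU, then upload into an R32Float texture instead of Depth16Unorm.
let depth_f32: Vec<f32> = depth_u16
    .iter()
    .map(|&d| d as f32 / u16::MAX as f32)
    .collect();

let texture = device.create_texture(&wgpu::TextureDescriptor {
    label: Some("point-cloud depth (f32 workaround)"),
    size,
    mip_level_count: 1,
    sample_count: 1,
    dimension: wgpu::TextureDimension::D2,
    format: wgpu::TextureFormat::R32Float, // instead of Depth16Unorm
    usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
    view_formats: &[],
});
```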
Repro steps

Using the WebGL backend (sketched below):

1. Create a `Depth16Unorm` texture
2. Fill it with arbitrary data
3. Read from it in a shader
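A minimal sketch of steps 1 and 2, assuming wgpu 0.15 APIs, a hypothetical 512×512 image, the usual `device`/`queue` setup, and `bytemuck` for the byte cast:

```rust
// Step 1: create the Depth16Unorm texture (dimensions are hypothetical).
let size = wgpu::Extent3d { width: 512, height: 512, depth_or_array_layers: 1 };
let texture = device.create_texture(&wgpu::TextureDescriptor {
    label: Some("point-cloud depth"),
    size,
    mip_level_count: 1,
    sample_count: 1,
    dimension: wgpu::TextureDimension::D2,
    format: wgpu::TextureFormat::Depth16Unorm,
    usage: wgpu::TextureUsages::COPY_DST | wgpu::TextureUsages::TEXTURE_BINDING,
    view_formats: &[],
});

// Step 2: fill it with arbitrary CPU-side u16 data (2 bytes per texel).
let depth_data = vec![0u16; (size.width * size.height) as usize];
queue.write_texture(
    wgpu::ImageCopyTexture {
        texture: &texture,
        mip_level: 0,
        origin: wgpu::Origin3d::ZERO,
        aspect: wgpu::TextureAspect::All,
    },
    bytemuck::cast_slice(&depth_data),
    wgpu::ImageDataLayout {
        offset: 0,
        bytes_per_row: std::num::NonZeroU32::new(2 * size.width),
        rows_per_image: None,
    },
    size,
);
```

Step 3 is the `textureLoad` shown at the top of the issue.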
Expected vs observed behavior

Expected: writing to and reading from a `Depth16Unorm` texture works on both native and web, as the spec suggests. Observed: it works on native but fails silently under the WebGL backend.
Platform

- wgpu 0.15
- Linux (Arch), also reproduced on macOS
- Latest Firefox & Chromium
There are a few other open issues related to GLES texture format support.
The backend doesn't advertise support for `DownlevelFlags::WEBGPU_TEXTURE_FORMAT_SUPPORT`, and as far as I know nobody has researched the gap between the WebGPU spec, WebGL, and GLES (even between those two there seem to be differences).
We should have built up texture format support independently of the WebGPU spec, building off the GLES specs. When `WEBGPU_TEXTURE_FORMAT_SUPPORT` is disabled, we should defer to the backend's texture flags for our checks, but that evidently isn't happening here for some reason.
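As an application-side illustration of deferring to the backend's reported capabilities, something like the following sketch could guard the format choice (names per wgpu 0.15; the fallback logic is an assumption for illustration, not existing wgpu behavior):

```rust
// If the adapter doesn't claim full WebGPU texture format support (e.g. on
// GLES/WebGL), ask what this specific format can actually do.
let downlevel = adapter.get_downlevel_capabilities();
if !downlevel
    .flags
    .contains(wgpu::DownlevelFlags::WEBGPU_TEXTURE_FORMAT_SUPPORT)
{
    let features = adapter.get_texture_format_features(wgpu::TextureFormat::Depth16Unorm);
    if !features
        .allowed_usages
        .contains(wgpu::TextureUsages::TEXTURE_BINDING)
    {
        // Fall back, e.g. to the R32Float conversion described above.
    }
}
```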