kvark 4 days ago

Back when I worked on WebGPU in Firefox, GPU debugging was pretty straightforward. You’d use a setting that enables API traces of WebGPU, give it a path, and it would produce a trace. Then you’d replay it in a standalone application that is easily run under NSight/RenderDoc/PIX/whatever. Moreover, you could replay it on a different platform with a different API! It was a breeze. I bet it still works.

http://kvark.github.io/wgpu/debug/test/ron/2020/07/18/wgpu-a...

pjmlp 5 days ago

It is a sad state of affairs that, after 15 years of Web 3D push and Khronos advocacy about how great these APIs are and how we should all be rushing out to use them, browser vendors keep ignoring the need for proper developer tooling for 3D applications.

The best we have is either SpectorJS (showing its age, WebGL only), trying to tell app calls apart from browser calls in a native GPU debugger, or creating an alternative, completely unrelated native version of the application just to use a GPU debugger sanely.
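
To make this concrete, here is a toy sketch (a hypothetical illustration of mine, not a real tool) of the SpectorJS-style call interception one would want for WebGPU; `traceDevice` and its logging are my own assumptions, and it only wraps plain method calls:

    // Toy sketch (assumes a WebGPU-capable browser and @webgpu/types):
    // log every GPUDevice method call before forwarding it, roughly what
    // SpectorJS does for WebGL contexts.
    function traceDevice(device: GPUDevice): GPUDevice {
      return new Proxy(device, {
        get(target, prop) {
          const value = (target as any)[prop];
          if (typeof value !== "function") return value;
          return (...args: unknown[]) => {
            console.log(`GPUDevice.${String(prop)}`, args); // crude API trace
            return value.apply(target, args);
          };
        },
      });
    }

    const adapter = await navigator.gpu.requestAdapter();
    if (!adapter) throw new Error("WebGPU not available");
    const device = traceDevice(await adapter.requestDevice());
    // Calls made through the proxy are now logged before running:
    device.createBuffer({ size: 256, usage: GPUBufferUsage.COPY_DST });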

  • _factor 4 days ago

    I think the bigger issue is security. GPU access is too tightly interlinked with kernel drivers. I’ve seen WebGPU and even WebGL crash otherwise stable systems.

    • tw061023 4 days ago

      WebGPU is supposed to be properly sandboxed.

      The bigger issue is that WebGPU is basically dead on arrival for the same reasons WebGL is - it is impossible to get enough data to the client for it to actually matter, Awwwards-style design tricks notwithstanding.

      I suppose browser vendors understand this and don't really care for either.

      • _factor 2 days ago

        “WebGPU is supposed to be properly sandboxed.”

        The GPU must therefore expose an open API over what is really a proprietary processing space.

        Magic “packets” that execute arbitrary functions on these “sandboxed” DMA devices therefore remain possible.

        It’s still a problem until we can audit the hardware. NVIDIA, and to a lesser extent AMD and Intel Arc, play somewhat open while keeping a few omnipotent cards in their pockets. The crux of the issue is that gamers don’t care; only security professionals do, because they’re the ones who see the 0-days fly by every day.

      • nmfisher 4 days ago

        What do you think could fix this? Access to genuinely permanent storage?

        • tw061023 4 days ago

          That's one part of it, yes. A browser API providing a few GBs of persistent storage with proper isolation and user management, obviously with some kind of compression/decompression going on to save both download times and loading times.
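
          The closest approximation today looks like the sketch below (assuming OPFS and Compression Streams support; the URL and file name are made up, and the quota is still best-effort rather than the guaranteed multi-GB store I mean):

            // Sketch: stash a compressed asset in origin-private storage.
            // Assumes the server ships a raw .gz blob (no Content-Encoding),
            // so the browser doesn't transparently decompress it during fetch.
            async function cacheAsset(url: string, name: string): Promise<void> {
              const persisted = await navigator.storage.persist(); // ask not to be evicted
              console.log("persistence granted:", persisted); // best-effort only

              const root = await navigator.storage.getDirectory(); // OPFS root
              const handle = await root.getFileHandle(name, { create: true });
              const writable = await handle.createWritable();

              const response = await fetch(url);
              await response
                .body!.pipeThrough(new DecompressionStream("gzip"))
                .pipeTo(writable); // pipeTo closes (commits) the file on success
            }

            await cacheAsset("/assets/level1.bin.gz", "level1.bin"); // hypothetical asset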

          As an example, consider Infinity Blade, the poster child of mobile gaming: released in 2010, a 595 MB download, 948 MB installed. Even the first version of WebGL is capable of providing that kind of experience; we just cannot get it to the user via the browser.

          • pjmlp 4 days ago

            I have found a fellow soul. That is exactly one of my complaints with Web 3D: the failure, 15 years later, to provide the same experience as Infinity Blade, which Apple used to show off the iPhone’s OpenGL ES 3.0 capabilities.

            Or the Unreal Engine Citadel demo, originally done in Flash / C++.

      • ossobuco 3 days ago

        That may be true (for now) for web apps, but what if you ship your app/game as a desktop app, for example with Tauri?

    • pjmlp 4 days ago

      Yeah, but having a debugging tool that developers have to explicitly open in the developer tools, for debugging shaders and drivers the browser already uses anyway, doesn’t change that.

      • stanleykm 4 days ago

        Agreed. Chrome already has flags that reduce the security posture for developers. It is not outrageous to expect you’d need to use those for profiling GPU shaders. This is a huge gap in the browser GPU ecosystem, especially since WGSL is missing features (like unrolling hints) that can result in dramatically different performance vs HLSL or whatever.
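
        As a toy illustration (my own example; `sum4` is made up): WGSL has no unroll attribute, so the portable workaround is unrolling by hand, where HLSL would just take an [unroll] hint and let the compiler decide.

          // Two WGSL snippets held in TypeScript strings. HLSL allows
          // [unroll] for (int i = 0; i < 4; i++) { ... }; WGSL has no
          // equivalent hint, so you either trust the compiler or unroll.
          const looped = /* wgsl */ `
            fn sum4(v: vec4<f32>) -> f32 {
              var acc = 0.0;
              for (var i = 0u; i < 4u; i++) { acc += v[i]; }
              return acc;
            }`;

          const unrolled = /* wgsl */ `
            fn sum4(v: vec4<f32>) -> f32 {
              return v.x + v.y + v.z + v.w; // unrolled by hand
            }`;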