My Gripes With WebGL

I've been playing with WebGL for a while now, and while I appreciate how cool it is to have a hardware-accelerated 3D graphics API in the browser, that doesn't mean things can't be improved. I have three main complaints about WebGL that I want to talk about.

Lowest Common Denominator API

Even if you're running on hardware that is capable of doing more, WebGL still constrains you to GL ES 2.0. Things have been getting better, with WebGL 2 being equivalent to GL ES 3.0, but it's still not ideal. It's sad when you know your hardware supports geometry shaders, but you can't use them from your WebGL app because of API constraints.

In my ideal world, I'd be able to pick a target API, and fail gracefully if the hardware doesn't support it.
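
For what it's worth, the closest you can get today is the usual fallback dance: ask for the newest context, degrade if it's not there, and probe optional capabilities one extension at a time. A minimal sketch (the instancing check is just an illustration):

    const canvas = document.querySelector('canvas');

    // Ask for the most capable context first, then degrade gracefully.
    let gl = canvas.getContext('webgl2');
    let hasInstancing = false;

    if (gl) {
      hasInstancing = true; // instanced drawing is core in WebGL 2
    } else {
      gl = canvas.getContext('webgl');
      if (gl) {
        // On WebGL 1, instancing is an optional extension you have to probe for.
        hasInstancing = !!gl.getExtension('ANGLE_instanced_arrays');
      }
    }

    if (!gl) {
      throw new Error('WebGL is not supported on this device');
    }

    // Note there's no way to probe for geometry shaders at all:
    // that API surface simply doesn't exist in WebGL.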

The likely reason things are the way they are is that bringing each feature over from native land to WebGL requires very careful consideration, and I understand that. From a security standpoint, a smaller API surface is always better. I just hope that at some point we'll have this sorted out and won't be bound by the lowest common denominator.

Multi-GPU Systems

If you have, for example, a laptop with both an integrated graphics chip and a beefier discrete GPU, this problem will hit you. Your browser usually runs its graphics work on the integrated GPU, so if your WebGL app needs more resources, you're out of luck: NVIDIA Optimus won't switch you to the high-performance GPU, and there's no API to select which GPU to run on.

There's the blanket option of tweaking your settings to run the whole browser on the high-performance GPU (for example, through NVIDIA's control panel), but that's way too cumbersome, and it can't be a one-time "set it and forget it" thing either: you don't want your browser using high-performance graphics all the time, unless you don't care about your laptop's battery life at all. In other words, if there were a graphics-intensive WebGL game, playing it on a gaming laptop would not be a pleasant experience.

I wish there were some standard API for picking the GPU your WebGL content runs on... or at least I wish I knew why such a thing can't exist.

UPDATE: It has been pointed out to me that there is a special context attribute, powerPreference, that addresses this problem. However, at the time of this writing (July 2017) it appears to be properly supported only by Safari. There seems to be some work going on in Chrome as well. Let's hope full support in the major browsers arrives soon!
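
For reference, the attribute is passed at context creation time. Per the spec it's only a hint, and browsers that don't implement it will simply ignore it:

    const canvas = document.querySelector('canvas');

    // Hint that this context would like to run on the high-performance GPU.
    // Accepted values are 'default', 'high-performance' and 'low-power'.
    const gl = canvas.getContext('webgl', {
      powerPreference: 'high-performance',
    });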

Tooling

It would be unfair to say that WebGL has no debugging tools at all, but the existing ones leave much to be desired.

There's Chrome's WebGL Inspector, which appears to be unmaintained. It works somewhat, but it can break spontaneously: something as simple as enabling an extension is enough to throw it off. And I haven't even tried it with WebGL 2.

Firefox has a nice Canvas Debugger that captures draw calls and shows you the framebuffer, but it's somewhat limited: I couldn't find a way to inspect individual textures or buffers. It also sometimes fails to capture a frame because it can't find a requestAnimationFrame loop, even though one is clearly there.

There's a nice debugging tool available for Three.js, but once again, it's not as capable as the desktop tools, and what if you don't want to use Three.js?

I find the available options lacking compared to RenderDoc. I can't stress enough how important a good graphics debugging tool is: it can save you tons of time. And one of WebGL's promises is rapid prototyping, so shouldn't it have a good debugger that helps you find problems more efficiently?
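
In the meantime, the do-it-yourself fallback is to wrap the context and check getError after every call, which is roughly what the Khronos webgl-debug.js helper does. A minimal sketch using a Proxy (slow, so only for debugging builds):

    function makeDebugContext(gl) {
      return new Proxy(gl, {
        get(target, name) {
          const value = target[name];
          if (typeof value !== 'function') return value;
          // Don't wrap getError itself, or we'd consume the error flag twice.
          if (name === 'getError') return value.bind(target);
          return function (...args) {
            const result = value.apply(target, args);
            const error = target.getError();
            if (error !== target.NO_ERROR) {
              console.error('WebGL error ' + error + ' after gl.' + String(name));
            }
            return result;
          };
        },
      });
    }

    // Usage: wrap the context right after creation.
    const canvas = document.querySelector('canvas');
    const gl = makeDebugContext(canvas.getContext('webgl'));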

Conclusion

If you know how to get around any of these, or if there's work being done on any of them, please let me know!

