I thought it would be nice to be able to write games in JavaScript that don’t need a full-blown web browser, so I wrote gwebgl to provide gjs with an API that’s compatible with WebGL. The trouble is, it doesn’t work with GTK4. The simple example is supposed to clear the window to black, but it stubbornly stays GTK’s default light grey, and no errors are reported.
It works in GTK3. Even weirder, if I write the same code (as closely as possible) in C, even using the Gwebgl wrappers instead of calling glClear etc. directly, it works in both GTK4 and GTK3. I’ve tried X and Wayland, AMD and Intel graphics, and Fedora as well as Arch. They all behave the same: the combination of gjs and GTK4 doesn’t work.
Also, in every case, GTK/GDK ignores the required_version and use_es settings and returns a GL version of 4.6. That apparently isn’t backwards compatible with OpenGL (ES) 2.0, which is the API Gwebgl uses to keep the bindings simple. So that could have something to do with the problem, but then all the versions should fail.
Maybe I could use a “dumb” widget, get its low-level X11/Wayland handle, and create an OpenGL ES 2 context with EGL?
But isn’t that similar to what GtkGLArea/GdkGLContext do behind the scenes? IIRC that’s how it works in SDL. Or do Gtk widgets have a more complex relationship with the GPU, so OpenGL can’t render directly to their surfaces?
Perhaps I could make a new component based on GdkGLContext that’s specialised for OpenGL ES 2.0. It would probably be necessary anyway, because the current version only supports OpenGL 3.2+.
I gave more information than that.
I’ve tried 3 different machines, a laptop with Arch and Intel graphics, a NUC (Intel) with Fedora, and a desktop PC with a Radeon 6700XT. It was GNOME in all cases. I tested X11 and Wayland on the laptop, Wayland (I think) on the NUC, and X on the AMD. All with OSS drivers, so Mesa, presumably. Not sure whether the Intel systems had intel-media drivers or i965. I doubt the drivers are relevant because they all exhibit the same behaviour, across 2 different distros, and the problem is triggered by the combination of gjs and gtk4. If you still think it will help, I can post more details tomorrow.
It could be something to do with gtk being loaded after the main program (gjs) instead of being linked with it. So it might be worth trying another rewrite in python to test that theory.
OK, I can try that tomorrow too. Does GDK_DEBUG provide more information than gtk_gl_area_set_debug_enabled()? I already tried that, and I think the only extra information it printed was that 3.2 is the minimum supported version. IIRC I tried requesting version 3.2, and it still returned version 4.6 and ignored use_es, but I could be mistaken.
GtkGLArea only renders to an FBO; GTK then takes the backing texture and paints it inside the GTK render loop, at the right time and in the right place. In GTK4, only a top-level window can have a native windowing system surface.
That’s not correct: GTK also supports OpenGL ES 2.0.
There might be a bug with the creation of a GLES context, but as far as I know, GLES support has been tested on mobile hardware.
set_debug_enabled() asks the GL implementation to create a debug GL context, for instance with the EGL_CONTEXT_OPENGL_DEBUG_BIT_KHR flag set when using EGL.
GDK_DEBUG is how you control the overall debugging support for GTK; GDK_DEBUG=opengl will print out the debugging messages from inside GDK that refer to the GL support inside GDK.
That’s not correct: GTK also supports OpenGL ES 2.0.
But apparently only if the platform supports ES and not “full” OpenGL. I’ve added some more debugging to my demo, and checked the source for GdkWaylandGLContext, which does this in its realize handler:
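The original snippet didn’t survive here; paraphrased from memory of the GTK 4.x Wayland backend (the exact code may differ), the relevant logic is how the realize handler decides whether to request an ES context:

```c
/* Paraphrased, not the exact GDK source: whether EGL is asked for an
 * ES context depends on a shared context and the GDK_DEBUG=gl-gles
 * debug flag, not on the user's use_es setting on the new context. */
share = /* the display's existing GL context, if any */;
use_es = GDK_DISPLAY_DEBUG_CHECK (display, GL_GLES) ||
         (share != NULL && gdk_gl_context_get_use_es (share));
```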
I don’t know whether share == context, where context is the one I’m creating, because there’s no public API for checking that. But even if they are the same object, gdk_gl_context_get_use_es() always returns false if the context doesn’t have its realized flag set, and the flag doesn’t get set until after the realize handler returns. So it’s basically impossible for a user to create an ES context if non-ES profiles are available.
If share != context, then is there an environment variable or something I can set to force the shared context to be ES?
I’ve managed to get a GLES context by setting the environment variable GDK_DEBUG=gl-gles, but it still hasn’t fixed the gjs/gtk4 issue. I suspect it’s something to do with different initialisation paths when the libraries are loaded via gi instead of the normal way. Can anyone think of where I need to look?
The reason it’s not working seems to be that the enums are not getting set on the WebGLRenderingContext correctly. I changed the invocation of glClear to this:
gl.clear(0x4000);
And now it appears to work correctly.
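For context, here’s a minimal, hypothetical sketch (a plain-object stand-in, not the real gwebgl context) of why the missing enums failed silently rather than with an error: reading an undefined property yields an undefined argument, which a GL-style bitmask parameter coerces to zero.

```javascript
// Hypothetical stand-in for a WebGLRenderingContext whose enum
// properties never got defined (the bug described above).
const gl = {
    lastMask: null,
    // GL coerces the bitmask argument to an unsigned integer.
    clear(mask) { this.lastMask = mask >>> 0; }
};

gl.clear(gl.COLOR_BUFFER_BIT); // property is undefined, so the mask becomes 0
console.log(gl.lastMask);      // 0 — an empty clear mask: nothing happens, no error

gl.clear(0x4000);              // the literal value of GL_COLOR_BUFFER_BIT
console.log(gl.lastMask);      // 16384 — the colour buffer actually gets cleared
```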
Even though you can technically do this with GDK_DEBUG, you really don’t want to, and it won’t even work in release builds. If you had two GtkGLAreas in the same program requesting different, incompatible client APIs, things would break quite badly.
You don’t need to share the context, or technically even use a GtkGLArea: you can just create a new context with EGL, render to a framebuffer texture, and then append it to a snapshot as an EGLImage. I think that would be the most portable way to support multiple EGL client APIs in the same program. Maybe at some point GtkGLArea should handle that case, though?
Brilliant, thanks for spotting that. It looks like I forgot to convert the underscores to hyphens when defining the ParamSpecs. I knew I had to do that, so I don’t know how I managed to forget.
Yes, Matthias Clasen has confirmed that’s a bug in #4221. Hopefully the fix will be relatively straightforward, and I will be able to use GtkGLArea instead of inventing my own version. At worst I should be able to just derive from the existing GdkGLContext classes and override the realize handlers to force ES. I’ll have to tap into the “private” headers to do that though.
Fixing the underscore/hyphen issue hasn’t solved the problem: gjs still returns undefined for these properties. It appears to consider capitalised property names invalid. If I change them to lower case it works, but then it’s no longer compatible with WebGL, which was the whole point. I don’t see anywhere in the docs saying that property names have to be lower case, and they work in C; has gjs got this wrong?
I suppose I’ll have to change gwebgl a bit so that either the properties are lower-cased and the JS wrappers have an ES6 Proxy to remap the names, or auto-generate the wrappers too and define the properties in those.
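A minimal sketch of the Proxy option, with illustrative names rather than gwebgl’s actual API: the GI-generated object exposes lower-cased enum properties, and a wrapper remaps WebGL-style upper-case names onto them.

```javascript
// Hypothetical wrapper: resolve WebGL-style UPPER_CASE enum names
// against an object that only defines lower_case properties.
function wrapAsWebGL(native) {
    return new Proxy(native, {
        get(target, prop) {
            if (typeof prop === 'string' &&
                prop === prop.toUpperCase() &&
                !(prop in target)) {
                const lower = prop.toLowerCase();
                if (lower in target)
                    return target[lower];
            }
            return target[prop];
        }
    });
}

// Stand-in for the GI-generated context with lower-cased enums.
const native = {
    color_buffer_bit: 0x4000,
    clear(mask) { this.lastMask = mask >>> 0; }
};

const gl = wrapAsWebGL(native);
gl.clear(gl.COLOR_BUFFER_BIT); // remapped to native.color_buffer_bit
console.log(gl.lastMask);      // 16384
```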