Visual Artifacts on Windows 11 with GTK 4.14.6 and AMD Graphics

I am experiencing visual artifacts when using GTK 4.14.6 on Windows 11 with an AMD graphics card (RX 580 X). The issues appear as graphical glitches or distortions within GTK-based applications. These artifacts disrupt the UI rendering, making the interface difficult to navigate and use.

The behavior can be seen in two attached videos. The first video shows glitches on the toolbar, while the second one highlights issues with the window buttons. In both cases, no custom user drawing is involved.

It’s also worth noting that these glitches do NOT appear when running the application with GTK 4.12.x.

[Video: Toolbar glitches]
[Video: Button glitches]

Could it be because of recent DMA features…

Hi,

There is no 4.14.6 release.
There is a tag in git, but it was created by mistake on the unstable snapshot 4.15.6.

If you want the latest stable, use 4.14.5 instead.

My bad. It should be 4.14.5. It works with Intel and NVIDIA on 4.14. On 4.12 it works just fine on any GPU.

Btw: I always use GitHub. In this case I took the 4-14 branch.

Ah, yes, I see, it’s just a revision bump in the build.

Are the GPU drivers up-to-date?

Can you test with the environment variable GSK_RENDERER=gl?
If it works, it’s probably an issue with the new ngl renderer. Look at the issues on GitLab; there are probably already bug reports. If not, create a new one.

The drivers are up-to-date. OK, I’ll do as you suggest.

And if the issue lies with NGL, is it possible to compile GTK to use the old GL model without relying on an environment variable, at least until a fix is available?

No, it’s not possible to select a specific renderer at compile time.

The intent is that everyone should be using the default renderer, because it’s the only one that is being developed and tested; the other renderers are there as a fallback mechanism in case of bugs.

Could I implement this: if an AMD graphics card is detected, the app automatically switches to the old GL renderer; otherwise, it uses NGL? I need a dynamic switch like this, if such a solution exists, to handle it automatically within the app. Otherwise, I will have to use 4.12.x.

You will need to write a wrapper binary that sets up the GL context, gets the GL renderer string, sets up the environment, and then launches your application in that environment.

Seems too complicated.

I was thinking: could I simply set the environment variable (GSK_RENDERER=gl) from my main routine before launching the GTK application?

You can call setenv() before GTK is initialized, but you can’t know the GL renderer without a) having a display connection and b) initialising GL, so you can only set the GSK_RENDERER environment variable unconditionally if you do that from your own main function.

I plan to detect if the graphics card is from AMD using native system calls, without relying on GTK for that task.

You still need to have a display connection and a GL context bound to something if you want to query which GL driver you’re using. That part has nothing to do with GTK.

GTK 4.16 fixed the problem.


Hi @Kemo_G_Kemic,

This is issue #6564 (“NGL: icon rendering artifacts on Windows”) on the GNOME/gtk GitLab, fixed for 4.16 by merge request !7473 (“Redo descriptors/texture binding”).


Thanks for the info!
