Full explanation of current HiDPI (fractional and integer) scaling support in Wayland

I have a 32 inch 4k screen (3840 x 2160), which I currently use at 200% integer scaling in GNOME Wayland. I am very happy with the current setup, but I have some questions about the underlying architecture / techniques:

  • How does integer scaling work for XWayland applications? I had issues with blurry applications because I had activated the experimental fractional scaling feature. I did not realize that this would also change the behavior for integer scales. I was very surprised today that on a fresh install, XWayland applications actually do scale for integer factors (tested with Steam, Spotify, Bitwarden, Firefox). I thought I understood why the apps weren’t able to do scaling under XWayland, but now apparently it is possible. How does this work?
  • Why does it no longer work for fractional scaling? If I understand correctly, for fractional scaling apps are rendered at an integer scale to a larger resolution and then downscaled (I sketch my mental model below). Why are XWayland apps not capable of rendering at this integer scale in this case, while they clearly can do it in the above situation?
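
To make that mental model concrete (this is just my assumption of how the downscaling step works, not something taken from documentation), here is a rough sketch using my monitor's numbers:

```c
/* My assumed mental model of fractional scaling, using my own
 * 3840x2160 monitor at a hypothetical 150% as the example. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const int phys_w = 3840, phys_h = 2160;
    const double scale = 1.5;                  /* fractional user scale */
    const int render_scale = (int)ceil(scale); /* integer scale apps render at */

    /* Logical size the compositor would expose for this monitor. */
    const int logical_w = (int)round(phys_w / scale); /* 2560 */
    const int logical_h = (int)round(phys_h / scale); /* 1440 */

    /* Apps render at the next integer scale... */
    const int buffer_w = logical_w * render_scale; /* 5120 */
    const int buffer_h = logical_h * render_scale; /* 2880 */

    /* ...and the compositor downscales that buffer onto the panel. */
    const double downscale = scale / render_scale; /* 0.75 */

    printf("logical %dx%d, render buffer %dx%d, downscale factor %.2f\n",
           logical_w, logical_h, buffer_w, buffer_h, downscale);
    return 0;
}
```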

I am running GNOME 43.3 on Arch Linux.

Thank you very much for your time. This is purely a request for information, not a demand for anything from the developers!


The default setting (i.e. the one that doesn’t work with fractional scaling) is, to simplify things a bit, compatible with the way X11 has always done HiDPI. This is also how GNOME Shell itself has worked since it learned how to do HiDPI on X11 many years ago: it simply draws things twice as large, using a single global integer scaling factor. This has some annoying consequences, such as the inability to conveniently draw things at the right size per monitor, to use fractional scaling factors, or to automatically scale HiDPI-unaware applications to make them usable.

In other words, in the default setting, X11 applications implement their HiDPI just as they always have, and the display server just goes along with it the way it always has.
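
As a minimal sketch of that classic model, assuming a single global integer factor (real toolkits read the Xsettings Gdk/WindowScalingFactor key; here the GDK_SCALE environment variable stands in for it, just to keep the sketch self-contained):

```c
#include <stdlib.h>

/* Read the single global integer scaling factor. */
static int global_scale(void)
{
    const char *s = getenv("GDK_SCALE"); /* stand-in for the Xsettings key */
    int scale = s ? atoi(s) : 1;
    return scale > 0 ? scale : 1;
}

/* Classic X11 HiDPI: the application itself draws everything
 * scale times larger. */
static int to_pixels(int logical_size)
{
    return logical_size * global_scale();
}
```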

When you enable the experimental setting, things change a bit: the coordinate space for each monitor is scaled according to its individual scaling factor, and instead of GNOME Shell manually drawing itself twice as large, it draws itself at an unchanged logical size but uses higher resolution images or font caches. The same applies to Wayland clients; for example, a HiDPI-aware Wayland surface with a logical size of 100x100 on a HiDPI monitor will actually provide 200x200 sized buffers containing the window content. This is roughly how most Wayland compositors work these days. In other words, if you have two physically similar monitors, one 2K and one 4K, in this mode they will both have a logical size equivalent to the 2K monitor’s, which happens to more correctly mimic their real world physical sizes.
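
For the 100x100 example, the client side boils down to a couple of wl_surface requests; a sketch, with buffer creation and the rest of the client omitted:

```c
#include <wayland-client.h>

/* A HiDPI-aware client with a 100x100 logical surface on a scale-2
 * monitor attaches a 200x200 buffer and declares its buffer scale. */
static void commit_hidpi_frame(struct wl_surface *surface,
                               struct wl_buffer *buffer_200x200)
{
    wl_surface_set_buffer_scale(surface, 2);           /* buffer is 2x logical */
    wl_surface_attach(surface, buffer_200x200, 0, 0);
    wl_surface_damage_buffer(surface, 0, 0, 200, 200); /* damage in buffer pixels */
    wl_surface_commit(surface);
}
```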

The problem with the new way is that it makes X11 applications believe they are always on a LoDPI monitor, even if that is not the case. To make them adequately sized on the screen, compositors will scale them up, and this is why they are blurry.


Thank you for taking the time to answer my questions!

It is still a bit unclear to me though.

You say that the default setting (the one that only allows integer scaling) at 200% draws everything twice as large. Since the image is still sharp and clear at this scale, each application must somehow be providing twice the number of pixels that it would on a LoDPI screen. Is this correct? My question then is: how does the XWayland application (and the native Wayland app, for that matter) know that it is supposed to do that? Does this still leave HiDPI-unaware applications looking small, or will the compositor notice this and scale them up to a blurry normal size?

If all of the above is the case, I cannot understand why the same cannot be achieved for fractional scaling. I have been reading a lot of issue threads on the XWayland GitLab and similar, and the problem they often pose is that a global scaling factor for all XWayland windows would not be good, as some might be HiDPI-aware and some might not. I assumed that this was also GNOME’s stance on the matter. However, it seems that in the integer scaling case this mixed situation also exists, and is seemingly “OK”?

My point being, if we are somehow able to draw XWayland applications sharp and clear at 200%, 300%, 400% using integer scaling, why can’t we let the XWayland clients use the increased buffer size when doing the integer scaling step of fractional scaling?

Yes, Wayland applications do this. The compositor then scales their logical size in the display coordinate space if they are on a HiDPI monitor. An example that illustrates this is moving a window between a LoDPI and a HiDPI monitor; during the move from one to the other it’ll suddenly flip to be larger/smaller.
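
As a sketch of how a client finds this out (assuming it stashed each output’s scale, received via the wl_output “scale” event, in that output’s user data):

```c
#include <stdint.h>
#include <wayland-client.h>

static int surface_scale = 1;

/* The surface entered an output; bump the scale if it's HiDPI. */
static void handle_enter(void *data, struct wl_surface *surface,
                         struct wl_output *output)
{
    int output_scale = (int)(intptr_t)wl_output_get_user_data(output);
    if (output_scale > surface_scale)
        surface_scale = output_scale;
    /* Re-render and call wl_surface_set_buffer_scale(surface, surface_scale);
     * this is the moment the window visibly "flips" size. */
}

static void handle_leave(void *data, struct wl_surface *surface,
                         struct wl_output *output)
{
    /* Recompute surface_scale from the outputs the surface still overlaps. */
}

static const struct wl_surface_listener surface_listener = {
    .enter = handle_enter,
    .leave = handle_leave,
};
```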

X11 applications either look at Xsettings to find the global scaling factor decided by the desktop environment, or try to figure it out themselves by looking at RANDR resources, e.g. each monitor’s resolution and physical dimensions. GNOME X11 applications, and X11 applications that seem to respect your global HiDPI scaling setting, tend to use the former.
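
A sketch of the second path (the Xsettings path needs the full XSETTINGS client protocol, so only the “figure it out from the monitor” fallback is shown, using core Xlib rather than RANDR for brevity):

```c
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;
    int screen = DefaultScreen(dpy);

    /* DPI = pixels / (millimetres / 25.4). */
    double dpi = 25.4 * DisplayWidth(dpy, screen)
               / DisplayWidthMM(dpy, screen);

    /* A typical heuristic: treat ~192 DPI and up as a 2x screen. */
    printf("estimated DPI %.1f -> scale factor %d\n", dpi, dpi >= 192.0 ? 2 : 1);

    XCloseDisplay(dpy);
    return 0;
}
```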

X11 applications will remain small; Wayland applications will be scaled correctly.

The problem is that Xwayland in this case will be made to believe it’s LoDPI and always treated as such. Clients respecting the Xsettings will draw themselves as LoDPI, and applications looking at RANDR resources will see LoDPI resources and scale themselves accordingly. The reason for this is simply how it is implemented, both in GNOME and in most Wayland compositors: the global coordinate space used by the display server acts on “logical” pixels, where each logical pixel is meant to appear roughly the same size to the user no matter the DPI of the monitor it happens to be on.
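
A sketch of that logical pixel idea, with hypothetical compositor-side types (not actual mutter code):

```c
/* Hypothetical compositor-side view: the global space is in logical
 * pixels; each monitor maps them to device pixels with its own scale. */
struct monitor {
    double logical_x, logical_y; /* position in the global logical space */
    int scale;                   /* 1 on LoDPI, 2 on HiDPI, ... */
};

/* Where a logical point lands in a monitor's device pixels. */
static void logical_to_device(const struct monitor *m,
                              double lx, double ly,
                              double *dx, double *dy)
{
    *dx = (lx - m->logical_x) * m->scale;
    *dy = (ly - m->logical_y) * m->scale;
}
```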

It is due to how the coordinate spaces in the display servers are managed, and how HiDPI works in X11, i.e. where the clients themselves individually and optionally produce HiDPI-compatible content. With the old (default) way, legacy X11 applications will forever be unusable due to being too tiny to use; the new way solves this at the cost of HiDPI-capable X11 applications.

Anyhow, there are ways being explored, e.g. in Draft: xwayland: Multi DPI support via global factor rescaling [updated using properties] (!733) · Merge requests · xorg / xserver · GitLab, that attempt to make HiDPI X11 applications appear crisp, at the cost of making legacy LoDPI X11 applications even more unusable.


Thank you so much for your thorough explanation, I understand it now.

It seems that this was what confused me the most. In the default GNOME Wayland session the tradeoff is decided in favor of HiDPI-aware X11 applications, so all applications are cleanly scaled except HiDPI-unaware XWayland apps. In the experimental version, this decision is flipped (because it is more recent, I suppose) in favor of having all XWayland applications blurry, just to make sure that HiDPI-unaware XWayland applications are not rendered tiny.

Personally, I feel that the default behavior is way better for a non-technical user. A lot of popular applications still run through XWayland (as I mentioned: Steam, Spotify, Bitwarden, a lot of video games), but I have yet to find an app that is actually HiDPI-unaware under X11. Having a lot of those popular applications unable to render crisply to your screen is a real issue in my opinion. Especially for games, the new solution is totally unacceptable, and I even implemented a GNOME extension for quickly changing the display scale so that the game can actually see the full resolution.

HiDPI-unaware XWayland apps being tiny can more easily be solved on a per-app basis (like checking for some internal size setting or font changes), while there is nothing the user can do to fix the blurriness of the experimental implementation.

The blurriness actually caused me to avoid installing any XWayland applications, or to search for ways to run them natively. The new approach actually increases the number of problematic applications (only HiDPI-unaware XWayland apps vs. all XWayland apps); only the type of problem is different (tiny rendering vs. not being crisp).

Again, thank you so much for your answers! It is really nice to have an overview of the situation in a way that is clear and understandable.

I have not looked into the chain of PRs, but why is there not also a property that the client can set on its windows to denote scale? It seems that could work the same way Wayland buffer scale does (i.e. default to 1 if it is not present), so it would not break legacy LoDPI apps.
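
Something along these lines, purely hypothetical (the _EXAMPLE_WINDOW_SCALE atom is made up; no such standard property exists):

```c
#include <X11/Xlib.h>
#include <X11/Xatom.h>

/* Hypothetical: a client advertises the scale it rendered at on its
 * own window; a missing property would mean scale 1, like Wayland
 * buffer scale. */
static void advertise_window_scale(Display *dpy, Window win, long scale)
{
    Atom prop = XInternAtom(dpy, "_EXAMPLE_WINDOW_SCALE", False);
    XChangeProperty(dpy, win, prop, XA_CARDINAL, 32,
                    PropModeReplace, (unsigned char *)&scale, 1);
}
```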

And there is no way to tell what behavior the X11 application expects?

Maybe it would be helpful to have a list of which applications prefer which behavior. If it turns out that one list is significantly larger than the other, that would help decide what to optimize for. It sounds like the applications used by @knokelmaat are all HiDPI-aware already, for instance.


And there is no way to tell what behavior the X11 application expects?

We can set a GDK-namespaced scaling factor in Xsettings that they are free to ignore, we can set “HiDPI-like” monitor dimensions hoping clients do their tricks to become larger, and we can manipulate Xrdb/Xresources-like files to set ridiculous font sizes, hoping that the clients which ignored the other HiDPI hints will at least be old enough to look at these files, and that the clients which did become HiDPI from the first hints stay away from the Xrdb/Xresources stuff.
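
Concretely, the three kinds of hints amount to values like these for a factor of 2 (a sketch; the monitor numbers are made up, roughly matching a 32-inch 4K panel):

```c
#include <stdio.h>

int main(void)
{
    int factor = 2;

    /* 1. GDK-namespaced Xsettings key, which clients are free to ignore. */
    printf("Xsettings: Gdk/WindowScalingFactor = %d\n", factor);

    /* 2. "HiDPI-like" monitor dimensions: report fewer physical
     *    millimetres so resolution/size math yields ~192 DPI. */
    int width_px = 3840, real_width_mm = 700;
    printf("advertised monitor: %d px, %d mm (real: %d mm)\n",
           width_px, real_width_mm / factor, real_width_mm);

    /* 3. Old-school resource database entry with a "ridiculous" DPI. */
    printf("Xresources line: Xft.dpi: %d\n", 96 * factor);
    return 0;
}
```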

And then, with all this, assume all of Xwayland is HiDPI and do a whole lot of back-and-forth conversion between coordinate spaces in the X11 window management, together with some special casing in the Wayland input emission code and surface code, to translate between the “fake” X11 coordinate space and the compositor coordinate space. This is the mode where real legacy LoDPI clients will look microscopically tiny, if they didn’t use mega font sizes discovered from the old school X11 configuration files to compensate.
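
The back-and-forth itself is conceptually just rescaling, something like this (a hypothetical sketch, assuming a global factor of 2):

```c
/* Hypothetical sketch: Xwayland is told everything is 2x, so the window
 * manager must rescale on every input-emission and configure path. */
static const int xwl_scale = 2; /* assumed global Xwayland factor */

/* Compositor logical coordinates -> "fake" X11 canvas coordinates. */
static int logical_to_xwl(double logical) { return (int)(logical * xwl_scale); }

/* "Fake" X11 canvas coordinates -> compositor logical coordinates. */
static double xwl_to_logical(int xwl) { return (double)xwl / xwl_scale; }

/* e.g. a pointer event at logical (512.5, 300.0) must be emitted to
 * X11 clients at (1025, 600), and a window mapped at X11 (1025, 600)
 * actually sits at logical (512.5, 300.0). */
```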

This is roughly what KDE is doing if one goes into the display panel of their settings app and checks some check box, and it kind of works most of the time, but not always.

The code for the mentioned coordinate space back-and-forth does not yet exist in mutter.


There is no such thing; in X11, clients get a large canvas and do what they want. There hasn’t really been a need for compositors to know anything about their scales, since, in the pure X11 world, they couldn’t really do anything with the information anyway.

But with the scale-monitor-framebuffer case, aka the de facto “Wayland style” coordinate space, this just complicates things, because X11 is still a single large canvas; a HiDPI window effectively needs a twice-as-large canvas to do its thing (place popups, know where the screen edge is, etc.) compared to a LoDPI window. This isn’t an issue in Wayland, because clients don’t know their position nor where any screen edges are anyway.
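
A sketch of why those two views conflict (hypothetical numbers):

```c
/* Hypothetical sketch: the same logical desktop implies a different
 * X11 canvas size depending on the scale a window believes in. */
struct canvas { int width, height; };

static struct canvas x11_canvas_for_scale(int logical_w, int logical_h,
                                          int scale)
{
    /* For a 1920x1080 logical desktop: a scale-2 window wants screen
     * edges at 3840x2160 to place popups correctly, a scale-1 window
     * wants them at 1920x1080 -- but X11 has only one canvas. */
    struct canvas c = { logical_w * scale, logical_h * scale };
    return c;
}
```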


Yeah, that occurred to me after I typed it; there is simply no way to do that on X11 without breaking the protocol. Or hacking up the server so it isolates clients like Wayland does, which would also effectively break a lot of clients…
