(solved) Why does GNOME use the "grayscale" antialiasing method by default?

Is there any reason why the “grayscale” antialiasing method is used by default in GNOME? On every computer I have used, I had to manually change that setting to rgba with gsettings set org.gnome.settings-daemon.plugins.xsettings antialiasing rgba to get better font appearance.
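For reference, a minimal sketch of the relevant gsettings invocations (same schema and key as above; gsettings range shows the values your version actually accepts, so treat the ones in the comments as what I have seen rather than a guarantee):

    # Show the values the key accepts (gsettings is part of GLib)
    gsettings range org.gnome.settings-daemon.plugins.xsettings antialiasing

    # Read the current value ('grayscale' is the upstream default)
    gsettings get org.gnome.settings-daemon.plugins.xsettings antialiasing

    # Switch to subpixel antialiasing
    gsettings set org.gnome.settings-daemon.plugins.xsettings antialiasing rgba

    # Return to the default at any time
    gsettings reset org.gnome.settings-daemon.plugins.xsettings antialiasing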

I would say this 11-year-old bug is relevant here: https://gitlab.gnome.org/GNOME/gnome-settings-daemon/issues/159

So, as I understand it, grayscale is kept as the default because it’s “safer”: with subpixel antialiasing, the linked bug becomes visible.

Are there any other reasons?

Subpixel AA is useless, if not directly harmful, on hidpi displays, so I expect it to become even less relevant in the future.

Subpixel AA is useless, if not directly harmful, on hidpi displays,

Can you give us some details…

Currently I am working on a 27-inch 4K display. When I got it two years ago I initially turned AA off completely, hoping I would not need it. But I was not really satisfied: some fonts and font sizes looked good without AA, but not all, so I turned it on again.

For the upcoming 8K displays I really hope that I can turn AA off entirely. Hopefully cairo will become faster when AA is turned off, too.

I am very thankful for your work around GNOME. Really. Seriously. But…

Hmmm. Let’s look at some statistics first:

Your sentence reads like: “Hey, dumb people with 1300x700! Do you want to use GNOME? YOU-ARE-DUMB! Take your money and buy a new HiDPI display/laptop NOW!”. I really hope this wasn’t your intention here, but at first sight that’s how it comes across. :cry:

It’s really sad for me, a 1300x700 user, that GNOME ignores 1300x700 users:

  • by forcing a default theme with too-big widgets,
  • by setting window sizes bigger than 1300x700 (GNOME Software bug),
  • by defaulting to an antialiasing method which looks like shit on non-HiDPI displays.

By the way: all of these things seem unimportant to GNOME but are important to… Canonical. Ubuntu fixes all of them in its default settings. It’s really sad that heavy manual intervention or Ubuntu downstream changes are needed to make GNOME usable on 1300x700.


I hope that the real reason for defaulting to grayscale is the linked unfixed bug, not an intentional decision.

This is mostly caused by the font itself, or by the rendering engine. At 4K resolutions, subpixel AA is basically invisible and just adds a bunch of work to the rendering pipeline.

Apple removed subpixel AA even as an “easter egg” option in macOS Mojave, after disabling it for a couple of releases.

Web rendering engines have also started disabling subpixel AA.

If you want to read stuff into what I write, you’re free to do so. Of course, it’s not something I wrote, so I would kindly ask you to stop replying to what you think I wrote, and reply to what I actually wrote, and the context in which I wrote it. I said:

Subpixel AA is useless, if not directly harmful, on hidpi displays,

In the context of “why isn’t subpixel AA the default”: outside of us not being able to detect which of the 4 possible settings of the RGB pixel ordering is appropriate for your display (hardware lies, so we cannot use the EDID), for future displays subpixel AA is going to be pointless, because the pixel density will remove the need for doing sub-pixel AA in the first place.
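(For the curious: those four orderings correspond to the rgba-order key that sits next to antialiasing in the same schema. A small sketch, assuming your gnome-settings-daemon version still ships that schema:)

    # List the subpixel layouts gnome-settings-daemon knows about;
    # the four orderings are rgb, bgr, vrgb and vbgr
    gsettings range org.gnome.settings-daemon.plugins.xsettings rgba-order

    # Only consulted when 'antialiasing' is set to 'rgba'
    gsettings set org.gnome.settings-daemon.plugins.xsettings rgba-order rgb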

Yes: HiDPI displays aren’t that common outside of laptops, and external HiDPI displays are still expensive. This is why the option is still there and won’t be removed. The writing’s on the wall, though, which means it won’t be made the default.

Canonical, as a downstream, probably has a better grasp of how many people use their product and on which displays. They are entirely free to change the defaults to something that makes sense for their users. That doesn’t mean GNOME upstream should do the same; otherwise we wouldn’t be using Wayland by default, for instance. GNOME maintainers and designers make their own decisions.

It’s an intentional decision, in the sense that “grayscale” is the most generic value for what is a very personal choice. Font rendering is much more about your own perception of what is “good” than it is about an objective reality.


Thanks, @ebassi.

I wonder how this works on M$ Windows: how does Windows Vista+ recognize the proper subpixel settings (in non-UWP apps)?

Please, grow up.

They don’t. They make a best effort, and they even have vendor support.

There is no incentive for display makers to have up-to-date or relevant information embedded in the display’s firmware; some vendors copy-paste the EDID data from one model to the next; I’ve seen displays from major manufacturers with physical dimensions set to 16 millimeters by 9 millimeters. There is no validation process for display information.

This is why “hardware lies”: there is no incentive to tell the truth.
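(If you want to see what your own panel claims, you can dump the EDID from sysfs; the sketch below assumes a Linux machine with the edid-decode tool installed, and the connector name card0-eDP-1 is just an example. In the base EDID block, bytes 21 and 22 carry the declared image size in centimetres, and the detailed timing descriptor repeats it in millimetres, which is where values like “16 by 9” end up.)

    # Find connected outputs (connector names vary per machine)
    grep -lx connected /sys/class/drm/card*-*/status

    # Decode the whole EDID block, including the claimed physical size
    edid-decode < /sys/class/drm/card0-eDP-1/edid

    # Or read the raw size bytes directly: offsets 21 and 22 hold the
    # maximum horizontal and vertical image size in cm (0 means unknown)
    od -An -tu1 -j21 -N2 /sys/class/drm/card0-eDP-1/edid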

In the absence of a source of truth, and because different people like different font rendering results on any platform, we can only punt the decision to the user.


Seeing this here and having no idea about these antialiasing technologies: is there any recommendation or guide? E.g. for HiDPI displays (> 125% scaling?) use grayscale, and above 200% use none?

BTW, one can also set this in gnome-tweaks; no need to use the command line.

Maybe just use what you like, what looks good for you. :slight_smile:

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.