(solved) Why does GNOME use the "grayscale" antialiasing method by default?

If you want to read things into what I write, you’re free to do so. But since that’s not something I wrote, I would kindly ask you to stop replying to what you think I wrote, and instead reply to what I actually wrote, and to the context in which I wrote it.

Prefacing, I said:

In the context of “why isn’t subpixel AA the default”: beyond us not being able to detect which of the 4 possible RGB pixel orderings is appropriate for your display (hardware lies, so we cannot rely on the EDID), for future displays subpixel AA is going to be pointless, because the pixel density will remove the need for sub-pixel AA in the first place.

Yes: hidpi displays aren’t that common outside of laptops; and external hidpi displays are still expensive. This is why the option is still there, and won’t be removed. The writing’s on the wall, though, which means it won’t be made the default.
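For anyone who does want subpixel AA on their hardware, the option can be set per-user. A sketch using the GSettings keys under `org.gnome.desktop.interface` (key names as shipped in recent `gsettings-desktop-schemas`; verify with `gsettings list-keys org.gnome.desktop.interface` on your system):

```shell
# Switch font antialiasing from the default "grayscale" to subpixel rendering
gsettings set org.gnome.desktop.interface font-antialiasing 'rgba'

# Since the pixel ordering cannot be detected reliably (the EDID may lie),
# you have to pick the ordering that matches your panel yourself:
# one of 'rgb', 'bgr', 'vrgb', or 'vbgr'
gsettings set org.gnome.desktop.interface font-rgba-order 'rgb'

# Revert both keys to the upstream defaults
gsettings reset org.gnome.desktop.interface font-antialiasing
gsettings reset org.gnome.desktop.interface font-rgba-order
```

Note that this only changes the rendering for the current user; downstreams like Canonical can ship different schema defaults distribution-wide.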

Canonical, as a downstream, probably has a better grasp of how many people use their product, and on which displays. They are entirely free to change the defaults to something that makes sense for their users. That doesn’t mean GNOME upstream should do the same; otherwise we wouldn’t be using Wayland by default, for instance. GNOME maintainers and designers make their own decisions.

It’s an intentional decision, in the sense that “grayscale” is the most generic value for what is a very personal choice. Font rendering is much more about your own perception of what is “good” than it is about any objective reality.
