How does WebKitGTK (in Evolution) determine screen pixels per px?

My apologies for a cross-post from another forum, but I was unaware of this one until this morning. Thanks for your understanding. Anyhow, here's my quandary:

If I understand correctly, CSS currently defines a px as a device-independent unit nominally equal to 1/96 of an inch (the "reference pixel"). What I can't figure out is how WebKitGTK decides how many device pixels to devote to 1px.
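The reference-pixel arithmetic can be sketched as follows. This is just an illustration of the math, not WebKitGTK's actual code path; the assumption (matching the symptom described below) is that WebKitGTK is mapping 1 CSS px to exactly 1 device pixel instead of accounting for the display DPI:

```python
# Sketch of the CSS reference-pixel arithmetic. Assumption: the renderer
# *should* scale CSS px by display_dpi / 96, but is not doing so here.

CSS_PX_PER_INCH = 96  # CSS defines 1px = 1/96 in

def device_pixels(css_px: int, display_dpi: float) -> float:
    """Device pixels a css_px-wide box should occupy on a display_dpi screen."""
    return css_px * (display_dpi / CSS_PX_PER_INCH)

# The 614px-wide table from the email, on a 283 dpi panel:
print(round(device_pixels(614, 283)))  # ≈ 1810 device pixels, not 614
```

On a 96 dpi display the two units coincide (1 CSS px = 1 device pixel), which is why this mismatch only becomes visible on high-DPI panels.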

The underlying issue: I run the Evolution mail client (which uses WebKitGTK to display HTML email) on a 283 dpi display, with Xft.dpi set to 283 and with both xrandr and xdpyinfo reporting the correct physical dimensions and DPI. All of the fonts come out looking perfectly fine, but a <table> set to a width of 614px in an HTML email is rendered literally 614 device pixels wide, which is far too narrow for the contained text (itself rendered at a reasonable size).

So it seems to me that if I could "teach" GDK/GTK/WebKitGTK that 1px is roughly 3 device pixels, all would be well. But I can't figure out how to do that; in particular, GDK_DPI_SCALE=2.95 evolution does not seem to help. (Here 2.95 ≈ 283/96 = my_dpi/css_px_per_inch.)
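One thing I have not yet verified but that follows from how GTK3 documents these variables: GDK_DPI_SCALE only scales font rendering, while GDK_SCALE applies an integer scale factor to all logical pixels (widgets and, through WebKitGTK's device scale factor, CSS px as well). A hedged sketch of combining the two for this 283 dpi case:

```shell
# Untested sketch: GDK_SCALE must be a whole number, so round 283/96 ≈ 2.95
# up to 3, then use GDK_DPI_SCALE < 1 to undo the extra font enlargement
# (fonts are already correct here thanks to Xft.dpi).
GDK_SCALE=3 GDK_DPI_SCALE=0.98 evolution
# 3    = ceil(283/96), the integer window/layout scale
# 0.98 ≈ (283/96)/3, compensating font scale
```

Whether WebKitGTK honors GDK_SCALE for CSS px on a given version is exactly the open question here, so treat this as an experiment rather than a fix.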
