Adventures in mixed HiDPI

I've just finished looking at all the major operating systems and their support for multiple monitors at different fractional scales. I'll summarize the results of my research here in case someone else finds them useful. Note that I only consider an implementation adequate if it can:
- change the scale/DPI on the fly when a window is moved between monitors, and
- render fonts and other vector elements directly at the desired resolution instead of using raster scaling tricks which make things blurry.
Results from best to worst are in the post below.

Attached: Windows_settings_display.png (2104x1427, 98.38K)

cont. 1. Windows: PASS. As a freetard it feels weird to say this, but Windows currently does it best. While it will still raster-scale legacy programs, Windows 10 introduced an API that lets apps declare they can draw themselves at any desired DPI. After an app opts in, Windows just sends it DPI-change events when needed, and the app handles the rest (sketch after the list).
2. KDE on X11: PASS. Seems to work similarly to Windows, as long as you only use Qt apps. It uses some Qt-specific magic I don't fully understand (sketch after the list).
3. Anything else on X11: FAIL. X11 does let you set an arbitrary DPI, but it's a single one for all monitors (sketch after the list). Approximating a mixed-DPI setup requires xrandr scaling hacks that affect other things such as cursor speed and font display quality.
4. macOS: FAIL. Apple makes screens designed for either 1x or 2x scaling, and only these two cases are handled well. For anything in between, macOS uses a 2x-then-downsize trick. It tells every app, even ones that could render at any scale just fine such as browsers and terminals, to render at 2x. After they do this, using all the subpixel rendering tricks which took countless hours to perfect in order to make fonts look great on LCDs, macOS takes the result and just raster-scales it down with a simple algorithm. This is a fantastic way to butcher both performance and visual quality. It also means that it's impossible for an image viewer or video player to render the pixels directly even if the media resolution matches your screen.
5. Wayland: FAIL. Wayland does the same thing as macOS and only supports integer scales in the protocol (sketch after the list). While the KDE/Sway people understand why this is insufficient, the GNOME/GTK people shit out a stream of excuses, implying that because it's impossible to scale raster elements well, they're not going to try to make fonts and other vector elements look good either. The most prominent example is blog.elementary.io/what-is-hidpi/
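
For item 1, a minimal sketch of what the Windows opt-in looks like in Win32/C++ (needs Windows 10 1703+; window registration, painting and error handling omitted, and real apps usually declare the awareness in their application manifest rather than calling the API):

```cpp
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
    switch (msg) {
    case WM_DPICHANGED: {
        // Sent when the window lands on a monitor with a different DPI.
        // wParam carries the new DPI, lParam points to a suggested new rect.
        UINT newDpi = HIWORD(wParam);
        RECT* r = reinterpret_cast<RECT*>(lParam);
        SetWindowPos(hwnd, nullptr, r->left, r->top,
                     r->right - r->left, r->bottom - r->top,
                     SWP_NOZORDER | SWP_NOACTIVATE);
        // ...rebuild fonts/layout for newDpi and repaint; no raster scaling...
        (void)newDpi;
        return 0;
    }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

int WINAPI wWinMain(HINSTANCE, HINSTANCE, PWSTR, int) {
    // The opt-in: "I can draw myself at any DPI, don't bitmap-stretch me."
    SetProcessDpiAwarenessContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);
    // ...register window class, create window, run message loop...
    return 0;
}
```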
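
For item 2, as far as I can tell the magic is that Qt keeps a scale factor per QScreen (KDE can feed it per-output factors via the QT_SCREEN_SCALE_FACTORS environment variable) and emits a signal when a window changes screens. Sketch, assuming Qt Widgets:

```cpp
#include <QApplication>
#include <QDebug>
#include <QScreen>
#include <QWidget>
#include <QWindow>

int main(int argc, char** argv) {
    QApplication app(argc, argv);

    QWidget w;
    w.show();  // windowHandle() is valid once the native window exists

    // Fires when the window moves onto a different QScreen; Qt then
    // repaints vector content (fonts etc.) at that screen's factor.
    QObject::connect(w.windowHandle(), &QWindow::screenChanged,
                     [](QScreen* s) {
        qDebug() << "now on" << s->name()
                 << "devicePixelRatio" << s->devicePixelRatio();
    });
    return app.exec();
}
```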
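
For item 3, the "single DPI" is the Xft.dpi resource, one value for the whole display no matter how many monitors hang off it. Reading it, assuming Xlib:

```cpp
#include <X11/Xlib.h>
#include <cstdio>

int main() {
    Display* dpy = XOpenDisplay(nullptr);
    if (!dpy) return 1;
    // One value per display; there is no per-monitor variant of this.
    const char* dpi = XGetDefault(dpy, "Xft", "dpi");
    std::printf("Xft.dpi = %s\n", dpi ? dpi : "(unset)");
    XCloseDisplay(dpy);
    return 0;
}
```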
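
And for item 5, the core protocol literally types the output scale as an int32, so a compositor couldn't hand an app 1.5 even if it wanted to. Sketch, assuming wayland-client:

```cpp
#include <wayland-client.h>
#include <cstdio>
#include <cstring>

// The point: 'factor' is an int32_t. The core protocol cannot deliver 1.5.
static void on_scale(void*, wl_output*, int32_t factor) {
    std::printf("wl_output scale: %d\n", factor);
}
static void on_geometry(void*, wl_output*, int32_t, int32_t, int32_t, int32_t,
                        int32_t, const char*, const char*, int32_t) {}
static void on_mode(void*, wl_output*, uint32_t, int32_t, int32_t, int32_t) {}
static void on_done(void*, wl_output*) {}

static const wl_output_listener output_listener = {
    on_geometry, on_mode, on_done, on_scale,
};

static void on_global(void*, wl_registry* reg, uint32_t name,
                      const char* iface, uint32_t) {
    if (std::strcmp(iface, "wl_output") == 0) {
        auto* out = static_cast<wl_output*>(
            wl_registry_bind(reg, name, &wl_output_interface, 2));
        wl_output_add_listener(out, &output_listener, nullptr);
    }
}
static void on_global_remove(void*, wl_registry*, uint32_t) {}

static const wl_registry_listener registry_listener = {
    on_global, on_global_remove,
};

int main() {
    wl_display* dpy = wl_display_connect(nullptr);
    if (!dpy) return 1;
    wl_registry* reg = wl_display_get_registry(dpy);
    wl_registry_add_listener(reg, &registry_listener, nullptr);
    wl_display_roundtrip(dpy);  // fetch globals
    wl_display_roundtrip(dpy);  // fetch output events, incl. scale
    wl_display_disconnect(dpy);
    return 0;
}
```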

Congrats wintoddlers, you win this one.

Can’t believe it’s getting worse with macOS. Can’t quite understand why, but apparently Apple Silicon makes the 2x question an even bigger deal: the GPU overhead of scaling a non-HiDPI resolution basically chops your GPU performance in half. Luckily 2x looks gorgeous, but if you want any more space, fuck you and your performance.

don't care
my 27" 1080p 60hz screen looks fine

How does macOS work with an external 4K monitor then? Everything at 200%?

>kde:pass
LMAO no it does not. the last time I used that piece of shit I had to google to find the scaling parameter hidden behind a trillion tabs. after adjusting it to 300% for my 4k screen and rebooting, the cursor was still tiny as fuck, but only when the desktop was in focus. god damn what a trainwreck

tldr: multi monitors with different res are for poorfags, or in other words, low IQs

>After they do this, using all the subpixel rendering tricks which took countless hours to perfect in order to make fonts look great on LCDs, macOS takes the result and just raster-scales it down with a simple algorithm.
lol no, pandasauce.org/post/linux-fonts/
"OS X has the absolute worst font rendering from the readability aspect. It does not use hinting at all and relies only on grayscale anti-aliasing. This combination makes everything appear very bold and blurry:"
Apple being fucking lazy with font rendering is one of the main reasons why they push higher resolutions: so they don't need to do sub-pixel stuff.

I recently tested a 5 year old mbp on an lg oled 55 cx and all five scale settings looked great. It was mirror mode though

You're right OP, I was jumping around DEs for a while before I settled on KDE because it's the only one which just werks.

You can set a fractional scale if you want, but for anything larger than 100% it will render at 200% and scale it down later. So you're taking the performance hit of 200% and getting blurry fonts even if you only need 125%.
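To put rough numbers on it (assuming the common case of a 27" 4K panel set to "looks like 2560x1440"): macOS renders a 2x backing store of the looks-like size and then resamples it down to the panel:

```latex
2 \times (2560 \times 1440) = 5120 \times 2880
\qquad
\frac{5120 \cdot 2880}{3840 \cdot 2160} = \frac{16}{9} \approx 1.78
```

So you pay for roughly 1.78x the native pixel count every frame, and the downsample still blurs the result.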

And this is why hi-dpi will always be a meme.

Like OP said, it pretty much works fine on windows. Everyone else unfortunately implemented it in a retarded way.

Actually this can work perfectly fine on X11 with randr. It's just that barely anyone has actually implemented it (Qt being a notable exception). Randr sends events whenever the screen changes and you have access to the physical dimensions of the screen.
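
Sketch of the RandR side, assuming Xlib + libXrandr (note this is screen-level; figuring out which monitor a given window sits on still means intersecting its geometry with the CRTC layout):

```cpp
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>
#include <cstdio>

int main() {
    Display* dpy = XOpenDisplay(nullptr);
    if (!dpy) return 1;

    int ev_base = 0, err_base = 0;
    if (!XRRQueryExtension(dpy, &ev_base, &err_base)) return 1;

    // Ask for notifications whenever the screen configuration changes.
    XRRSelectInput(dpy, DefaultRootWindow(dpy), RRScreenChangeNotifyMask);

    for (;;) {  // toy event loop
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == ev_base + RRScreenChangeNotify) {
            auto* sc = reinterpret_cast<XRRScreenChangeNotifyEvent*>(&ev);
            if (sc->mwidth > 0) {  // physical width in millimetres
                double dpi = sc->width * 25.4 / sc->mwidth;
                std::printf("screen now %dx%d, ~%.0f dpi\n",
                            sc->width, sc->height, dpi);
            }
        }
    }
}
```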

>this common and useful hardware configuration is not supported well by the software
>"noooo it's your hardware that's bad, you're an idiot if you don't buy the exact configuration that makes thing easy for our shit programs!"
Retards like you are why Linux isn't winning on the desktop.

Agree. Sadly footfags think fractional scaling is too difficult and itoddlers don't have it either, so why bother.
They even killed it completely on Wayland even though Qt can do it on X11. Sad.

>year 2052
>playing call of duty: modern warfare 1 on my GNU OpenGPU
>need to maintain a precise distance for enemies to be visible, because the driver doesn't support non-power-of-2 scaling for models
>still better than using windows, non-power-of-2 scaling is useless bloat anyway

Get good eyes and stop using scaling, you fags.
27 inch 4k is ~163 DPI (math below) and can be used without scaling. Most text-heavy programs let you zoom in, so reading isn't a problem.
Save your UI space, especially today when a modern UI is 80% useless white space.
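The arithmetic behind that figure, for the record:

```latex
\mathrm{DPI} = \frac{\sqrt{3840^2 + 2160^2}}{27} \approx \frac{4406}{27} \approx 163
```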

1080p at 14" looks small and 1080p at 13" looks tiny to me. The contrast is also jarring when next to a 1080p monitor that's 22" or bigger.

Oh wow. I thought the slightly blurry fonts were a weird macOS design decision, but apparently Apple's font rendering is just that shitty. That explains a lot.

>Randr sends events whenever the screen changes
Does the window get notified when it gets moved from one monitor to another with a different DPI, or is this global? I'm only aware of the global settings, but I may have overlooked something.

>pandasauce.org/post/linux-fonts/
Btw, thanks for making me aware of this nice blog.