
Why do GTK/GNOME devs like the author of blog.elementary.io/what-is-hidpi/ refuse to implement fractional scaling and instead cling to stupid excuses such as "there are no fractional pixels, so true fractional scaling is impossible"? Web browsers have been doing arbitrary scaling for a decade now. Just tell the font renderer to render the fucking font at 30px instead of 20px when the scale is 1.5x. Is that so difficult?
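Roughly what I mean, as a sketch with Pango/Cairo (untested, and draw_label is just a name I made up, not a real GTK entry point): the toolkit already knows the font size in logical pixels, so multiply it by the scale before rasterizing and you're done.

    /* untested sketch: draw a "20 px" label at 1.5x by asking Pango for a
     * 30 px font -- the glyph outlines are rasterized at that size directly,
     * no raster resizing anywhere */
    #include <pango/pangocairo.h>

    static void draw_label(cairo_t *cr, const char *text,
                           double base_px, double scale)
    {
        PangoLayout *layout = pango_cairo_create_layout(cr);
        PangoFontDescription *desc = pango_font_description_from_string("Sans");

        /* e.g. 20 px * 1.5 = 30 px, expressed in Pango units */
        pango_font_description_set_absolute_size(desc, base_px * scale * PANGO_SCALE);
        pango_layout_set_font_description(layout, desc);
        pango_layout_set_text(layout, text, -1);
        pango_cairo_show_layout(cr, layout);

        pango_font_description_free(desc);
        g_object_unref(layout);
    }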



that just aint it chief sorry hahah

?

>year 2052
>playing call of duty: modern warfare 1 (2051 version) on my GNU OpenGPU
>need to maintain a precise distance for enemies to be visible, because the driver doesn't support non-power-of-2 scaling for models
>still better than using windows

Surely he is right at least that displays which need fractional scaling are a mistake. If I understand it correctly, with integer scaling you just split each pixel into four and render at higher quality, but with fractional scaling you don't have a sensible concept of where a pixel is.
That said, I don't see why it shouldn't be supported, or why they shouldn't let people make their own bad decisions.
The whole thing seems a bit mental to me. The real solution would be to design UIs that scale properly instead of relying on weird monitor trickery. I guess that would be really hard to make work, though.

It's not tricky at all; rasterizing Béziers and rectangles at arbitrary scales is done constantly on GPUs. Even a GNOME dev ought to admit their reasoning would be retarded if applied to video games, for instance. In a sane graphical environment, projecting samples onto the pixel grid should be entirely orthogonal to how the content is described.

Fractional pixels don't exist though.
Integer scaling + fucking around with the font size is almost always a better solution.

Found the GNOME retard.

Using Gnome/GTK to begin with lol

>there are no fractional pixels
This is not wrong. The guy in the link knows what HiDPI actually means.
>Just tell the font renderer to render the fucking font at 30px instead of 20px when the scale is 1.5x. Is that so difficult?
Learn the difference between resizing and scaling.

His argument is stupid because the UI element whose clarity matters most for long-term use, the font, can already be drawn at an arbitrary resolution. The same goes for anything vector-based or procedurally generated, which covers a big fraction of UI elements on modern OSes, including the author's own. A quality desktop environment would draw such elements directly at the desired resolution and only resort to raster resizing for actual raster images. Instead, the GTK/GNOME/Wayland devs chose a "render at 2x, then downscale" trick for everything, essentially deciding "we can't get this one case right, therefore we'll give up completely instead of doing the best we can". That is clearly retarded. People using scales like 1.2x or 1.5x shouldn't be punished with an algorithm that butchers subpixel rendering just because the devs were dumb and lazy.
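To be concrete about "draw directly at the desired resolution": with Cairo (which GTK draws through), a fractional scale on the context is all it takes for vector content. This is my own sketch, not GNOME code, but the calls are real:

    /* sketch: set a fractional scale on the context; every path, gradient and
     * glyph drawn afterwards is rasterized on the real pixel grid at that
     * scale -- antialiasing handles the "fractional pixels" */
    #include <cairo.h>

    static void paint_at_scale(cairo_surface_t *target, double scale /* e.g. 1.25 */)
    {
        cairo_t *cr = cairo_create(target);
        cairo_scale(cr, scale, scale);

        cairo_rectangle(cr, 10, 10, 100, 40);      /* logical units */
        cairo_set_source_rgb(cr, 0.2, 0.4, 0.8);
        cairo_fill(cr);

        cairo_destroy(cr);
    }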

Because, I shit you not, the only way they implement it is by rendering at 300% and then scaling the result down to half that size.
As you can imagine, the overhead this creates is stupidly high, and it looks like ass.
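For contrast, here is my sketch of the trick being described (not actual Mutter code): rasterize the whole window at 300%, then filter it down to the 150% target, i.e. four times the pixels you actually display, followed by a blurring downsample.

    /* sketch of the criticized approach: supersample at an integer scale,
     * then downscale the finished raster */
    #include <cairo.h>

    static void blit_downscaled(cairo_t *screen_cr,
                                cairo_surface_t *rendered_at_3x,
                                double desired_scale)      /* e.g. 1.5 */
    {
        double shrink = desired_scale / 3.0;               /* 0.5 */

        cairo_save(screen_cr);
        cairo_scale(screen_cr, shrink, shrink);
        cairo_set_source_surface(screen_cr, rendered_at_3x, 0, 0);
        cairo_pattern_set_filter(cairo_get_source(screen_cr), CAIRO_FILTER_GOOD);
        cairo_paint(screen_cr);
        cairo_restore(screen_cr);
    }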

>1.2x instead of 2x
Not my problem. If you are such a specialist, send MRs to pango.

Just because Apple produces screens that suit integer scaling and is okay with its OS looking like shit on anything else, and the GNOME/Wayland people are stupid enough to follow Apple's lead on this, doesn't mean that it's the only way or even a good way. There's nothing preventing the desktop environment from passing a DPI/scale value to apps, which could then use it to render themselves at the appropriate resolution. As I mentioned earlier, web browsers have been capable of rendering at arbitrary scales for a decade now. Why would doing the same in a desktop UI be fundamentally impossible?
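Sketch of what "passing a DPI/scale value to apps" would look like on the client side (nothing here is a real Wayland call; the scale is hypothetically handed over by the compositor): allocate a buffer of the right pixel size and draw into it at that scale, so nothing ever needs to be resampled afterwards.

    #include <cairo.h>
    #include <math.h>

    /* hypothetical: `scale` would arrive from the compositor, e.g. 1.5 */
    static cairo_surface_t *render_window(double logical_w, double logical_h,
                                          double scale)
    {
        int px_w = (int)ceil(logical_w * scale);
        int px_h = (int)ceil(logical_h * scale);

        cairo_surface_t *buf =
            cairo_image_surface_create(CAIRO_FORMAT_ARGB32, px_w, px_h);
        cairo_t *cr = cairo_create(buf);
        cairo_scale(cr, scale, scale);

        /* ...the app draws its UI here in logical units... */

        cairo_destroy(cr);
        return buf;    /* handed to the compositor as-is, no resampling needed */
    }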

With Wayland, the compositor is dealing with raster images. It knows absolutely nothing about the client's contents and cannot scale them well; no matter what, the compositor's fractional scaling is just going to look fucking awful, because _fractional pixels don't exist_.
Proper scaling like you're talking about is a client problem.

Hey Basically I'm just not gonna send it (the MR!!)
I know.... UGH I know..... I'm sorry!!!!!!!!!!!!!!!!!!!
It's just I'd rather switch to mac is all.

Fairly certain web browsers don't actually scale any of the content. They just resize it.

So go use mac, homo. Why are you seething about this?

>Web browsers have been capable of rendering at arbitrary scales for a decade now
Because the web browser is actually the one rendering everything.
Imagine every web page element (including text) were just a pre-rendered PNG and see how well scaling works then.

Electron clients respect Xft.dpi and scale their entire content based on it, including fractional scaling. Some other apps and toolkits do as well.
Qt has its own environment variable, because of course it does, but handles this just as well, including fractional scaling.
Only GTK refuses to implement this shit.
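For reference, "respecting Xft.dpi" on X11 boils down to something like this (my sketch, but XGetDefault is a real Xlib call); Qt's own variable is QT_SCALE_FACTOR.

    #include <X11/Xlib.h>
    #include <stdlib.h>

    /* read Xft.dpi from the X resource database and turn it into a scale
     * factor relative to the 96 dpi baseline: 120 -> 1.25x, 144 -> 1.5x */
    static double scale_from_xft_dpi(Display *dpy)
    {
        const char *dpi_str = XGetDefault(dpy, "Xft", "dpi");
        double dpi = dpi_str ? atof(dpi_str) : 96.0;
        return dpi / 96.0;
    }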

GNOME does have experimental scaling, but you have to enable it through gsettings. It's shit because it blurs everything. Plasma on Wayland has the same blur issue, afaik.

Web browsers do not scale content; they resize it loosely. That is not accurate scaling.

Yes? That doesn't counter my point at all.
GNOME is retarded beyond belief; GTK4 should be the one implementing fractional scaling, and Mutter shouldn't _have_ to.

>fractional pixels don't exist
I don't know why people keep repeating this useless cliché, which is both obvious and irrelevant in the context of vector graphics.
>Proper scaling like you're talking about is a client problem.
I understand. It would have to be the toolkits which do this, rather than the Wayland compositor. It would work very well if the compositor passed either a scale float or a DPI int to the client and notified the client when it changed. Unfortunately, the protocol designers decided to make the scale factor an int, which means that the next DPI that applications can render at after 96 is 192. Nothing in between is possible. This is a regression from X, which at least let you set an arbitrary DPI (though not per-screen or without a restart).
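To spell out the int-vs-float difference (trivial arithmetic, but it's the whole point):

    #include <stdio.h>

    int main(void)
    {
        /* an integer scale factor only ever yields 96, 192, 288 dpi... */
        for (int s = 1; s <= 3; s++)
            printf("scale %dx   -> %d dpi\n", s, 96 * s);

        /* ...while a float would also allow 120, 144, 168 dpi and so on */
        for (double f = 1.25; f <= 1.75; f += 0.25)
            printf("scale %.2fx -> %.0f dpi\n", f, 96.0 * f);
        return 0;
    }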

That's wrong. If you set 125% scale in a web browser, it will draw the fonts/gradients/etc. perfectly at 1.25x their original size. It won't just draw them at 1x and then resize the resulting raster image.

Another excuse. All the fonts a typical user sees are vector rather than bitmap, and so are a great chunk of the UI elements in modern toolkits. None of it is actually pre-rendered and stored as bitmaps.

That's because GNOME is doing the 2x+downscale trick instead of actually rendering at the proper resolution.

Wrong again. It does scale non-raster elements accurately. It's just raster images which get raster-resized, which is the best the browser can do.

>Elementary OS
>Apple design philosophy
>You are cheating if you don't pay
>Their app store includes many simple applications that require you to pay

It's a nice-looking desktop environment, but their OS is cancer.

web.archive.org/web/20150211113458/http://blog.elementaryos.org/post/110645528530/payments

It does, because it doesn't matter where they implement it, and at no point does anyone have to "scale pre-rendered PNGs", because everyone except GNOME already supports proper scaling.