How much better is it than 1080p?

Attached: AAuE7mB2_xdnswfpbWLeJC-U3FQv7REHm-eo5nCZ9Q=s900-mo-c-c0xffffffff-rj-k-no[1].jpg (900x900, 58.85K)

I want to watch 4k movies but a 4k player is $200 minimum and each individual 4k blu ray is usually at least $30

Way better on >150" screens if we're talking raw res; HDR is implemented like shit most of the time, so not much

I meant 120".
On >43"

HDR processing makes images look terrible.
I do not recommend viewing anything in HDR.

It’s only good for newer stuff that’s explicitly shot with HDR in mind. Retroactively adding HDR to something like Citizen Kane is fucking pointless.

What does HDR do exactly?

You ain't gonna see the difference if you don't sit at the ideal distance. How far are you from your TV and what's your screen size now? Sub-43" isn't worth it.

I see a ton of noise in a lot of 4k content. Even the biggest rips I can find online (60-120GB) look like shit. Better resolution, yes, but the fine detail is lost in noise. I dunno if this is a camera problem or something to do with my decoders. A moderate-quality 1080p rip usually comes in at 2GB and looks passable, 12GB for a fine-quality one. So logically 8GB 4K rips should look pretty good, but they look like SHIT.

most movies are actually rendered in 1080 and for 4k releases they are upscaled, which makes 5k or 8k even more pointless, and for streaming there's a lot of compression
you really only get the benefit of 4k in media if you're a videogame fag and even then it's not really very noticeable

OP here.
My TV is 43'' and I lie about 7 feet away from it. I was just watching Psycho on blu-ray and seeing flaws in it that I could imagine 4k would iron out.

I like the idea of having the best quality version of a movie.
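
For what it's worth, the rule of thumb behind those viewing-distance charts is that 20/20 vision resolves roughly one arcminute, i.e. about 60 pixels per degree. Rough Python sketch for OP's numbers (the 60 px/deg threshold and 16:9 geometry are assumptions, not gospel):

import math

def pixels_per_degree(diagonal_in, horizontal_res, distance_ft, aspect=16/9):
    """Angular pixel density of a 16:9 screen seen from a given distance."""
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)  # screen width from the diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_ft * 12)))  # horizontal field of view
    return horizontal_res / fov_deg

# OP's setup: 43" screen viewed from ~7 feet
for name, res in (("1080p", 1920), ("4K", 3840)):
    ppd = pixels_per_degree(43, res, 7)
    # above ~60 px/deg the individual pixels are already too small to resolve,
    # so extra resolution mostly goes to waste
    print(f"{name}: {ppd:.0f} px/deg")

Both come out well above 60 px/deg at 43" and 7 feet, which is the same point the replies here keep making: at that size and distance the resolution bump alone is hard to see.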

film scans go up to 6k. no need to upscale

about four times better

It gives you greater extremes in terms of brightness. Whites look insanely bright, and blacks are super deep and inky. One major problem is that there's no single HDR standard, which means every release is going to look a bit different on your TV compared to someone else's.

t. someone who tried to view hdr on a non-hdr screen or a failed hdr spec screen (or "fake" hdr)

Film captures dynamic range greater than the HDR spec. It gets compressed to standard range when being released to match the equipment capability. Anything shot on film can be rescanned with HDR in mind.

HDR, aka high dynamic range, is an increase in colour depth (10-bit, ~1 billion colours, vs 8-bit's ~16 million) plus a specification of contrast ratio and brightness (very bright, with a very high ratio between the bright and the dark) and colour gamut.
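
Side note on the colour-count figures in that post: they're just powers of two per channel, cubed across R/G/B. Quick sketch (bit depth only; gamut and transfer function are a separate story):

for bits in (8, 10):
    levels = 2 ** bits  # values per channel
    print(f"{bits}-bit: {levels} levels/channel -> {levels ** 3:,} colours")
# 8-bit:  256 levels/channel  -> 16,777,216 colours (~16.7 million)
# 10-bit: 1024 levels/channel -> 1,073,741,824 colours (~1.07 billion)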

2GB 1080p rips are dogshit quality and 8GB 4k rips are also dogshit quality. The "grain" comes from the film they were shot on, with 4k pushing near the limits of the "resolution" of the film negative. They could also be shitty upscales, but it's very unlikely.
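
On the rip sizes: a back-of-the-envelope sketch of why an 8GB 4K rip isn't actually generous. It assumes a 2-hour film at 24 fps and GB = 10^9 bytes; real encodes vary, this is just the arithmetic:

def bits_per_pixel(size_gb, width, height, minutes=120, fps=24):
    """Average encoded bits available per pixel per frame."""
    total_bits = size_gb * 1e9 * 8
    total_pixels = width * height * fps * minutes * 60
    return total_bits / total_pixels

print(f"2 GB 1080p: {bits_per_pixel(2, 1920, 1080):.3f} bits/pixel")
print(f"8 GB 4K:    {bits_per_pixel(8, 3840, 2160):.3f} bits/pixel")
# Both land around 0.045 bits/pixel, i.e. the 4K rip has no more headroom per
# pixel than the 1080p one, and film grain is about the hardest thing to encode.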

>most movies are actually rendered in 1080 and for 4k releases they are upscaled

They aren't rendered at all; rendering is not the right term, it's encoded. Also, that's not true at all. As long as they were shot on film stock, they have no "real" resolution, and 35mm film can go to about 6k before you start to see the crystals of the film itself. You just rescan the negative for the 4k release. Only some are shitty upscales.

He’s talking about modern digital movies shot on the Arri Alexa or other similar cameras.

>4k
>we're at 8k

Attached: 87654312.jpg (470x470, 29.52K)

Does the quality of your TV matter? It seems like you'd need a high-end QLED TV to really take advantage of it

>Anything shot on film can be rescanned with HDR in mind.
But it goes against the director’s original vision and won’t necessarily look better.

To get the best blacks, you want OLED. I think QLED is supposed to have better gold/yellow though.

Yes, the capability of the TV is very important. It needs a very high contrast ratio, good brightness, support for 10-bit colour depth, and a wide gamut covering most if not all of rec2020. The HDR certifications, however, are very, very easy to pass and you can get in with just about any bullshit. Buyer beware.

SDR was simply a limitation of equipment, not a choice or vision. HDR doesn't really change how it's presented either; it just brings it closer to the far wider dynamic range of real life instead of a very limited subset. It's not the same as some dogshit colour grade or post-process shit.

Fuck the director

You are going to see a difference, but not a >$200 difference in my opinion, user.
Only a few releases out there use a true 4k scan of the camera negative as the master; most of them start from a 2k master and get upscaled, which in my opinion always looks bad and very noisy.

Attached: optimal-viewing-distance-television-graph-size.png (547x461, 48.67K)

>SDR was simply a limitation of equipment, not a choice or vision.
On what planet is 70mm film considered SDR?

Attached: 61CAA4F1-DA2C-4038-8CD6-F6CB080DB41D.png (1920x1080, 2.8M)

Limitation of display equipment, retard. I've already stated film stock has a dynamic range above HDR. You can capture HDR all you want, but you have no way to play it back, which is why the releases were in SDR. Not even projectors playing the original stock could show HDR because of the limitations of contrast ratios.

But now we have ways of playing HDR, so remasters can go back and rescan film stock and release with it in mind, which is what they are doing.
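
The contrast-ratio point can be put in stops: dynamic range is log2(peak luminance / black level). Sketch with ballpark figures (the nit values below are assumptions for illustration, not measurements of any particular display):

import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops between peak white and black."""
    return math.log2(peak_nits / black_nits)

# Ballpark figures, assumed for illustration only
displays = {
    "SDR display (~100 nits peak, ~0.1 nit black)": (100, 0.1),
    "film projector (~48 nits peak, ~0.02 nit black)": (48, 0.02),
    "HDR OLED (~1000 nits peak, ~0.001 nit black)": (1000, 0.001),
}
for name, (peak, black) in displays.items():
    print(f"{name}: ~{stops(peak, black):.0f} stops")
# The gap between those numbers is the "limitation of display equipment" the
# post above is talking about: whatever the negative held, the release had to
# be graded down to what screens of the time could actually show.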

>Not even projectors playing the original stock could show HDR because of the limitations of contrast ratios.
But maybe you don’t actually need to go beyond. Just because you can do something doesn’t mean you should. Honestly, HDR kind of reminds me of the loudness wars and Rick Rubin’s brickwalling of Californication. It’s just another example of digital technology interfering with a proven process. Audiophiles still listen to vinyl for a reason.