
I'm getting a Samsung Odyssey G7 27" (1440p). It's said to have HDR600 but I saw that HDMI 2.1 supports HDR Dynamic while DP 1.4 supports HDR Static. If I want to get the most out of this monitor, should I be connecting via HDMI or DP? I have an RTX 3080 if that's relevant and am mostly gaming at 1440p. It seems HDMI and DP both deliver 240Hz for 1440p, so the only difference seems to be this Dynamic vs Static support of HDR.

Attached: Samsung Odyssey G7 27inch.png (709x518, 491.82K)

dynamic metadata is ostensibly better, but a cable for either is $10, and it depends entirely on the games and movies anyways.
Don't worry about it. It's cheap. Test both out, I'm sure you'll find the HDR to be a placebo.

You should use Dynamic HDR because it sounds cool
sounds sporty, being dynamic, yk

does not matter because hdr on windows is dogshit

Reals: Doesn't actually matter
Feels: HDMI 2.1 (so long as that actually means HDMI 2.1 and not HDMI 2.0) has higher bandwidth than DP 1.4, therefore better.

isn't that one VA panel?

>It seems HDMI and DP both deliver 240Hz for 1440p, so the only difference seems to be this Dynamic vs Static support of HDR.

DP 1.4a does not support 10-bit 240Hz at 2560×1440. That needs around 30.77 Gbit/s of bandwidth, while DP 1.4 only carries 25.92 Gbit/s. It may manage it with driver-level dithering (8-bit + 2-bit FRC), but why would you want that when you can have true 10-bit?
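Rough numbers if you want to sanity check it yourself (Python, assuming roughly 8% blanking overhead; the exact figure depends on the timing the driver negotiates, so treat it as a sketch):

# rough sketch: does 2560x1440 @ 240Hz fit in DP 1.4's 25.92 Gbit/s payload?
# the ~8% blanking overhead is an assumption (CVT-R2-ish); real timings vary
def required_gbps(width, height, hz, bpc, blanking=1.08):
    return width * height * hz * bpc * 3 * blanking / 1e9

DP14_GBPS = 25.92  # 4 lanes of HBR3 after 8b/10b encoding

for bpc in (8, 10):
    need = required_gbps(2560, 1440, 240, bpc)
    verdict = "fits" if need <= DP14_GBPS else "needs DSC or more link bandwidth"
    print(f"{bpc}-bit: {need:.1f} Gbit/s -> {verdict}")

8-bit squeaks in, 10-bit doesn't, which is why you end up with dithering on DP 1.4 without DSC.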

Displayport because the connector is nicer

I have this monitor. I use displayport and HDR works although I don't use it.

>27" curved
Yuck.
Curved only for 49"

Does 10 bit actually make any difference? My GNU+Linux system defaults to 8bit. I tried HDR and on this monitor it doesn't make a difference, no idea what that dynamic vs static is though.

I'd probably stick with DP for gaming. It's the better connection. If you do opt for HDMI then get an optical HDMI cable that is rated for 2.1.

Also, HDMI 2.1 is mainly for 8K. HDMI 2.0b will do HDR10, HDR10+, and Dolby Vision; the last two support dynamic HDR.
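Roughly what static vs dynamic metadata changes, as a toy Python sketch. The numbers are made up and this is not any real API; it just shows why a per-scene peak helps a 600-nit panel:

# toy sketch of static vs dynamic HDR metadata: made-up numbers, not a real API
# HDR10 signals one peak (MaxCLL) for the whole title; HDR10+/Dolby Vision can
# signal a peak per scene, so the display only compresses highlights when it must
DISPLAY_PEAK = 600              # nits, e.g. a DisplayHDR 600 panel
TITLE_PEAK = 4000               # static metadata: mastered peak of the whole film
scene_peaks = [180, 950, 4000]  # dynamic metadata: hypothetical per-scene peaks

def squeeze(content_peak, display_peak=DISPLAY_PEAK):
    # >1.0 means highlights have to be compressed to fit the panel
    return round(max(content_peak / display_peak, 1.0), 2)

print("static :", [squeeze(TITLE_PEAK) for _ in scene_peaks])  # every scene squeezed 6.67x
print("dynamic:", [squeeze(peak) for peak in scene_peaks])     # only the bright scene is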

Merely going from an 8bpc to a 10bpc pixel format enhances the picture by removing a lot of banding in gradients. HDR at this point is more of a meme than anything else.

>Does 10 bit actually make any difference?
Only when the media you are viewing was originally recorded in a 10-bit-per-channel or higher format. It makes subtle differences in the colour gradients, where you will see smooth transitions instead of stripes and bands.
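If you want to see it as numbers rather than stripes, here's a tiny Python sketch counting how many distinct steps each bit depth gives across a smooth ramp (plain stdlib, nothing monitor-specific):

# tiny sketch: distinct steps an 8-bit vs 10-bit channel gives across a smooth ramp
# (more steps means the same gradient is sliced finer, so less visible banding)
SAMPLES = 100_000

def distinct_levels(bits):
    max_code = 2 ** bits - 1
    # quantise a 0..1 ramp to the channel's integer code values
    return len({round(i / (SAMPLES - 1) * max_code) for i in range(SAMPLES)})

print("8-bit :", distinct_levels(8), "levels")    # 256
print("10-bit:", distinct_levels(10), "levels")   # 1024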

it's pretty good for a va panel

it only has hdmi 2.0

The desktop is made up of SDR and 8bit assets and enabling 10bit and HDR does nothing for it. If you aren't working with media that is HDR10, then there is no reason to enable it.

Enabling HDR and 10bit for 10bit HDR content is what matters.

Also, 10bpc (aka 30-bit colour) increases the possible levels of each R/G/B subpixel from 256 (0-255) to 1024 (0-1023), going from being able to show 16.7 million colours to 1.07 billion possible colours. Not to mention the specifications for peak brightness, contrast ratio, and white point that are part of the HDR standard you need to meet.
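The arithmetic behind those figures, for anyone who wants to check (trivial Python):

# where the 16.7 million / 1.07 billion figures come from
for bpc in (8, 10):
    levels = 2 ** bpc        # per-channel levels: 256 or 1024
    colours = levels ** 3    # three channels: R, G, B
    print(f"{bpc} bpc: {levels} levels per channel, {colours:,} colours")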

>I'm getting a Samsung Odyssey G7 27" (1440p).
Don't. I have it. It sucks. Flickering, random disconnects, crappy HDR, takes ages to start and shut down.
>It's said to have HDR600 but I saw that HDMI 2.1 supports HDR Dynamic while DP 1.4 supports HDR Static.
You have no fucking clue what that means and you haven't even bothered to look it up on Google.
>If I want to get the most out of this monitor should I be connecting via HDMI or DP? I have an RTX 3080 if that's relevant
Only one of them supports the native resolution and refresh rate.
>am mostly gaming at 1440p
What do you mean by "mostly"? Running LCDs at non-native resolutions is dumb.
>It seems HDMI and DP both deliver 240Hz for 1440p
No they don't. Look at the fucking specs.

BTW, I also have a 3080. There is no modern game that you will be able to run at 240fps.

this one is shit, had it and sent it back, curvature is way too aggressive, slow va panel, hdr is meh, backlight bleeding is a huge problem AND it had several dead pixels

To be honest I was also apprehensive about the aggressive 1000R curve and ghosting. What would be a good alternative then? The main criteria I wanted originally were G-Sync, 1440p, 16:9, 27", and hopefully a decent contrast ratio.

The main issue I had finding monitors was that they were either all 1000:1 contrast (but now this seems more acceptable to me) and/or LED monitors, so I thought backlight bleeding and ghosting were going to be a problem. Then I saw this Odyssey G7 and it was said to be better at dealing with backlight bleeding (being QLED) and ghosting (with reduced motion blur). But yeah, the curve is a turn-off.

HDR is fine. I can see how it would look much better with a proper backlight and more brightness, but you can use it with a couple of tweaks.
Elden Ring looks pretty good and AutoHDR produces good results.

Current monitors are fucked. You can pick a VA panel with good picture quality but it will have noticeable ghosting in fast-paced games. Or you can buy an LCD, but then you’re buying 2013 tech with little development since.

OLEDs are expensive and have burn-in and new tech is not here yet.

wrong

even alienware has a 27" ips panel @ 240hz grandpa - check youtube for the UFO test
times are a changin