youtu.be/yT6_z0H6-3U
You know, this sold me on RT. Local shadows look too good. But no GPU can run it comfortably.

Attached: l9en1xqI.jpg (3840x2160, 1.11M)


Why would you waste resources on things you won't even notice?

>that visual noise in the plant shadows
It's over. Maybe in a few generations they will get there.

watch the video, it's very noticeable.

Asked the government when they wrote your welfare check.

Proper GI plays a huge part in grounding the scene. The little details really do add up.

Gonna take a few generations to run it properly, or some creative new ways to use RT that pull it off without a huge performance hit. I know Nvidia has been working on some options, but they never seem to get around to actually releasing them. They had demos out years ago.

>ray tracing looks good
no shit zoomer

>no shadows vs rt shadows
i dont get it
we had shadows in games before raytracing was possible
how is that a fair comparison?

the video shows a lot of cases where, without raytracing, objects either don't have shadows at all or have way worse ones

>Local Faggot Unaware of Shadow Maps

And none of the raytracing technology matters unless you own an OLED display to play the game on at 4K 120Hz+.

I'm a bit frustrated with browsing for a PC monitor and getting SDR-only IPS after SDR-only IPS. Some cost 200€, some cost 1k€, and they all have garbage contrast and blacks, plus they only get up to like 500 nits max.

sun shadows are easy, but local object shadows are hard, you'd have to make artists work on them by hand
here it's a system
look at the paper, it would be hell to make those shadows within performance limits, and even then only for one scene at best

Stop posting when you don't understand a fucking thing about the topic that's being discussed.

Ray tracing is one of those things that everybody likes, but the question isn't whether you like it, it's viability, and in my opinion that's nowhere near what enthusiasts believe and what companies try to make us believe.
For example, one problem I see in all games, whether they use Nvidia's techniques or implement their own ray traced rendering (like Teardown), is that there's a considerable delay before the denoiser does its job. If you stop in a dark area, you'll watch the image build up over several seconds. The moment you move, image quality immediately drops. It's easy to ignore these things, but I don't think we should.
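
What's being described here matches temporal accumulation: the denoiser blends each noisy frame into a history buffer, so the image converges while you stand still and resets when you move. A toy sketch of the idea (all names and values are illustrative, not from any real engine):

```python
import random

def accumulate(history, new_sample, frame_count):
    # Incremental running mean: blend this frame's noisy sample
    # into the accumulated history (here just one float per "pixel").
    weight = 1.0 / (frame_count + 1)
    return history * (1.0 - weight) + new_sample * weight

random.seed(0)
true_value = 0.5   # the "ground truth" radiance we're trying to recover
history = 0.0

# Standing still: noise averages out over ~2 seconds at 60 fps.
for frame in range(120):
    noisy_sample = true_value + random.uniform(-0.3, 0.3)
    history = accumulate(history, noisy_sample, frame)

# Camera movement invalidates the history, so engines reset the frame
# counter and quality visibly drops back toward a single noisy sample.
frame_count_after_move = 0
```

That's why the image "builds" over seconds when you're stationary and degrades the instant you move: the effective sample count falls back to one.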

You can easily see the effects no matter the resolution and on a very basic monitor though.

that's not a DXR issue, it's DLSS
also RTX = DXR; it's MS who did most of the work.

Sure, but it sucks in SDR and with low contrast.
Cyberpunk supports HDR, and people gushed about playing it on the new Alienware QD-OLED more thanks to the HDR than thanks to RTX.

That's a good point. Most games have really shitty implementations of raytracing technology. Partly because the performance just isn't there.

Based on the OP's vid, Cyberpunk 2077 seems to do alright with how quickly the shadows update. Though I'm sure I'd notice the flaws playing it myself.

Well the reality is that most people aren't going to blow 2k on a GPU and 2k on a monitor to get that experience. I think the GPUs are gonna get there first and realtime GI is pretty much the biggest change in graphical technology since we started making 3d games. It's impressive even without the fancy monitor. Would it be better with one? For sure but that's not going to be reality for the average gamer for quite some time.

In fact you can make every object in your scene cast shadows even without ray tracing and get acceptable results, but it destroys the frame rate. With several objects and multiple light sources the frame rate actually ends up worse than with ray tracing (incidentally, I don't know why), so devs just turn off shadow casting for the smaller objects.
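
The frame-rate cliff described above falls out of simple counting: with shadow maps, every shadow-casting light needs its own depth pass over every caster each frame, so cost grows with lights × objects. A toy cost model (the unit costs are invented for illustration, not benchmarks; only the scaling is the point):

```python
def shadow_map_cost(num_lights, num_casters, draw_cost=1.0):
    # One depth-map pass per shadow-casting light, drawing every caster.
    return num_lights * num_casters * draw_cost

def raytraced_shadow_cost(num_pixels, num_lights, ray_cost=0.002):
    # One shadow ray per pixel per light; object count only enters
    # logarithmically via the acceleration structure (ignored here).
    return num_pixels * num_lights * ray_cost

small = shadow_map_cost(num_lights=2, num_casters=500)    # 1000.0 units
big = shadow_map_cost(num_lights=16, num_casters=5000)    # 80000.0 units
```

With many lights and many casters the shadow-map term explodes, which is why engines cull small objects from shadow passes; the raytraced term depends mainly on resolution instead.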

32" Dell OLED monitor goes for $5k, you are free to get it
could also get ultra wide alienware or plain 42" tv from LG

also you didn't research the monitor market enough, monitors with dimming zones and 600 nits are available, they have near-OLED contrast ratios, they cost $2k though

Why are shadows so demanding anyway?

that looks surprisingly good.
still not buying a 3090ti so i can run this at 1080p 30fps, though.

Nta, but dimming zones are this weird in-between technology that's going to seem ancient in a couple of years once we're past it, so I can't see myself investing in a monitor like that.

youtu.be/TXI8rWiOF0k

>Well the reality is that most people aren't going to blow 2k on a GPU and 2k on a monitor to get that experience.
3080 + alienware qd oled would cost 2k € combined

Who's gonna buy a 2k monitor?
I want a 300€ monitor that doesn't glow in the dark.

We can already make use of it. (see Metro Exodus) The problem is just that most people don't have hardware to support the tech.

Transparency effects gobble up almost as much performance in a 3D engine as RT does. Will you tell me you'd rather have opaque glass/smoke/water just to get more FPS? After all, it would be barely noticeable.

we're a decade off from RT being viable/commonplace. people buying RTX shit today are ultra early adopters.
remember when PhysX required a dedicated card? seems surreal now, but eventually it became so fast and commonplace that no one even thinks about it anymore.

There are various ways to go about them, but all of them have their own downsides.

The advantage of raytraced shadows is that the number of objects doesn't matter. The whole scene gets shadows at pretty much the same performance cost.
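
A rough way to see why object count barely matters: shadow rays are traced against an acceleration structure (typically a BVH), so per-ray work grows roughly with log2 of the object count, while the ray count is fixed by resolution. A sketch with illustrative numbers:

```python
import math

def rays_per_frame(width, height, lights=1):
    # One shadow ray per pixel per shadow-casting light.
    return width * height * lights

def per_ray_node_visits(num_objects):
    # Rough BVH depth estimate: a shadow ray visits O(log2 N) nodes.
    return math.ceil(math.log2(max(num_objects, 2)))

small_scene = per_ray_node_visits(1_000)      # 10 node visits
big_scene = per_ray_node_visits(1_000_000)    # 20 node visits
```

Going from a thousand objects to a million roughly doubles per-ray work rather than multiplying it by a thousand, which is why the whole scene gets shadows at near-constant cost.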

>I want a 300€ monitor that doesn't glow in the dark.
gonna be a while

OLED or microLED is not coming to monitors for another decade at best. Production volume is nowhere near mass market numbers.

You can only get real pixel perfect shadows by simulating light bouncing around the scene with RT. Everything else is just piles of hacks upon hacks to give the illusion of proper shadows. Either one is expensive. RT even more so.

Yeah, no shit. Raytracing has been around forever; it just hasn't been realistic to do in realtime until recently, and we're still not quite there. That's what I meant by it taking a couple generations to brute force it with more power.

A ray traced turd is still a turd

>Raytracing has been around forever.
you mean the math for it is 150 years old. but the math for warp drives is also 60 years old, and there'll be no warp drives in the next thousand years.

LOOOOOL it's been 15 years since pixel shaders became mandatory. Imagine devs having to wait 10 years for retards like you to finally accept it as "viable". Imagine adopting pixel shading in 2015.

They're already making some QLED monitors but they're gonna be expensive for a while.

remember when games forced you to buy a new card because of new tech? now zoomers complain about having to buy a new card at all
damn console refugees

The Neo G7 with 1200 zones and 165Hz is 1.3k€ locally.
It does have the blacks and the nits, it just seems silly to buy it when it's a 16:9 curved VA panel and QD-OLED exists.

Even the act of raytracing for rendering has been used for about as long as 3d graphics have existed. Just not realtime rendering.

>QLED
ultrawide garbage, they simply cut a TV matrix in half
I don't understand why it's so hard to cut a TV matrix into 4 parts

burn-in is still a thing; used as a monitor, your QLED would last a couple of years tops

>remember when games forced you to buy a new card because of new tech
Yes, it's called DirectX killing Glide, it's pixel shaders, it's T&L, it's AGP being phased out, etc. You know nothing, kid.

Raytracing can be really nice, but it can only run 'comfortably' on semi-linear games.

Open world games with ray tracing are a nightmare to optimize well.

Spiderman coming to PC with ray tracing will be interesting; we'll see how well it's optimized, if at all.

Attached: screenshot_00000-min.png (1440x2560, 1.04M)

>it's called DirectX killing Glide
thank god, it was a clusterfuck.