What went wrong with RTX?

What went wrong with real time ray tracing technology in games?

How come "RTX on" just makes the FPS worse while also looking worse than games that don't use it?
None of the ray tracing games look anywhere near as good as games that don't use it.

Attached: rtx-games-suck.jpg (2160x2520, 1.35M)

most people can't afford an RTX card that can actually handle ray tracing

It is still too taxing on computing power.
The whole RTX move was just Nvidia's way of upselling GPGPU designs/rejects to gaymers.

Even a 3090 gets sub 40 fps at 1440p in a lot of games like Dying Light 2.

It wasn't really raytracing the whole image; it just added gimmicky details like reflections

RTX is a meme. It just makes everything look like melting plastic with a mirror shine

Attached: plate06.gif (1266x956, 471.79K)

Different game

Too intensive and doesn't look any better than well designed 'classic' dynamic lighting.

It does look good for minimal effort, though, which is what devs want

pricing and performance of cards.

Nvidia has a generation or two to convince the public before it gets abandoned.

It's good for making NPCs stand out less.

soul

Raytracing even at 1spp looks better than rasterization hacks, which not only provide less fidelity, but also cause weird visual bugs, like SSR breaking reflections depending on where the camera is pointed, static cube maps not actually reflecting the scene, etc.
The only time raytracing looks worse is when the resolution of something like a reflection drops in order to sustain a certain performance level. As GPUs get more powerful, you'll see less of that.
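The SSR failure mode mentioned above is easy to see in miniature: screen-space reflections can only sample pixels already on screen, so a reflected ray that marches off the viewport has nothing to return. A toy sketch (function names and the tiny "depth buffer" are made up for illustration, not from any real engine):

```python
# Toy illustration of why screen-space reflections (SSR) break depending
# on where the camera points: the reflected ray can only sample what is
# already rendered on screen.

def ssr_lookup(pixel, reflect_dir, depth_buffer, width, height, steps=64):
    """March a reflected ray in screen space; return the hit pixel or None."""
    x, y = pixel
    dx, dy = reflect_dir
    for _ in range(steps):
        x += dx
        y += dy
        # The core limitation: once the ray exits the viewport, the
        # renderer has no data to reflect -- the reflection silently
        # vanishes as the camera moves, which is the artifact you see.
        if not (0 <= x < width and 0 <= y < height):
            return None
        if depth_buffer[int(y)][int(x)] < 1.0:  # ray hit some geometry
            return (int(x), int(y))
    return None

# A 4x4 "depth buffer" with one object occupying column 3.
depth = [[1.0, 1.0, 1.0, 0.5] for _ in range(4)]

print(ssr_lookup((0, 1), (1, 0), depth, 4, 4))   # hits on-screen geometry: (3, 1)
print(ssr_lookup((0, 1), (-1, 0), depth, 4, 4))  # ray leaves the screen: None
```

A raytracer doesn't have this problem because it intersects the reflected ray against actual scene geometry, not a 2D buffer of what the camera already sees.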

They've already convinced the public. The entire raytracing ecosystem is already developed and self-sustaining. AMD has been forced to implement raytracing, as has Intel. As far as market adoption, Nvidia won.

Remember PhysX? It's the same Nvidia marketing strategy of creating a buzzword to sell more cards.

Although to be honest Physx was pretty cool.

PhysX wasn't even an Nvidia buzzword. It was originally a third party product that required a specific additional physics accelerator. If anything Nvidia made it less shitty when they bought it by making it GPU-accelerated instead.

They convinced the public that it's something real and viable. It did not become a killer feature and did not make people want it in every game.
It's the second generation and the performance is still not there. They either have to drastically step up and deliver somewhat acceptable performance with the 4060/5060 and above, buy all the tech reviewers, or go home. Because, in the end, the public will not care for a feature that's only cool on paper, which is the state of RT as of now.

Yeah that's the part I think a lot of people don't quite get. The entire point of RT is that it 'just werks' in producing effects we've gotten pretty good at faking over the years. It's less about night and day differences and more about, at this point at least, being able to get free of all the weirdness that comes from these fake solutions we've all gotten used to.

jews.
>why jews
pushed ray tracing to soon because of shekel desire. game developers adopted it to soon because of shekel desire. retarded consumers bought into it early because (((influence e-celebs))). then jews made stuff like DLSS. a "reduce my image quality to make kiketracing practical" button. using (((influence e-celebs))) made retarded consumers into activist to shill it for free AND actually believe by reducing their image quality makes it look better!

>did not make people to want in every game
Untrue. Adding raytracing is seen as just a plus. There's no game that could come out with raytracing and people would say, "I wish you hadn't." To the extent it transforms the way the game looks (Minecraft), it can be a sales driver all its own.
>It did not make a killer feature
Yes, it did.
At best, you could skimp on the low-end and say "you wouldn't use it anyway" ala the Nvidia 16-series. Even that is now difficult.
At the middle and top-end, it is now impossible to sell a high-end GPU that does not have raytracing. Every review would slam it ("I can't believe you're losing raytracing this gen, they had it before, and their competitors have it for the same price") and the GPU maker who dropped it would lose net sales, regardless of better rasterization performance.
Nvidia, AMD, and Intel know this, so they're not going to drop RT. It is a major selling point.
You literally cannot buy a current-gen GPU today that doesn't have raytracing, and from now on, you never will.

Shalom nvidia shill

It needs next-gen graphics, and the gen after that, to really hit its stride in games more sophisticated than Quake.