Why did SLI fall out of fashion?

Attached: Four-way-SLI.jpg (723x400, 46.1K)

made no sense for stuff like gaymes
in HPC you actually need the extra compute

Because it led to people keeping their old GPUs and chaining them together to keep up with newer GPUs instead of buying the new thing.

it was always a niche thing to begin with

Easier to sell one card for the price of two.

This

It's been pretty much replaced by NVLink.

nvidia.com/en-us/data-center/nvlink/

Attached: A6000_en_01.jpg (1920x732, 93.92K)
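Rough illustration of what that buys you outside of rendering, just a sketch assuming two CUDA-visible GPUs at indices 0 and 1 (not taken from the link above): with NVLink, or even plain PCIe, the cards can map each other's memory through peer-to-peer access, which is the multi-GPU model that actually survived.

```cpp
// Hedged sketch: enable CUDA peer-to-peer access between two GPUs.
// Device indices 0 and 1 are assumptions; build with nvcc and link against cudart.
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    int gpu0_sees_1 = 0, gpu1_sees_0 = 0;
    cudaDeviceCanAccessPeer(&gpu0_sees_1, 0, 1);  // can device 0 map device 1's memory?
    cudaDeviceCanAccessPeer(&gpu1_sees_0, 1, 0);

    if (gpu0_sees_1 && gpu1_sees_0)
    {
        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(1, 0);  // second argument (flags) must be 0
        cudaSetDevice(1);
        cudaDeviceEnablePeerAccess(0, 0);
        // From here cudaMemcpyPeer (or direct loads/stores in kernels) moves data
        // card-to-card over NVLink/PCIe without bouncing through host RAM.
        std::printf("peer access enabled between devices 0 and 1\n");
    }
    else
    {
        std::printf("no peer access between devices 0 and 1\n");
    }
    return 0;
}
```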

Nvidia and developers both found it too difficult to optimize for.
Multi-GPU can still be done in DX12. Nobody bothers trying it, because it's still a mess.
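To show why it's a mess, here's a minimal sketch of the explicit multi-adapter side of D3D12, nobody's shipping code, just assuming a Windows 10+ box with the DXGI/D3D12 headers: the API will happily hand you a device per GPU, but splitting the frame, copying resources between adapters, and keeping fences in sync is all left to the engine, which is exactly the work the old SLI driver profiles used to hide.

```cpp
// Hedged sketch: enumerate every hardware adapter and create a D3D12 device for each.
// Link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP/software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
        // Each device is a separate GPU with its own queues, heaps, and fences;
        // the application has to schedule and synchronise work across them itself.
    }
    return devices;
}
```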

It's too based for the mere consumer.

Attached: 1657222553578.jpg (3840x2880, 2.28M)

end user cost and developers not implementing it

SLI was ded tech from the beginning: only select titles supported it, and it never came close to +100% performance per extra card.
Only a bunch of youtubers used it for the views, plus a handful of retards who thought it would be cool to copy what they saw on Linus Tech Tips.

It was fundamentally a bad idea: microstuttering wrecked perceived smoothness even if, on paper, you got more frames per dollar buying, say, two 460s instead of a 480.

Generally the setup skewed towards too much compute and not enough VRAM, since SLI mirrors resources on each card rather than pooling memory, so two 1GB cards still only give you 1GB to work with. You end up with many of the same issues you'd hit on a multi-CPU system.

I think the final piece of the puzzle is that GPUs just got good enough. Multi-GPU, even with the stuttering, was more appealing back when reaching 1080p60 in every game was actually technically challenging.

Eventually people realised buying the best single card you could afford was a better idea: it was more stable, gave you the same or better perceived performance, ran quieter and cooler, and offered more flexibility in case/PSU/mobo choice. Multi-GPU is still used in workstation builds, but it really made no sense for gaming. I expect APUs to become the mainstream gaming option (there are latency advantages) before I expect dual GPU to come back.

>that SLI
why, a better single GPU would pay for itself considering how much less power it would use

Also, as others said, there was an implementation cost, yet dual-GPU users were almost by definition a minority of the market, so support ended up being shit. The juice wasn't worth the squeeze.

Niche for customers
Most gaymes didn't support it
Worse perf-per-watt

From what I recall of SLI/Crossfire at the time, dual GPUs didn't use THAT much more power than a single-GPU solution for the same performance. That really wasn't the issue.

Besides, SLI used to be far more prevalent for high-end 480/580-era builds, and you simply couldn't get the same performance any other way.

I fell for the meme and the experience was awful for gaming. I did mine a fair amount of bitcoin at the time though, and multi-GPU was fucking fantastic at that, especially since I was using school electricity.

There wasn't anything better for mid-2008. Maybe I'll go with dual Quadro FX 5800 instead though.

Oh it does. When it scales close to 100% it draws close to double the power, and those two GPUs easily eat up 300W on their own when SLI works well.

Attached: 1645248628006.jpg (2000x1500, 595.56K)

>easily eat up 300W
over* 300W

More precisely around 350W.

Game developers don't want to implement it because it has no market share, and gamers don't buy it because no games take advantage of it. A vicious circle.

Outside of video games, it still sees use in professional applications.

Motherboard manufacturers wanting to cut corners/reduce costs: some boards only supported SLI or only Crossfire, and mismatched PCIe slots (16x + 8x or 16x + 4x) reduced performance with either version.

damn bro i love wasting electricity too to game on a 1280x1024 screen

>Things that never happened

>has never SLI'd or CrossFire'd

You're wasting electricity in any case. The point is to have fun and enjoy yourself.

>on a 1280x1024 screen
The SLI setup is overkill for pretty much anything from that period, even when I run it at 1920x1440 @ 85Hz.

I'd do it just for the fun of building it, but any modern low end budget card will get the job done for a fraction of the power consumption. It's your energy bill though, unless you live with your parents.

Because AMD and Nvidia can't write drivers for single GPUs that don't suck ass, much less for two.

I had two 780s and every game had flickering, shitty shadows, if it even had an official SLI profile at all that didn't require manually flipping compatibility bits with NVIDIA Inspector.

They literally aren't talented enough to keep shader effects in sync across multiple GPUs in vidya, so from now on you're only gonna see multi-card setups in AI and crypto faggotry.

The schizo theory though was that it let you buy hardware that was 6-7 years ahead of the current performance curve; there were plenty of games I was playing at 4K in 2013.