Discrete GPUs were a mistake

Discrete GPUs were a mistake.

Attached: RTX 40 Series Titan.png (1250x944, 247.78K)

problem, kilocuck?

>he doesn't have a mini nuclear powerplant to power his nvidia gpu

If you want to render each frame within 2.5ms to satisfy your 400Hz monitor you have to be serious about gaming ;-^)
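quick sanity check on that number (just a sketch, the 144Hz comparison is my own addition): the per-frame budget is one second divided by the refresh rate.

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

print(frame_budget_ms(400))  # 2.5 ms per frame at 400 Hz
print(frame_budget_ms(144))  # ~6.94 ms per frame at 144 Hz
```

so yes, 400Hz really does mean the GPU gets 2.5ms per frame, about a third of what a 144Hz monitor allows.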

gimi

Attached: gimi2.jpg (776x959, 94.76K)

good. who cares about power on a desktop.
they should make a GPU that consumes 8kW and has 10x the performance of this.

this is unironically AMD's fault
> Nvidia always leading the high end market by a large margin
> suddenly RDNA2 catches up
> now Novideo has to overclock the shit out of their GPUs to compete with RDNA3

This, as long as you have proper cooling and aren't popping breakers it's a non-issue.
Computer power consumption has been trending upwards for decades. The 8086 was a 1 watt chip.

Remember when /g/ thought Fermi was hot?

Or just buy the midrange model. If you don't need to play games at 4K on ultra settings, you don't need the 800W ones.

Imagine how fast Blender will be on this.

Attached: blender-logo.png (2000x1635, 87.24K)

i used it on a dualcore igpu in 2012
shit took minutes to render 1 frame then i snapped the mobo when trying to repaste
good times

>have proper cooling
Have you ever used a computer that consumes >500w under load? I don't care how good your a/c is, that's not gonna fly.

>a/c
mutt spotted

>No a/c
You're not even in this discussion unless you want an indoor sauna

Attached: 1621970258405.jpg (1080x1080, 96.3K)

i don't live inside a volcano so it gets up to 20 degrees outside at most

Great, so it'll only get to 33c under load instead of 36c

what the fuck kind of nose is that?

>they should make a GPU that consumes 8kW and has 10x the performance of this.

1. that's retarded

2. anything that needs this performance will be using multiple gpus, you can't just go hurrr one card more power infinitely

>This, as long as you have proper cooling and aren't popping breakers it's a non-issue.
>Computer power consumption has been trending upwards for decades. The 8086 was a 1 watt chip.

In America 1600 watt systems will trip the breaker.

If you want to simulate what an 8000 watt gpu would be like, run 5 space heaters in one room on high 24/7 and see if your AC can keep up (it can't)

Power usage absolutely matters
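the breaker claim checks out, roughly (sketch assuming a typical US 120V / 15A branch circuit and the NEC 80% continuous-load rule; the exact numbers depend on your wiring):

```python
# Typical US residential branch circuit (assumption, not universal).
VOLTS = 120
BREAKER_AMPS = 15

peak_watts = VOLTS * BREAKER_AMPS    # 1800 W instantaneous circuit limit
continuous_watts = peak_watts * 0.8  # 1440 W limit for sustained loads (80% rule)

print(peak_watts, continuous_watts)
```

a 1600W system sits above the 1440W continuous limit, so tripping under sustained load is exactly what you'd expect. an 8kW GPU would need its own 240V circuit like an electric dryer.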

I've never understood why they didn't just make cards, or mobo slots, to add VRAM modules.
I'm almost always running out of VRAM before I overshoot the card's core clockspeed.
Just like regular RAM, we should be able to throw a module of VRAM into an empty slot and keep playing on the same rig for a while longer.

Why did no one do this?

Nvidia was a mistake.

Dude, a small room electric heater is about 1000 watts.