GPU consumes 300W and above

>GPU consumes 300W and above
>acceptable
>CPU consumes 250W
>not acceptable and gets memed

Attached: tomwarren_corei9_12900K_1.jpg (1200x1600, 271.7K)

You should try comparing the kind of performance a GPU is capable of to that of a CPU.

>250W
lol
lmao even

Attached: 1636052409238.png (500x810, 61.82K)

>E cores disabled
If the e-cores consume so little energy, why not make a chip with many e-cores? They're smaller, so you could fit more of them on the die.

Bro what if the e cores had e cores

>GPU consumes 300W and above
Because it's a far larger piece of hardware computing far more complicated data if it's hitting peak usage
>CPU consumes 250W
Because it's designed by retards or OC'd real fuckin hard

Attached: 3FF0C8CD-7B03-4A46-A632-0329697E42D6.png (1442x960, 1.26M)

>Because it's a far larger piece of hardware computing far more complicated data if it's hitting peak usage
Precisely the opposite: GPUs have very simple instructions and results can even vary in accuracy. CPUs do precise, complicated work; GPUs use brute force and hundreds of cores to chew through simpler data faster.
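
Quick sketch of the accuracy point if anyone doubts it. No actual GPU here, just numpy on a CPU with float16 standing in for the kind of narrow, high-throughput precision GPUs love. Numbers and sizes are made up for illustration:

import numpy as np

# Naive accumulation in float16: once the running sum reaches 256,
# the gap between representable float16 values exceeds 0.1 and the
# sum simply stops growing. The float64 sum is the "CPU-style" answer.
x = np.full(100_000, 0.1)

acc = np.float16(0.0)
for v in x.astype(np.float16):
    acc += v

print("naive float16 sum:", acc)      # gets stuck at ~256
print("float64 sum:      ", x.sum())  # ~10000.0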

P-cores consume less power with e-cores enabled, since the ring downclocks by 1.1GHz.

I wish GPUs would hit 3GHz+ and go MCM soon.

I really don't care about power consumption. Where did this meme come from? If I had a blade in some datacenter, maybe I'd care.

>CPU
>one tiny chip
>GPU
>a huge card including fans etc...

>since the ring downclocks by 1.1GHz
wtf? b-but bing bus

The meme bus is awesome. It carried Skylake and is used in Zen cores too.
Sad to see it gimped in 12th gen because the cuck cores literally can't keep up and make the chip unstable with a 4.7GHz ring.

65W-105W is the efficiency sweet spot for CPUs
180W-250W is the sweet spot for GPUs

bullshit
how do you know that

The main reason for this is that CPUs not only have much smaller dies and much higher power density, they're also harder to cool because the cooler has to deal with an additional heat spreader and TIM, making heat transfer even less efficient. Combined, this is why a 250W GPU with a dainty little dual-slot heatsink will chill around 70-75C, while a CPU at 240W with a massive 2kg skyscraper of metal fins and heat pipes strapped to it struggles not to hit 100C. Some of this could be mitigated with direct die contact, which is what GPUs use, but the power density issue still isn't solved.
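
To put rough numbers on the power-density point (die areas below are ballpark, illustrative figures, not measurements):

# Back-of-the-envelope power density in W/mm^2.
cpu_watts, cpu_die_mm2 = 240, 215   # desktop CPU, roughly Alder Lake-sized die
gpu_watts, gpu_die_mm2 = 250, 628   # big GPU, roughly GA102-sized die

print(f"CPU: {cpu_watts / cpu_die_mm2:.2f} W/mm^2")  # ~1.12
print(f"GPU: {gpu_watts / gpu_die_mm2:.2f} W/mm^2")  # ~0.40
# Nearly 3x the heat per mm^2 of silicon, before the IHS + TIM penalty.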

And now you know.

Attached: m0ypxm0g58sy.jpg (4032x2690, 568.11K)

Yes. It is acceptable because GPUs (with the exception of some truly terrible products here and there) most often actually perform in proportion to their power consumption. Their typical workloads parallelize very well, so making a GPU bigger, fatter, more power hungry and with more cores actually does result in significant performance gains.

When Intlel releases a 350W abomination that's 5% better than competition running at less than half the power, it's a joke that deserves ridicule. Nobody really complains about big, fat workstation/server CPUs with lots of cores, which serve their own purpose as well; it's just the "mainstream" line that catches its well-deserved mockery.

You can blame the prevalence of single-thread performance over everything else in consumer PCs.
Power consumption scales way faster with clock speed than performance does, but every little bit that can be squeezed out and cooled may as well be.
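
Toy model of that scaling, assuming the usual dynamic-power approximation P ~ C*V^2*f with voltage rising roughly linearly with frequency near the top of the v/f curve. All baseline numbers are made up:

# Power grows ~cubically with clock (f * V^2, with V ~ f) while
# performance only grows linearly with clock.
base_f, base_p = 4.0, 125.0  # GHz, W

for f in (4.0, 4.5, 5.0, 5.5):
    scale = f / base_f
    power = base_p * scale ** 3
    print(f"{f:.1f} GHz: +{(scale - 1) * 100:3.0f}% perf, "
          f"+{(power / base_p - 1) * 100:3.0f}% power (~{power:.0f} W)")

That last ~10% of clock is where most of the wattage goes.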

Intel's archaic boost algorithm is to blame, plus the fact that they stopped giving a shit and chose to let the 241W PL2 run indefinitely on 12th gen.
In a sane world, Intel CPUs would respect their TDP values and run in the most efficient part of the v/f curve like Ryzens do.
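
For reference, a simplified sketch of how the PL1/PL2/tau scheme is commonly described: an exponentially weighted average of package power is held against PL1 over a time constant tau. Actual firmware behavior varies by board and generation, so treat this as a model, not gospel:

def boost_seconds(pl1, pl2, tau, runtime=120.0, dt=0.1):
    """Seconds spent boosting at PL2 before the rolling power
    average reaches PL1 and the chip falls back to PL1."""
    ewma, t = 0.0, 0.0
    while t < runtime:
        power = pl2 if ewma < pl1 else pl1
        if power < pl2:
            return t                       # boost window over
        ewma += (power - ewma) * dt / tau  # exponential moving average
        t += dt
    return runtime                         # boost never ended

# Old-style desktop defaults: 241W boost lasts ~41 s, then drops to 125W.
print(boost_seconds(pl1=125, pl2=241, tau=56))
# 12th-gen style PL1 = PL2 = 241W: the "boost" simply never ends.
print(boost_seconds(pl1=241, pl2=241, tau=56))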

I don't really think it has anything to do with that. Since Ryzen picked up speed, Intel has been getting proper competition, and they're releasing these abominations because they need to push their shit right to the absolute brink to eke out a tiny win in a review. If they could make an efficient design that defeated AMD at reasonable power levels, they would not push it this far at "stock" operation; this is just the result of desperation.

Sure, if there were no competition, they would happily leave performance on the table and sell locked 8-cores running at 3GHz for $400-$500. But thanks to AMD, that's not the case. Remember, you can easily still have that 3GHz chip that barely draws any power if you want it; you just need to dial in the settings yourself. You also save a ton of money thanks to competition keeping prices from going out of control. Competition is a strange thing to gripe about; it only benefits consumers.