Is it just me or does almost all new tech nowadays run hot?

thermal limits increase over time, power usage increases over time.
yes, that is a logical conclusion as to why things are running hot.

Yup. The transistors are becoming so compact that it's hard to conduct the heat away from them

So how are we supposed to reach this ideal future where portable tech becomes super compact if it’s all starting to cook itself right now?

You will burn your cock off and you will be happy.

I want to make my own computer from scratch, how do I go about making it? I don't have an EE degree

No, you're right.
Some of it also comes down to the thinness and weight demands, but performance needs power, and that power needs to be dissipated somewhere.

>but performance needs power
ever heard of undervolting or underclocking? my laptop only gets slightly hotter when im charging the battery, and is cold the rest of the time

5g cloud

That’s a good idea but not an option on something like a cellphone

>underclocking
= less performance
>undervolting
results vary depending on silicon lottery

Yes, that is mostly a reduction in performance.
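The undervolt-vs-underclock argument comes down to the usual first-order model of dynamic switching power, P ≈ C·V²·f: voltage counts quadratically, clock speed only linearly, which is why undervolting can save heat without giving up performance (when the silicon lottery cooperates). A toy sketch with made-up numbers, not any real chip's specs:

```python
# Back-of-envelope: dynamic CPU power scales roughly as P = C * V^2 * f.
# All constants below are invented for illustration.

def dynamic_power(c_eff, volts, freq_hz):
    """Approximate dynamic switching power in watts."""
    return c_eff * volts**2 * freq_hz

C_EFF = 1.5e-9                                     # effective switched capacitance
stock        = dynamic_power(C_EFF, 1.20, 4.0e9)   # stock voltage and clock
undervolted  = dynamic_power(C_EFF, 1.05, 4.0e9)   # lower voltage, same clock
underclocked = dynamic_power(C_EFF, 1.20, 3.0e9)   # same voltage, lower clock

print(f"stock:        {stock:.2f} W")
print(f"undervolted:  {undervolted:.2f} W ({undervolted/stock:.0%} of stock, same performance)")
print(f"underclocked: {underclocked:.2f} W ({underclocked/stock:.0%} of stock, less performance)")
```

Dropping voltage by ~12% cuts dynamic power about as much as dropping the clock by 25% does, without the performance loss.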

My M1 Pro macbook doesn't have that problem

x86 issue

M1s run hot, though.

Companies want to see year-over-year performance gains, and they increase power consumption (sacrificing efficiency) to hit that goal, putting more stress on cooling solutions.
Notice the number of huge AIO coolers and double-slot GPUs with giant coolers compared to 10-15 years ago.

they max out at 100°C, like literally every other chip, but the difference is that you need to really push it to the limit to make it hot, and that's very rare.

>the difference is that you need to really push it to the limit to make it hot, and that's very rare.
For Apple users, maybe.

desktops, laptops, and even phones all now automatically overclock themselves when they have thermal and power headroom. back in the day your cpu+gpu would run at a base frequency all day and night long. nowadays you move your cursor or tap your touchscreen and your cpu+gpu will boost into 3-5ghz.
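That boost-when-there's-headroom behavior can be caricatured as a tiny governor loop. Everything here (names, thresholds, the 100°C limit from upthread) is a made-up illustration, not any vendor's actual boost algorithm:

```python
# Toy model of opportunistic boost: sit at base clock when idle, boost while
# there is thermal headroom, throttle back at the junction limit.
# All constants are invented for illustration.

BASE_GHZ, BOOST_GHZ = 2.0, 5.0
THROTTLE_C = 100.0            # typical junction limit mentioned in the thread

def pick_clock(temp_c, load):
    """Pick a clock: idle -> base, loaded -> boost as thermals allow."""
    if load < 0.1:
        return BASE_GHZ                    # nothing to do, stay at base
    if temp_c >= THROTTLE_C:
        return BASE_GHZ                    # at the limit, throttle back
    # scale the boost down as the chip approaches the limit
    headroom = (THROTTLE_C - temp_c) / THROTTLE_C
    return min(BOOST_GHZ, BASE_GHZ + (BOOST_GHZ - BASE_GHZ) * headroom * 2)

print(pick_clock(40.0, 0.05))   # idle: base clock
print(pick_clock(40.0, 0.9))    # cool and loaded: full boost
print(pick_clock(85.0, 0.9))    # warm and loaded: partial boost
print(pick_clock(100.0, 0.9))   # at the limit: back to base
```

Which is the point upthread: tap the screen and the clock jumps, so the heat shows up in everyday use, not just benchmarks.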

advanced quantum super computer farms that crunch all the processing in realtime and you just receive the aftermath

only basic processes need to be done locally
the rest can be done offsite and accessed through the cloud
your device can just optimize for networking latency and related shit

So... back to terminals and timeshares we go, then?