Will integrated graphics (APU) make GPUs obsolete in the future?

Will integrated graphics (APU) make GPUs obsolete in the future?
Or will there always be demand for GPUs among regular consoomers?

Attached: APU-vs-CPU-vs-GPU.jpg (1200x675, 28.94K)


>Will integrated graphics (APU) make GPUs obsolete in the future?
Never gonna happen. A discrete GPU always has more power available to it, more capacity for cooling, higher ultimate performance. Integrated graphics, however, have been steadily encroaching on low-tier to mainstream cards since their inception. We will inevitably reach a day when an IGP is good enough for absolutely everything aside from some ultra-high-end VR or fully ray-traced rendering, and we're nearly there right now. Within another year or two we'll have IGPs capable of putting last-gen consoles to absolute shame while staying inside mobile power budgets.

So it will only ever outperform low-end dedicated GPUs, then, until the end of time?

Unless a company decides to produce a dedicated desktop part with a higher TDP and more thermal budget devoted to the IGP, that's likely how it will remain. The bar for low-end GPUs still rises every year, though. In a couple of years we might see an APU with an IGP that matches or outperforms the RX 5700 XT: a few more CUs, new arch advancements, some new memory configuration, and it's entirely possible. As it stands, DDR6 is expected to double bandwidth over DDR5, so conventional dual-channel configs would provide over 200GB/s of raw system bandwidth, and that's on top of whatever TSV 3D-stacked L3/L4 memory provides to the IGP in the future as well.
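
To put rough numbers on that bandwidth claim (a back-of-the-envelope sketch; "DDR6-12800" is a hypothetical part, since the DDR6 spec isn't finalized):

# Theoretical peak bandwidth = transfer rate x bus width x channels.
# Assumption: DDR6 doubles DDR5's per-pin rate and keeps 64-bit channels.
def bandwidth_gbs(mt_per_s, bus_bits=64, channels=2):
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

print(bandwidth_gbs(6400))   # DDR5-6400, dual channel: 102.4 GB/s
print(bandwidth_gbs(12800))  # hypothetical DDR6-12800: 204.8 GB/s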

We will reach a day when an IGP surpasses a 1080 Ti in raw performance. We might sit back and say "well, compared to the current high end that isn't much," but a 1080 Ti is absolutely nothing to scoff at as far as I'm concerned.
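
For a sense of today's gap (public spec numbers; the IGP example is a Radeon 680M-class part, chosen purely for illustration):

# Raw FP32 throughput = shader count x 2 FLOPs per FMA x clock (GHz).
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

print(tflops(3584, 1.582))  # GTX 1080 Ti at boost: ~11.3 TFLOPS
print(tflops(768, 2.4))     # Radeon 680M IGP:      ~3.7 TFLOPS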

For people who don't care about gayming and don't do intensive things like 3D rendering, they're already good enough. Otherwise never, see:
>A discrete GPU always has more power available to it, more capacity for cooling, higher ultimate performance.

Every single Rance game can be played on highest settings on an age-old, outdated iGPU. In other words: it's over.

Attached: 97 Chi-ha.jpg (269x411, 41.96K)

No, but they're currently annihilating the low-end GPU market.

50 years from now the average CPU will have integrated graphics as powerful as today's GPUs. So if you only ever want to play games with roughly modern-level system requirements or lower, then yes, they will. But I suspect games will keep accumulating performance bloat and demanding more, so GPUs will still be necessary.

Yes, look at Apple.

It would have to be like the Apple M1. Imagine if Zen 4 APUs had full-sized RDNA2 cores with 16GB of HBM2.

Attached: IMG_0028.jpg (708x975, 689.76K)

Multiple memory channels and conventional DDR5 would be better than HBM

I am sorry but I believe that you are wrong.

Attached: R.jpg (680x509, 76.57K)

No one wants an interposer if they can avoid it. Apple could certainly afford one, but they went with wide LPDDR5 channels for the M1 Max and Ultra. HBM would probably draw less power by a considerable amount, and yet they still didn't choose it.
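
For scale, spec-sheet peak bandwidth only (the M1 Max figure assumes the commonly cited 512-bit LPDDR5-6400 configuration):

# Peak bandwidth in GB/s = transfer rate (GT/s) x bus width in bytes.
def peak_gbs(gt_per_s, bus_bits):
    return gt_per_s * (bus_bits / 8)

print(peak_gbs(2.0, 1024))  # one HBM2 stack at 2.0 GT/s:  256.0 GB/s
print(peak_gbs(6.4, 512))   # M1 Max LPDDR5-6400, 512-bit: 409.6 GB/s

So Apple matched and beat a single HBM2 stack with no interposer; the trade-off they accepted was power, not bandwidth.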

>higher TDP

must suck being an x86 retard

>300W CPU
>500W GPU
>800W SoC
And exactly how do you plan to cool this thing?

Attached: yande.re 945380 cameltoe feet garter genshin_impact heels kaede_uehara nun pantyhose rosaria skirt_lift wallpaper.jpg (1920x1080, 446.96K)
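
Rough steady-state math (the thermal resistance values here are illustrative assumptions, not measurements of any real cooler):

# Steady state: T_junction = T_ambient + power x total thermal resistance.
def junction_temp_c(power_w, r_theta_c_per_w, ambient_c=25.0):
    return ambient_c + power_w * r_theta_c_per_w

print(junction_temp_c(800, 0.05))  # heroic 0.05 C/W custom loop: 65.0 C
print(junction_temp_c(800, 0.10))  # good 0.10 C/W air/AIO class: 105.0 C

At 0.10 C/W you're already past typical throttle points, so an 800W SoC would need exotic cooling just to hold its clocks.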

Cute, kid

aip.scitation.org/doi/abs/10.1063/1.5116025
Let's fucking goooooooo

Even their top GPU is about as slow as a 3050 outside of a few canned benchmarks, and those are more a test of the massive unified memory than of the GPU itself.

>3050
a 1050 Ti literally outperforms it

based ARM flexing chad

>Believing there is such a thing as an "ARM GPU"
Retard is ngmi.