I don't feel so good intelbros

Attached: FPNvlmqXoAAJQOd.jpg (2048x1152, 139.68K)

>First gen product
>Raja Koduri
>Intel display drivers
>Fat, lazy, uninspired Intel engineers
It was always going to be a dogpile.

If Raja wasn't on this team I would have higher hopes. He insisted that AMD stay on GCN for seven generations. Look at Vega and first-gen RDNA, both µarches he worked on, and they both sucked. Now RDNA2 kicks ass because he's gone. He should just get paid to shovel cash into a fireplace.

I can't really agree. First-gen RDNA was good, and Vega was a stopgap that saved the company's GPU business; nobody expected AMD to be able to build a GPU that could compete with Pascal, even if it did use more power and was sold on tight margins.

>intel integrated graphics is actually decent
>amd integrated graphics is actually decent
everyone wins

but both of these are discrete GPUs

In that case, get dunked on, Intel. Your product is shit and you should feel bad.

GCN was a great idea that never went past a beta design. All they ever did to improve it was tertiary stuff that should have gone along with real ISA changes.

The Vega/Polaris GCN is fundamentally no different from what was seen in the Tahiti chips of 2011.
So much so that in proper like-for-like IPC testing it was, IIRC, perhaps 3-5% better.
I believe that in 7 years the biggest real GCN perf leap was reducing divide latency from 40 cycles to 28.
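
Taking that 40-to-28-cycle figure at face value (it's the post's claim, not something I've verified), here's a minimal back-of-the-envelope sketch in Python of what it buys you; the 5% divide fraction is a made-up illustrative number:

# Impact of cutting divide latency from 40 to 28 cycles.
# The 40/28 figures come from the post above; treat them as unverified.
old_cycles, new_cycles = 40, 28

print(f"latency cut: {1 - new_cycles / old_cycles:.0%}")       # 30%
print(f"pure-divide speedup: {old_cycles / new_cycles:.2f}x")  # 1.43x

# Real shaders spend only a small fraction of their cycles dividing,
# so by Amdahl's law the whole-kernel gain is much smaller.
frac = 0.05  # hypothetical: 5% of cycles are divides
speedup = 1 / ((1 - frac) + frac * new_cycles / old_cycles)
print(f"whole-kernel speedup at {frac:.0%} divides: {speedup:.3f}x")  # ~1.015x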

N-no sir... allow me to complete and mature the process...

this should only surprise the retards who haven't been paying attention. intel can't into highly competitive GPUs, not even at the entry level.

>16gb of VRAM in a laptop
why

Attached: Screenshot_20220331-234514.png (2340x1080, 665.46K)

Have you played Flight Simulator 2020?

And yet GCN was still pretty competitive, at least in terms of perf per die area and perf per transistor count. I think it was actually faster by those metrics. At least the Polaris cards were (rough math below).
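
A minimal sketch for sanity-checking that claim. The die sizes are the commonly cited figures (Polaris 10 ~232 mm², GP106 ~200 mm²); the perf values are placeholders, not benchmarks, so plug in real averages before drawing conclusions:

# Perf-per-area / perf-per-transistor calculator. Die sizes and
# transistor counts are approximate public figures; "perf" is a
# PLACEHOLDER index to be replaced with a measured benchmark average.
cards = {
    "RX 480 (Polaris 10)": {"area_mm2": 232, "xtors_b": 5.7, "perf": 100.0},
    "GTX 1060 (GP106)":    {"area_mm2": 200, "xtors_b": 4.4, "perf": 100.0},
}
for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['area_mm2']:.3f} perf/mm^2, "
          f"{c['perf'] / c['xtors_b']:.1f} perf/Bxtor")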

Stop it with the antisemitism!

I could see intel throwing a lot of effort into software support like nvidia does, so that there's more reason to choose incel if you're spending a lot of time in blender or playing with ML shit

They could also surprise us and take a shit on NVENC, if their video codec stuff is good

They arguably have the best software team of all the hardware manufacturers and are one of the biggest contributors to Linux itself; the GPU drivers in Mesa kind of prove this. They've only slacked on getting something similar to CUDA in place, but oneAPI is really promising. Quick Sync also shat on everyone in terms of how early it was when it was first announced, and it took 2-3 years before AMD and Nvidia surpassed it (a minimal encode sketch follows this post).
The only things harming it are the fact that we're talking about Raja heading up the team, and that they pulled tricks like this in the graphics department, if anyone remembers.

youtube.com/watch?v=Otcge1cn8Os

Obviously, times are different now, but it remains to be seen how well Arc does. I won't buy it, but I will likely buy Battlemage.
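
On the Quick Sync point: a minimal sketch of driving it from Python via ffmpeg's h264_qsv encoder. It assumes an ffmpeg build with QSV support and an Intel GPU in the box; the file names and bitrate are made up, and on NVIDIA you'd swap in h264_nvenc to compare against NVENC:

import subprocess

# Hardware H.264 encode through Quick Sync (ffmpeg's h264_qsv encoder).
subprocess.run(
    ["ffmpeg", "-y",
     "-i", "input.mp4",    # hypothetical source clip
     "-c:v", "h264_qsv",   # Quick Sync H.264 encoder
     "-b:v", "5M",         # target bitrate, arbitrary
     "out_qsv.mp4"],
    check=True,
)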

Well, at least they have ample room to grow. It's gonna take a couple of generations before that architecture becomes competitive...

>vega
>sucked
Opinion discarded

:copium:

All of a sudden 60 fps is not the target; now 80, arbitrarily set by a company, is the target. All the other companies that used 60 fps as the target were wrong; we never did this, no, never.

Stop fucking acting like companies are good guys and bad guys; they all just want money. You should literally kill yourself if you think otherwise.

he's the dude that designed RDNA though 🤨