GNU AND WINTEL FAGS BTFO

daringfireball.net/linked/2022/07/23/apple-silicon-inconvenient-truth

reviewers at ostensibly neutral publications are afraid that reiterating the plain truth about x86 vs. Apple silicon—that Apple silicon wins handily in both performance and efficiency—is not going to be popular with a large segment of their audience. Apple silicon is a profoundly inconvenient truth for many computer enthusiasts who do not like Macs, so they’ve gone into denial, like Fox News cultists and climate change.

Gruber is a literal known, not a literal who.

Attached: main-qimg-58b1f6fcaba435de5ed9fe8e16dd54c8-pjlq.jpg (602x337, 16.13K)

didn't read, plus you are trans
iToddlers BTFO

>literal who website, didn't read lmao
>crapple
>tim faggot
>trans freaks
ITODDLERS BLOWN THE FUCK OUT AGAIN

Who gives a shit about either when the most popular computers sold are still $300 Acer laptops with last-gen Celerons in them?
Apple laptops will never come down to that price because it would devalue their brand, and since that won't happen, the most popular computers will still be x86_64 PCs.

Man-made climate change is massively exaggerated, and you're definitely contributing to it by buying Apple products.

Attached: 400bbc1e99898b6952ee84d55695d7f25dfa2be98c70b971ba54ccafe23c4079_1.jpg (1080x778, 141.11K)

Didn't I see an article here the other day about M2 Macs having issues with fucking heat?

I like the Asahi Linux guys' take on the M1: due to standard UEFI plus the lack of SMM interrupts, it's actually a bit better than PCs freedom-wise on everything but internal hardware selection. Until we get RISC-V SoCs of equivalent performance with libre bootloaders, it's not a bad choice. There's a reason OpenBSD targets it.

>Apple silicon is a profoundly inconvenient
This is the only right part of your post. The rest is just whining.

If Apple wants people to use their silicon, they have to MAKE A STANDALONE, REPLACEABLE PART WITHOUT THE BUILT-IN T2 SHIT.
A solid standalone ARM processor.
No proprietary coprocessors, no enforced TEE, compatible with socketed boards.
But they will not do it. Why? Because they are cheating. Their processor is shit, but it has tons of proprietary ASIC crap offloading work from the main CPU cores. It can't compete in raw strength against x86, especially against AMD.

>the lack of SMM interrupts
That means nothing when each SoC comes with a built-in TEE coprocessor you can't access, which can burn eFuses in your CPU if you don't behave like a good citizen. Worse than any AMD PSP or Intel ME, which at least weren't eFused.

Only if you try to do massive renders on the M2 Air models

It's a $2 fix

Attached: ad933ef5-e473-4603-9919-67472769202f.38fc6adeecc9a2c148f7930accef1d0c.jpg (768x768, 45.1K)

LINCELS ON SUICIDE WATCH

Attached: 1629093331486.jpg (374x334, 32.14K)

based

>geekbench
Opinion immediately discarded.

markdown sucks, which is all i need to know about this faggot

call me when Apple starts shipping laptops with removable SSDs and RAM sticks.

nobody wants that performance hit from peripheral interconnects. this is why Apple is the best. dorks that can't afford to buy a new laptop every 8 years should just stick with their chinese communist plastic shit

Holy shit, you're that anon who was seething hard as fuck about Apple "cheating" with their own chips by including dedicated circuitry for the common media tasks Macs get used for.

That was such a stupid fucking hill to die on, it's like saying "Ackshually your gaymen PC ISN'T good at rendering graphics, it's offloading all the work to a GPU and that's CHEATING! Lemme see it run real-time raytracing in software mode on JUST the CPU!"
If the Apple chip is faster than the equivalent x86 at common tasks, who gives a shit that it's using dedicated circuitry? The idea of moralising that CPUs must brute-force every task or it's cheating is fucking dumb.
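To illustrate the point (just a sketch, assuming macOS with the Accelerate framework; the matrix size and timing method are arbitrary, not a benchmark): the same matrix multiply done the "honest" way with a plain loop on the general-purpose cores, and the "cheating" way via cblas_sgemm, which Accelerate offloads to Apple's dedicated matrix hardware on their chips.

// Minimal Swift sketch: naive CPU matmul vs. Accelerate's BLAS.
// Assumes macOS on Apple silicon; numbers are illustrative only.
import Accelerate
import Foundation

let n = 512  // square matrix dimension
let a = (0..<n*n).map { _ in Float.random(in: 0..<1) }
let b = (0..<n*n).map { _ in Float.random(in: 0..<1) }

// "Brute force" path: plain triple loop on the CPU cores.
func naiveMatmul(_ a: [Float], _ b: [Float], _ n: Int) -> [Float] {
    var c = [Float](repeating: 0, count: n * n)
    for i in 0..<n {
        for k in 0..<n {
            let aik = a[i * n + k]
            for j in 0..<n {
                c[i * n + j] += aik * b[k * n + j]
            }
        }
    }
    return c
}

// "Cheating" path: hand the same work to Accelerate's sgemm,
// which the framework routes to dedicated matrix hardware.
func blasMatmul(_ a: [Float], _ b: [Float], _ n: Int) -> [Float] {
    var c = [Float](repeating: 0, count: n * n)
    cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                Int32(n), Int32(n), Int32(n),
                1.0, a, Int32(n), b, Int32(n),
                0.0, &c, Int32(n))
    return c
}

// Time both paths doing identical work.
func time(_ label: String, _ body: () -> [Float]) {
    let start = CFAbsoluteTimeGetCurrent()
    _ = body()
    print(label, CFAbsoluteTimeGetCurrent() - start, "s")
}

time("naive loop:") { naiveMatmul(a, b, n) }
time("Accelerate:") { blasMatmul(a, b, n) }

Same work, wildly different time, and the only difference is whether the chip's special-purpose silicon is allowed to do its job. Nobody calls that cheating when a GPU does the equivalent for graphics.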

been doing blender renders on m1 max, all good, no heat, no fan. m1 max > m2

street shit harder currynigger

Attached: 1654466028624.png (1017x3200, 1.32M)

You've been at it for 2 years, anon, let it go.