Nvidia 4000 series

>tfw one month away from announcement

Are you adequately prepared to watch Nvidia BTFO AMD into irrelevance, permanently?

Attached: Nvidia 4000.jpg (602x316, 40.1K)

I'm more afraid of Nvidia BTFO'ing my PSU and power bill.

fpbp
/thread

>permanently
until November

amd has already surpassed nvidia

>800W
That's something that would keep your room warm in the dead of winter at -40

Can't wait to save 10% on a literally-worse-in-every-way alternative.

>aka the beast

Attached: 1620188248273.png (1200x1125, 262.13K)

I use Linux because I'm not a complete slave like you OP. Why would I give a single solitary shit about a company releasing yet another card with no adequate driver support for my superior operating system? I don't care about how your Microshit performance compares to AMD. Go back to eating your goyslop.

Attached: 1657833567697.jpg (1080x1331, 81.24K)

>Linuxtard shows up to shill OS

Never fails.

Attached: re wt cat 1657951563245.jpg (826x871, 124.67K)

780 ti still works fine.

The power consumption of the Ti doesn't seem right...

You know what... That's not a bad card for older 1080p stuff. Not at all. However, the 980 Ti is really fucking nice. I lucked out and bought one in February 2020 for less than $150. It doesn't even run all that hot.

I don't like spending money

>800W
Seems like you'll need a 1000W power supply just for the GPU
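
Rough napkin math on the whole system, assuming the 800W rumor is the full board power and the rest is a typical high-end build; every number below is an assumption, not a confirmed spec:

```python
# PSU sizing sketch with assumed numbers (nothing here is a confirmed 40-series spec).
gpu_watts = 800        # rumored top-end board power from the leak
cpu_watts = 250        # high-end CPU under load (assumption)
rest_watts = 75        # board, RAM, fans, drives (assumption)
headroom = 1.2         # ~20% margin so the PSU isn't pinned at its limit

total_load = gpu_watts + cpu_watts + rest_watts    # 1125 W
recommended_psu = total_load * headroom            # 1350 W
print(f"Load ~{total_load}W, recommended PSU ~{recommended_psu:.0f}W")
```

If anything, 1000W looks optimistic once the CPU and headroom are counted.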

Ah yes, except I got one for 70€ two years ago. Yes, it has CPU fans strapped to it because the original fans died, but it still runs.

Can Nvidia stop releasing housefires?

no. my 3080 will be fine for a while. might be kind of interested in the 50xx series. might just go AMD next time though

What's a nice used 1080ti going for these days?

Around $1500

It's not. That's for development purposes.

Literally zero reason to upgrade from 1070ti

Yeah, I think I'll stay with my 120W card... This is just insane

144fps

meme. anything 80+fps is fine

I got it for 30 dollars at a yard sale lol.

who cares about new video cards in a world where every game is bottlenecked by cpus

what exactly is the purpose of more and more powerful and expensive GPUs when there are no new games that require them? I mean, the 3080 is already overkill for any game on the market right now and a 2080 still costs well over $1k.

There is literally no reason to upgrade your GPU, unless you do VR.
If you upgrade GPU every new generation and you only play pancake games, you are fucking retarded.

Attached: 1367545029181.jpg (2400x1800, 498.57K)

>4000 series
>relevant
HAHAHAHAHAHAHHAHAHAHAHHAHHAHAHA

Attached: 1584309799067.jpg (1440x1153, 408.38K)

Wow, and here I thought I scored with a GTX 750 Ti that was not installed correctly in an old Dell that I bought for twenty bucks yesterday.

>he doesn't want to run RDR2 at 4K 165hz SSAA 16xxxx

>160-bit GDDR5

Attached: bean.jpg (375x416, 21.85K)

And a 1070 Ti can't do that either.
Try running Cyberpunk at that frame rate.

>when there are no new games that require it
So rich fucks can play 4k + RT.

more like btfo my 30 series investment.

is anyone else thinking of just giving NVIDIA up and going AMD the next time they build? I don't really care about ray tracing and 4K native does not seem that taxing to run for the majority of games I play. Probably not doing a new build until 2025 at this rate though.

Couldn't care less, developers will still make half-assed ports that'll tank any performance upgrade these GPUs give.

guys I'm still waiting to buy a graphics card since 2012!
>buying 900 series? wait for 10xx
>lmao buying 10xx? 20xx series are around the corner
>30xx when 40xx is LITERALLY coming
I propose you consoomers kill yourselves

Chiller bros WW@?

can't wait for retards to spend $3k on minimal gains yeah@@@!!!!

linuxfags are a special kind of retarded. they spend all their free time shilling their os but at the same time literally cry and whine when distros make it more accessible to windows people. it's like, what do they want

>playing cyberpunk

I know what you're saying. Every once in a while I want to play a newer game, but usually I only play older games and it feels stupid to have a three thousand dollar computer. It's just weird growing up in the mid nineties and early 2000s, when you constantly needed to upgrade, and now a gaming computer can last nearly a whole decade. It's mind boggling.

Attached: 1660059308411159.jpg (1024x1024, 57.96K)

>300W for the xx70
Jesus

It's so funny how stupid these goyim are. Get an insider trading tip about Nvidia getting a shitload of US gov money to make their own US-based foundry
>Nancy Pelosi's husband buys $5 million in shares before anyone even knows this is going on
>too much heat, OMG SELL IT QUICK THEY KNOW!
>lose $5 million by dumping it at a loss

fucking retarded boomers LMAO that's what you get for being greedy subhumans

Attached: Pelosi's husband dumps Nvidia stock as House eyes chip bill.png (1027x1102, 1.03M)

going from a 2070s to a 4070, for 4k purposes. this is going to be amazing.

RT is unironically shit in the majority of the games. I do not understand why anyone cares about it. Even old cards run 4K just fine for most things, and even then custom res like 1800p exists.

>300 watts
>10GB VRAM
>GDDR5
>rumored to be 160-bit bus
Why are they gimping the 4070? I'm not buying the 40 series, but if I were, it would be the 4070. That's such a slap in the face to the mid range.
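
For anyone wondering why the bus width is the part people call gimping, the back-of-the-envelope bandwidth math looks like this; the 4070 rows are just the rumors from this thread, only the 3070 numbers are a real shipped spec:

```python
# Memory bandwidth in GB/s = bus width (bits) / 8 * effective data rate (Gbps).
# The 4070 rows are rumors/assumptions from this thread; the 3070 row is the real card.
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

configs = {
    "rumored 4070, 160-bit GDDR5 @ 8 Gbps":   bandwidth_gb_s(160, 8),   # 160 GB/s
    "rumored 4070, 160-bit GDDR6X @ 21 Gbps": bandwidth_gb_s(160, 21),  # 420 GB/s
    "3070, 256-bit GDDR6 @ 14 Gbps":          bandwidth_gb_s(256, 14),  # 448 GB/s
}
for name, bw in configs.items():
    print(f"{name}: {bw:.0f} GB/s")
```

Even if the GDDR5 part of the rumor is wrong and it ships with much faster memory, a 160-bit bus still lands below the 3070's 448 GB/s, which is why it reads as a mid-range downgrade.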

How about releasing actual good games? What's the point of buying unnecessary cards every year? This is becoming like the cod/fifa of tech

>plays games on medium low, 1080p lucky to hit 60fps

Developers need better hardware to make gooder game dumb dumb smelly.

get mad, get sad
I don't play AAA, like you consoomer shills.

>needing anything beyond a 3080 max
it doesn't even struggle at 4K for modern gaming unless you need to turn on ultra RT and play at 144fps. you realize devs aren't targeting anything beyond PS5 (2060 equiv) power, right?

The x070 has always gotten nerfed. Chads know to wait for the x060 (Ti).

the entire US is founded on it. it's the only first world country to have such a system

Attached: 1650983305322.png (660x250, 124K)

the thing is, all of these extremely hardware-demanding games are terribly optimized and don't run well on any hardware. Last year we had cases of 3090s literally going up in flames when people played New World and Diablo 2 Resurrected on them. These super high-end GPUs are usually the ones devs test their games on the least, so they run into the most serious problems. On a 1080 you can't play a game at 4K 160Hz, but at least you don't have to be afraid that the GPU will literally explode.

Is this the cope thread where people with 1080 and 2080 tell each other their hardware is still jusasgood?
Yeah this is the place.

This.
3060ti was the thinking man's choice this gen.

LOL what the fuck, 800W. that is utterly absurd, fuck your wallet and fuck your power bill to make the few games that can even take advantage of this hardware look slightly better. nvidia paypigs are insane.
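
On the power bill point, a rough cost sketch assuming 4 hours of gaming a day and $0.15/kWh; both numbers are made up and rates vary a lot by region:

```python
# Rough electricity cost for the GPU alone; hours and rate are assumptions.
gpu_watts = 800
hours_per_day = 4
rate_usd_per_kwh = 0.15       # varies a lot by region

kwh_per_month = gpu_watts / 1000 * hours_per_day * 30   # 96 kWh
cost_per_month = kwh_per_month * rate_usd_per_kwh       # ~$14.40
print(f"~{kwh_per_month:.0f} kWh/month, ~${cost_per_month:.2f}/month just for the card")
```

Annoying, but the bigger issue is that all 800W ends up as heat dumped into the room.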

he unironically deserves to be arrested for this shitty coverup. I hate politicians

Except the 3070 is still better.