I would like to buy a computer to run Stable Diffusion and text models like GPT-J and OPT locally. That would cost me about $2500, which is too expensive.
I can already run these models on my laptop, but it takes 2 minutes to generate a single image (on my 4GB GPU) and 6 minutes to generate text (running GPT-J on the CPU).
Is there an alternative? Some optimization that I am unaware of? A cheap cloud GPU provider? Should I buy an old GPU like the Tesla K80?
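
For what it's worth, a minimal sketch of the kind of low-VRAM optimization being asked about, assuming the Hugging Face diffusers library and a CUDA GPU (the model ID and prompt are just placeholders): half-precision weights plus attention slicing is usually what lets Stable Diffusion fit on a 4GB card.

```python
# Minimal sketch: Stable Diffusion on a low-VRAM (~4GB) card with the
# diffusers library. Assumes torch and diffusers are installed and the
# model license has been accepted on the Hugging Face hub.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    torch_dtype=torch.float16,   # half precision roughly halves VRAM use
).to("cuda")
pipe.enable_attention_slicing()  # trades a little speed for much less VRAM

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("out.png")
```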


Other urls found in this thread:

ebay.com/itm/255566366417
lambdalabs.com/gpu-benchmarks

wait for the 4000 series and buy a 3090 Ti

I'm running a 12700F and a 3060 12GB and it does images in like 10 seconds.

Will it get that much cheaper?

you could always stop being a worthless coomer

For Stable Diffusion that would be fine, but GPT-J requires at least 16GB of VRAM.
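
For reference, a sketch of what loading GPT-J in half precision looks like with the transformers library (the model ID is the public EleutherAI checkpoint; the prompt is a placeholder). Six billion parameters at 2 bytes each is roughly 12GB of weights alone, which is why 16GB is the usual recommendation once activations are counted.

```python
# Sketch: GPT-J-6B in fp16 with transformers. The weights alone are
# ~12GB (6B params x 2 bytes); a 16GB card leaves headroom for
# activations and the KV cache during generation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    revision="float16",          # fp16 branch of the checkpoint
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,      # avoids materializing a full fp32 copy in RAM
).to("cuda")

ids = tok("The cheapest way to run a 6B model is", return_tensors="pt").to("cuda")
out = model.generate(**ids, max_new_tokens=40)
print(tok.decode(out[0], skip_special_tokens=True))
```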

I recently bought two K80s. They haven't arrived yet, but I'm looking forward to trying to train with them (24GB VRAM each, so 48GB in theory). They're very cheap, under $100 each used. You need to buy a fan mod, since server GPUs don't come with fans.

I'm considering doing just that if they get below $650. But I doubt they will go that low.

On AliExpress they cost about $370 each. I guess you're in the US; outside the US everything is much more expensive.

>coomer
Being a porn addict puts you into special protection programs headed by 4chan moderation.

ebay.com/itm/255566366417

Just get it shipped to a freight forwarder that'll send it on to your country. Some sellers ship internationally too.

That's a good idea. What kind of motherboard supports a K80 (or two)? Does it draw too much power?

>A cheap cloud GPU provider?

With Paperspace Pro ($8/month, the cheapest plan) it takes about 2 seconds per image for SD (voldy's webUI version) on standard settings.
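
If you want to verify a number like that yourself, a rough timing sketch, assuming a diffusers pipeline already loaded as `pipe` (as in the earlier snippet) rather than voldy's webUI, so the numbers won't match exactly:

```python
# Rough timing sketch; assumes a diffusers pipeline is already loaded
# as `pipe`. The first call includes one-off allocation, so warm up
# before measuring.
import time
import torch

pipe("warmup prompt")              # one-off warmup
torch.cuda.synchronize()
start = time.perf_counter()
pipe("a lighthouse at dusk")
torch.cuda.synchronize()           # wait for the GPU to actually finish
print(f"{time.perf_counter() - start:.1f}s per image")
```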

What GPU do they offer? Can you keep your files or do you have to reinstall everything every time like on colab?

You can plug them into any PCIe slot. They draw around 300 watts, I think, so any reasonable motherboard and PSU should be able to drive one.

The RTX 5000 and RTX A4000 come free with the Pro plan, among other GPUs; refer to the benchmarks at lambdalabs.com/gpu-benchmarks. No, you don't have to reinstall everything, and you can set your own timeout anywhere from 1 to 6 hours, unlike Colab, which shuts down for inactivity. They also offer better GPUs, such as the A100, on the higher plan or on a cost-per-hour basis.

Will Stable Diffusion stress and burn out a GPU like crypto mining does?

if you're constantly producing images, yes
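
If that worries you, you can watch the same temperature and power counters nvidia-smi reports while a job runs; a sketch using the pynvml bindings (the nvidia-ml-py package; the polling loop here is arbitrary):

```python
# Sketch: poll GPU temperature and power draw while generating, via the
# pynvml bindings (pip install nvidia-ml-py). Same counters nvidia-smi shows.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
for _ in range(10):
    temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    power = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # mW -> W
    print(f"{temp} C, {power:.0f} W")
    time.sleep(1)
pynvml.nvmlShutdown()
```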

I have read that this GPU is actually made of two 12GB GPUs internally. Do you think it would still be able to load, say, a 16GB PyTorch model?
That sounds very good, maybe even too good. How do they profit from this?
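
On the two-GPU question: a K80 enumerates as two separate 12GB CUDA devices, so a single 16GB model can't sit on one of them; it has to be split across both. A sketch of the usual layer-wise split with transformers plus the accelerate package (the GPT-J checkpoint here just stands in for any model too big for one die):

```python
# Sketch: a K80 shows up as two 12GB CUDA devices, so a model larger
# than 12GB must be split layer-wise across both. device_map="auto"
# (requires the accelerate package) does that placement automatically.
import torch
from transformers import AutoModelForCausalLM

print(torch.cuda.device_count())   # a single K80 should report 2

model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",         # stands in for any >12GB checkpoint
    torch_dtype=torch.float16,
    device_map="auto",             # spreads layers over cuda:0 and cuda:1
)
print(model.hf_device_map)         # which layers landed on which device
```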

>I would like to buy a computer to run Stable Diffusion and text models like GPT-J and OPT locally. That would cost me about $2500

a 3070 rig costs $1500 total