this post was submitted on 06 Sep 2023
PeterPoopshit@sh.itjust.works | 1 point | 1 year ago (last edited 1 year ago)

It seems self-hosted AI always needs at least 13 GB of VRAM. And any GPU with more than 12 GB of VRAM conveniently costs about $1k per GB for every GB beyond 12 (sort of like how any boat longer than 18 feet usually costs $10k per foot for every foot beyond 18 ft). There are projects that run everything on the CPU, but still, the AI GPU situation is bullshit.
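
For what it's worth, that "13 GB" figure falls out of simple arithmetic: the memory needed just to hold the weights is roughly parameter count × bits per weight. A minimal sketch (the 13B model size and the quantization levels are just illustrative, and real usage adds KV cache and framework overhead on top):

```python
# Rough VRAM estimate for loading model weights alone:
#   params (in billions) * bits per weight / 8 = gigabytes.
# Actual usage is higher: KV cache, activations, runtime overhead.

def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate GB of memory needed just for the weights."""
    return params_billion * bits_per_weight / 8

# A hypothetical 13B-parameter model at different precisions:
for bits in (16, 8, 4):
    print(f"{bits}-bit: {weight_vram_gb(13, bits):.1f} GB")
```

A 13B model at 16-bit needs about 26 GB for weights alone, 8-bit lands right around 13 GB (roughly where that figure comes from), and 4-bit quantization squeezes it to about 6.5 GB, which is why the CPU-only projects lean so hard on quantized models.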