Sjmarf@sh.itjust.works to Lemmy Shitpost@lemmy.world · 29 days ago
How to clean a rescued pigeon (sh.itjust.works)
Hnery@feddit.org · 28 days ago
llama3 is not bad and you can easily run the smaller ones on an average desktop cornfuser
PolarisFx@lemmy.dbzer0.com · 28 days ago
But slowly. I filled my home server with whatever CUDA-capable cards I had, and it's fine for SD, but I found llama way too slow. I rented a dual A2000 instance for a couple of weeks and it was bearable, but still not great.
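Neither comment shows how the smaller models actually get run on a desktop. A minimal sketch, assuming llama-cpp-python and a locally downloaded quantized Llama 3 8B GGUF file (the model path below is hypothetical), might look like this:

```python
# Minimal local-inference sketch with llama-cpp-python (pip install llama-cpp-python).
# The model path is hypothetical; any quantized Llama 3 8B GGUF file would do.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to a CUDA GPU if present, otherwise run on CPU
)

out = llm(
    "Q: How do I clean a rescued pigeon?\nA:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"].strip())
```

A 4-bit quant of an 8B model needs roughly 5-6 GB of RAM or VRAM, which is about where the "average desktop" claim holds; larger models or higher-precision quants are where the too-slow experience described above tends to start.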