this post was submitted on 05 Dec 2023
52 points (100.0% liked)

i got sick again, so the financial update and this thread are both late. i'll get the financial update up at a later point, or i might just combine it with january's since there's not much to report as far as i can tell

[–] MangoKangaroo@beehaw.org 2 points 11 months ago (1 children)

Thank you for your kind words. Every day gets brighter.

For homelab, I'm not 100% sure yet. At minimum I'm going to get a Synology NAS to replace my ancient Lenovo EMC2. I really wanted to get some hardware for running Llama 2 and KoboldCpp, but I'm struggling to find something that's equal parts quiet (I live in a studio), affordable(ish), and up to the minimum specs I'd need. I was unironically considering a Mac Mini with a rack converter because of the energy efficiency and powerful iGPU, but sadly they only ship with up to 32GB of RAM. Since my reading suggests I'd want at least 64GB of RAM for Llama 2's 70B version, I'll have to do things some other way. I just wish I didn't live in a studio so I could grab a secondhand rackmount server without worrying about noise levels. 😭
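(For a rough sense of why 32GB doesn't cut it: a common back-of-envelope estimate is weight memory ≈ parameters × bits-per-weight ÷ 8, plus some runtime overhead. The ~20% overhead factor below is my own rough assumption for KV cache and buffers, not a measurement.)

```python
# Back-of-envelope RAM estimate for running a 70B-parameter model locally.
# Assumption: weights dominate; overhead (KV cache, buffers) adds roughly 20%.

def model_ram_gib(params_billions: float, bits_per_weight: float,
                  overhead: float = 0.2) -> float:
    """Rough memory footprint in GiB for a quantized model."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 2**30

for bits in (16, 8, 4):
    print(f"70B @ {bits}-bit: ~{model_ram_gib(70, bits):.0f} GiB")
```

Even at 4-bit quantization that works out to roughly 39 GiB, so a 32GB machine is out before you've loaded the weights.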

[–] silentdanni@beehaw.org 2 points 11 months ago (1 children)

I have the same problem; my flat is only about 50 sqm. Judging by the way things are going, I think there's a chance Nvidia will release some consumer-grade hardware meant for LLMs in the near-ish future. Until they reveal their next lineup, I'm just sticking to the cloud for running LLMs, even though it may seem like a poor financial decision.

I'm also hoping to get my hands on some Raspberry Pis. I'd like to build a toy k3s cluster at some point and maybe run my own Mastodon instance. :)

[–] MangoKangaroo@beehaw.org 1 point 11 months ago* (last edited 11 months ago)

Well, at least I'm not the only one whose homelab ambitions are being crushed by their apartment layout. I think I'm going to end up with a 2U compute rack, which means I'll probably limp along on one or two low-profile consumer GPUs. Now if only I could work out the details of the actual rack server hardware...

A Raspberry Pi cluster is interesting! My only real exposure to Pis in a homelab was an old 1B I used for Pi-hole. It was great right up until it stopped working.