xardoniak

joined 1 year ago
[–] xardoniak@alien.top 1 points 11 months ago

I use a P4000 to transcode Plex content. The GPU uses less power than maxing out my CPU. I also have the GPU available in Kasm, though I'm not sure that actually works, as I mostly use Kasm as an SSH jumpbox or for access to an unfiltered Chrome session at work.
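If you want to confirm that transcodes are actually landing on the card (and see what it's drawing while they run), a quick nvidia-smi query does the job. A minimal sketch, not part of my setup, and the exact query field names can vary a bit between driver versions:

```python
#!/usr/bin/env python3
"""Rough check that Plex transcodes are hitting the GPU, and what it costs in watts.

Assumes nvidia-smi is on PATH; query field names may differ by driver version.
"""
import subprocess

def gpu_snapshot():
    # One call for GPU utilisation, power draw and NVENC (encoder) session count.
    out = subprocess.run(
        [
            "nvidia-smi",
            "--query-gpu=utilization.gpu,power.draw,encoder.stats.sessionCount",
            "--format=csv,noheader,nounits",
        ],
        capture_output=True,
        text=True,
        check=True,
    ).stdout
    # First line only (first GPU); fields come back comma-separated.
    util, power, sessions = [f.strip() for f in out.strip().splitlines()[0].split(",")]
    return int(util), float(power), int(sessions)

if __name__ == "__main__":
    util, power, sessions = gpu_snapshot()
    print(f"GPU util: {util}%  power draw: {power} W  NVENC sessions: {sessions}")
```

Run it during a transcode: if the NVENC session count is non-zero, Plex is using the card rather than the CPU.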

When I upgrade my CPU, I plan to put my spare 1650 in the host and spin up a Windows gaming VM for guests and remote play.

[–] xardoniak@alien.top 1 points 1 year ago

I used to use Nginx Proxy Manager for exposing services, but you generally end up exposing each app's own login page, with a different login per app, which is a pretty shitty solution for non-IT folk. I've tried to set up Authelia and similar tools and found them very annoying to set up and configure. Maybe I'm just an idiot though!

I would suggest having a think about what you actually want to expose and whether there's a better way to do it (e.g. Overseerr instead of exposing Radarr/Sonarr directly).

Cloudflare Tunnels are also great: they hide your public IP and can put a login page (Cloudflare Access) in front of your services. You provide a list of email addresses that are allowed in, and only those users can reach the site. I have mine set up to authenticate through Google accounts, for example, but you can use GitHub, Office and I believe Discord as well. Not managing user accounts has been a life saver for me. You can also block access from outside your country.
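If the app behind the tunnel should double-check that a request really went through Cloudflare's login, rather than just trusting the network path, it can verify the JWT that Access attaches to each request. A minimal sketch assuming PyJWT (`pip install pyjwt[crypto]`); the team domain and AUD tag below are placeholders you'd take from your own Zero Trust dashboard:

```python
"""Verify the Cloudflare Access JWT on requests reaching an app behind a tunnel."""
import jwt  # PyJWT

TEAM_DOMAIN = "https://yourteam.cloudflareaccess.com"  # placeholder
POLICY_AUD = "your-application-aud-tag"                # placeholder
_jwks = jwt.PyJWKClient(f"{TEAM_DOMAIN}/cdn-cgi/access/certs")

def verify_access_token(token: str) -> str:
    """Return the authenticated email, or raise if the token is invalid.

    `token` is the value of the Cf-Access-Jwt-Assertion header that Cloudflare
    adds to the request after the user passes the Access login page.
    """
    signing_key = _jwks.get_signing_key_from_jwt(token)
    claims = jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=POLICY_AUD,
    )
    return claims["email"]
```

That way, even if someone reaches the origin some other way, the app still only trusts requests that Cloudflare has already authenticated.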

[–] xardoniak@alien.top 1 points 1 year ago

The main advantage of a laptop in a homelab is the "built-in UPS", but they're not designed to be left on 24/7.

Realistically though, that's pretty old hardware. Do you know what its power consumption is like? It may work out cheaper over six months to buy a Raspberry Pi or thin client and use that instead.
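As a rough back-of-the-envelope comparison, something like the following, where every wattage, price and tariff is a placeholder assumption to replace with your own numbers:

```python
# Rough running-cost comparison: old laptop vs a low-power replacement.
# All figures below are assumptions; plug in your own measurements and tariff.

PRICE_PER_KWH = 0.30       # currency per kWh (assumed)
HOURS = 24 * 30 * 6        # six months of 24/7 uptime

def cost(watts: float) -> float:
    """Electricity cost of running a device continuously for six months."""
    return watts / 1000 * HOURS * PRICE_PER_KWH

laptop_watts = 45          # old laptop as a server (assumed)
pi_watts = 6               # Raspberry Pi under light load (assumed)
pi_price = 80              # up-front cost of Pi + PSU + SD card (assumed)

saving_per_month = cost(laptop_watts - pi_watts) / 6
print(f"Laptop, 6 months electricity: {cost(laptop_watts):.2f}")
print(f"Pi, 6 months electricity:     {cost(pi_watts):.2f} (+{pi_price} up front)")
print(f"Pi pays for itself after roughly {pi_price / saving_per_month:.1f} months")
```

The break-even point depends entirely on what the laptop actually draws at the wall, which is why measuring it first is worth doing.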

[–] xardoniak@alien.top 1 points 1 year ago

I use Uptime Kuma to monitor individual services and Netdata for server performance, and I pipe the alerts through to Pushover.
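Uptime Kuma and Netdata both have Pushover as a built-in notification target, but for one-off scripts an alert is just a single HTTP POST to the Pushover API. A minimal sketch with placeholder keys, assuming the `requests` library:

```python
"""Send a Pushover notification for ad-hoc alerts outside Uptime Kuma/Netdata."""
import requests

PUSHOVER_TOKEN = "your-app-token"  # placeholder: register an app at pushover.net
PUSHOVER_USER = "your-user-key"    # placeholder: your Pushover user key

def push(title: str, message: str, priority: int = 0) -> None:
    """Send a single notification via the Pushover messages API."""
    resp = requests.post(
        "https://api.pushover.net/1/messages.json",
        data={
            "token": PUSHOVER_TOKEN,
            "user": PUSHOVER_USER,
            "title": title,
            "message": message,
            "priority": priority,
        },
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    push("homelab", "Example alert: backup job finished with errors")
```

Handy for cron jobs and backup scripts that sit outside the monitoring stack.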