this post was submitted on 14 Jul 2023
642 points (95.6% liked)

The fanbase is still large, but the Lemmy community hasn't quite caught up yet, and now there is a transitional period where the audience is smaller.

[–] Bilbo@hobbit.world 2 points 1 year ago (2 children)

Fair enough. I'll look into automating it using some sort of storage from another provider.

[–] JickleMithers@kbin.social 2 points 1 year ago

Backblaze is fairly cheap but can be slow to get data from.
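
If you go that route, something like rclone makes the upload easy to script. A rough sketch (the remote and bucket names are placeholders, and it assumes you've already run rclone config to set up a B2 remote):

# Hypothetical sketch: push local backup zips to a Backblaze B2 bucket with rclone.
# "b2" is an rclone remote you've configured; "my-lemmy-backups" is a placeholder bucket name.
rclone copy ~/backups b2:my-lemmy-backups --transfers 4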

[–] Dave@lemmy.nz 1 points 1 year ago (1 children)

Even just a cronjob or scheduled task to download the backups to a machine at another location would be a big improvement. Then you can do it far more often because it's automated.

But personally I like to have both a copy on a PC and a cloud backup, in addition to the server.
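
A minimal sketch of the pull side, run on the machine at the other location (user, host, and paths are all placeholders; assumes SSH key auth is already set up):

#!/bin/bash
# Pull the backup files down from the server over SSH.
rsync -avz me@lemmy.example.com:backups/ /home/me/lemmy-backups/

# Example crontab entry to run it nightly at 04:00:
# 0 4 * * * /home/me/pull-lemmy-backups.sh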

[–] Bilbo@hobbit.world 1 points 1 year ago (2 children)

I'm using the Lemmy easy-deploy script to run the Docker instance. How do I take a backup of a running Docker instance?

The backups I've done so far are full server backups, but I don't have a way to automate that.

[–] Dave@lemmy.nz 2 points 1 year ago

The page here explains getting a database dump on a running instance (and how to restore): https://join-lemmy.org/docs/administration/backup_and_restore.html

Then just back up the other files in the volumes directory where Lemmy is installed (everything except the postgres folder, which is covered by the database dump).

The pictrs volume includes both the uploaded images and the image cache. I have no idea how to separate out the uploaded images so you don't have to back up the cache, I just back it all up.
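
Roughly, the commands from that page look like this (double-check the docs for the exact steps; the service name postgres and DB user lemmy assume the default docker-compose.yml, and you may need docker-compose instead of docker compose depending on your install):

# Dump the database out of the running postgres container:
docker compose exec -T postgres pg_dumpall -c -U lemmy > lemmy_dump.sql

# Restore later by piping the dump back into psql:
cat lemmy_dump.sql | docker compose exec -T postgres psql -U lemmy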

[–] Die4Ever@programming.dev 1 points 1 year ago (1 children)

this is the bash script I use to create backups

#!/bin/bash
# https://join-lemmy.org/docs/administration/backup_and_restore.html#a-sample-backup-script
now=$(date +"%Y-%m-%d_%H.%M.%S")

# Dump the database from the running postgres container
cd ~/lemmy && (docker-compose exec -T postgres pg_dumpall -c -U lemmy 1> dump.sql 2> dump.errors)
# Zip the dump plus the volumes, skipping the raw postgres data directory
cd ~/lemmy && zip -r9 ~/bak-lemmy-$now.zip ./ --exclude "volumes/postgres/*"
# Remove the dump now that it's inside the zip
rm -f ~/lemmy/dump.sql

It produces very small zip files, so it's very efficient.

I set up a cron job to run it every 3 hours, like this:

0 */3 * * * ~/lemmy/backup.sh
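
For anyone copying this: make the script executable and add that line to your user crontab, e.g.:

chmod +x ~/lemmy/backup.sh   # make the backup script executable
crontab -e                   # then paste the schedule line above into your crontab
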
[–] Bilbo@hobbit.world 1 points 1 year ago (1 children)

I figured out how to do this with the docker container commands directly, but that's not ideal for a script.

Using docker compose, it just fails with: Service "postgres" is not running container #1

I can see lemmy-easy-deploy if I do: docker compose ls

The service name is postgres in the docker-compose.yml file. Any idea what the issue might be?
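
For reference, this is roughly what I mean by going through the container directly (just a sketch; it assumes the postgres container's name contains "postgres", which you can check with docker ps):

# Rough sketch of the direct-container workaround
docker exec "$(docker ps -qf name=postgres)" pg_dumpall -c -U lemmy > dump.sql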

[–] Die4Ever@programming.dev 0 points 1 year ago (1 children)

Where is this lemmy-easy-deploy? I haven't seen it before; maybe if I read how it works I can figure out what's wrong.

[–] Bilbo@hobbit.world 0 points 1 year ago (1 children)
[–] Die4Ever@programming.dev 1 points 1 year ago (1 children)

I think you might just need to change the cd commands so they go into the directory where the active docker-compose.yml file is, which should be the folder called live.
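
In other words, the cd lines in the backup script above would become something like this (the ~/lemmy-easy-deploy path is just a guess on my part, and the volumes layout inside live/ may differ):

# Hypothetical adjustment for a Lemmy-Easy-Deploy setup cloned to ~/lemmy-easy-deploy
cd ~/lemmy-easy-deploy/live && (docker compose exec -T postgres pg_dumpall -c -U lemmy 1> dump.sql 2> dump.errors)
cd ~/lemmy-easy-deploy/live && zip -r9 ~/bak-lemmy-$now.zip ./ --exclude "volumes/postgres/*"
rm -f ~/lemmy-easy-deploy/live/dump.sql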

[–] Bilbo@hobbit.world 1 points 1 year ago

Sadly, no. That's already where I was running it.