
Can you please share your backup strategies for Linux? I'm curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

[–] oscardejarjayes@hexbear.net 3 points 1 month ago* (last edited 1 month ago)

restic, to a local server and to cloud storage. It varies by device, but usually just everything in /home/. The rest of the operating system should be reproducible from the information in /home/, whether through images, Ansible, Nix, or Guix.

Scheduling is usually done through systemd timers (or the non-systemd equivalent). I use Backblaze now, but I switch around occasionally. restic has policy-based snapshot removal and a prune option.
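
For reference, a minimal sketch of what that systemd scheduling could look like as user units; the unit names, repository, and retention policy here are assumptions, not the commenter's actual config:

```bash
# Sketch: nightly restic backup driven by a systemd user timer.
# Repository, password file, and retention values are illustrative.
mkdir -p ~/.config/systemd/user

cat > ~/.config/systemd/user/restic-home.service <<'EOF'
[Unit]
Description=restic backup of the home directory

[Service]
Type=oneshot
Environment=RESTIC_REPOSITORY=sftp:backuphost:/srv/restic
Environment=RESTIC_PASSWORD_FILE=%h/.config/restic/password
ExecStart=/usr/bin/restic backup %h
# Policy-based snapshot removal, then prune unreferenced data
ExecStartPost=/usr/bin/restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune
EOF

cat > ~/.config/systemd/user/restic-home.timer <<'EOF'
[Unit]
Description=Run restic-home.service daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
EOF

systemctl --user daemon-reload
systemctl --user enable --now restic-home.timer
```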

[–] b34n5@hexbear.net 3 points 1 month ago

I only make backups occasionally. I keep the configuration files of my systems on GitHub and Codeberg. The rest I don't need; the only things I keep are books and music downloaded from the internet, which live on a 1 TB external hard drive.

When I have made a backup for a specific reason, I have done it with rsync, a command-line tool that works quite well.
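
A one-off copy of the kind described might look like this with rsync (paths are illustrative):

```bash
# Mirror books and music to an external drive. -a preserves
# permissions and timestamps, --delete propagates removals,
# -vh makes the output human-readable. Paths are assumptions.
rsync -avh --delete ~/books ~/music /run/media/user/external-1tb/
```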

[–] twinnie 3 points 1 month ago (2 children)

I use OneDrive. I know people will hate on it, but it's cheap and works on everything (well, on Linux it takes a third-party tool). If I care about it, it goes in OneDrive; otherwise I don't need it that much.

[–] cmlael67@lemmy.world 2 points 1 month ago

May I ask why you prefer that over Google Drive, or others such as Dropbox or Mega? I used it extensively when I used Windows, but that's been several years.

[–] mm_maybe@sh.itjust.works 1 points 1 month ago

May I ask which third-party tool you use? I'm using onedriver, and it's pretty unreliable in my experience.

[–] LemmyBe@lemmy.world 3 points 1 month ago* (last edited 1 month ago)

I use BlueBuild to create a reproducible system, plus a post-install script to handle the remaining tasks, such as setting up initial preferences.

Also Vorta, to back up files and settings to an external HD, plus the OneDrive Linux client to sync files and settings to the cloud.

[–] phoenixz@lemmy.ca 3 points 1 month ago

The main drive is a 1 TB, very fast M.2 device; the backup drive is an 8 TB platter drive with btrfs.

A bunch of scripts I wrote myself copy all the important stuff to the platter drive every night using rsync, then make a snapshot named with the current date. Since it's all copy-on-write, I have daily backups going back about three years now. Some extra scripts clean up the older backups, thinning the frequency to once a week after a year and once every four weeks after two years.

I have similar solutions for my servers, where I rsync the backups over the internet.
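
A stripped-down sketch of that kind of nightly rsync-then-snapshot job, assuming the backup drive is btrfs mounted at /mnt/backup and /mnt/backup/current is a subvolume (layout and paths are guesses, not the commenter's actual scripts):

```bash
#!/bin/sh
# Nightly: mirror the important stuff, then take a read-only,
# dated copy-on-write snapshot. Paths are assumptions;
# /mnt/backup/current must be a btrfs subvolume.
set -e
rsync -a --delete /home/ /mnt/backup/current/home/
rsync -a --delete /etc/  /mnt/backup/current/etc/
btrfs subvolume snapshot -r /mnt/backup/current \
    "/mnt/backup/snapshots/$(date +%F)"
```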

[–] hallettj@leminal.space 3 points 1 month ago

When I researched this previously, I concluded that there are two very good options for regular backups: Borg and Restic. Both are especially efficient because each run stores only what has changed since the last backup, so you get snapshots of your filesystem state at each backup point without using a huge amount of space. You can mount any snapshot as a virtual directory. After the initial backup, incremental backups take a minute or two.

I use Borg, and I back up to cloud storage on Borgbase. I use Vorta as a GUI for Borg. I have Vorta start automatically when I start my window manager, and I have it set up for daily backups. I set up the same thing on my kid's computer.

I back up my home directory, excluding some directories like ~/.cache and Steam's data directory. I use Baobab to find large directories that I don't want backed up.

I use the "exclude caches" option in the Borg "create archive" settings. That automatically excludes Rust target/ directories because they follow the Cache Directory Tagging Specification. Not all programming languages' tooling follows that spec so I also use directory name pattern excludes. For example I have an exclude pattern for .*/node_modules/.*

I use NixOS, and I keep my system config in a git repo so I don't need backups for anything outside my home directory.

[–] ikidd@lemmy.world 3 points 1 month ago

Keep everything on Nextcloud and back that up via Proxmox Backup Server.

Nuking and paving, then reconfiguring Plasma and installing the NC client, takes me less time than bothering to back anything up directly.

[–] neo@hexbear.net 2 points 1 month ago (1 children)

Pika Backup for /home/ to an external drive. It's an automatic solution with a simple GUI that, iirc, serves as a front end to Borg. It lets you easily browse and mount old backups. Anything outside of my actual personal files can be recreated or restored trivially, so I don't care to back it up.

I also have a manual dump of /etc/, but I change it so infrequently that it doesn't really need looking after.

[–] shadowtofu@discuss.tchncs.de 2 points 1 month ago

I use syncthing to sync almost everything across my computer, laptop (occasional usage), server (RAID 1), old laptop (powered up once every month or so), and a few other devices (which only get a small subset of my data). On the computer, laptop, and server I have btrfs snapshots (snapper). Overall this works very well; I always have 4+ copies of my data in 2+ geographical locations.
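
On the snapshot side, a minimal snapper setup could look like the following (config name and timeline limits are invented):

```bash
# Create a snapper config for /home; its timeline timers then take
# periodic snapshots automatically. Limits below are assumptions.
sudo snapper -c home create-config /home
sudo snapper -c home set-config TIMELINE_LIMIT_DAILY=7 TIMELINE_LIMIT_WEEKLY=4
sudo snapper -c home list   # inspect existing snapshots
```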

[–] GustavoM@lemmy.world 2 points 1 month ago

.dotfiles on github

Big/critical files on an external HD

simple as

[–] traches@sh.itjust.works 2 points 1 month ago* (last edited 1 month ago)

Software & Services:

Destinations:

  • Local Raspberry Pi with an external HDD, running the restic REST server
  • RAID 1 NAS at my parents' house, connected via Tailscale, also running the restic REST server

I've been meaning to set up a drive rotation for the local backup so I always have one offline in case of ransomware, but I haven't gotten to it.

Edit: For the backup set I back up pretty much everything. I'm not paying per gig, though.
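
Pointing restic at those REST destinations is a one-liner each; something like the following, with hostnames assumed:

```bash
# Back up to the local Pi and to the remote NAS over Tailscale;
# hostnames and ports are assumptions.
restic -r rest:http://backup-pi.local:8000/ backup ~
restic -r rest:http://nas.example.ts.net:8000/ backup ~
```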

[–] drwho@beehaw.org 2 points 1 month ago

All of my servers make local dumps of their databases and config files to directories owned by unprivileged users. This includes file paths, permissions, and ownerships (so I know how to put them back).

My primary research server at home uses rsync to pull copies of those local backups from my servers.

My primary research server uses Restic to make a daily incremental backup to Backblaze's B2 service.
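
The pull step in an arrangement like this might look as follows per server (hostnames and paths invented):

```bash
# Pull each server's local dump directory down to the research box.
# Run from cron or a systemd timer on the pulling machine.
rsync -a --delete backupuser@web1.example.com:/var/local-backups/ \
    /srv/pulled-backups/web1/
```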

[–] xlash123@sh.itjust.works 2 points 1 month ago

For my home server, I use restic and a cron job to take weekly snapshots of all my services. They then get synced to a Backblaze B2 bucket (at $6/TB/mo). It's pretty neat: it saves only the difference between the previous and current snapshots, removes older snapshots, and encrypts everything.
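
The weekly cadence can be a single crontab entry; the bucket name, schedule, and retention below are assumptions:

```bash
# m h dom mon dow: run Mondays at 03:00. B2 credentials are read
# from an environment file; the bucket name is an assumption.
0 3 * * 1  . /etc/restic.env && restic -r b2:mybucket:server backup /srv \
    && restic -r b2:mybucket:server forget --keep-weekly 8 --prune
```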

[–] nichtburningturtle@feddit.org 2 points 1 month ago

I have my important folders synced to my Nextcloud and create nightly snapshots of that to a different drive using borg.

One thing I still need to do is offsite encrypted backups using rsync.

[–] capital@lemmy.world 2 points 1 month ago* (last edited 1 month ago)

restic -> Wasabi, automated with a shell script and cron. It uses an include list to tell restic which paths to back up.

The script has Pushover credentials to send me backup alerts. It parses the restic log to tell me how much was backed up and removed, whether the backup succeeded or failed, and the current repo size.

To be added: a periodic restore of a random file, whose hash will be compared to that of the current version (the restore will run right after the backup, so the file is unlikely to have changed in my workload). The restored copy will then be deleted, and an alert sent letting me know how the restore test went.
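
A sketch of how that restore test could work, assuming restic's environment variables are already set and ~/documents as the sample pool (both assumptions):

```bash
#!/bin/sh
# Restore one random file from the latest snapshot, compare it to
# the live copy, then report via Pushover. Paths and the Pushover
# token/user are placeholders.
set -e
file=$(find "$HOME/documents" -type f | shuf -n 1)
tmp=$(mktemp -d)
restic restore latest --target "$tmp" --include "$file"
if cmp -s "$tmp$file" "$file"; then
    msg="restore test OK: $file"
else
    msg="restore test FAILED: $file"
fi
rm -rf "$tmp"
curl -s -F token=APP_TOKEN -F user=USER_KEY -F message="$msg" \
    https://api.pushover.net/1/messages.json
```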

[–] krakenfury@lemmy.sdf.org 2 points 1 month ago

I sync important files to S3 from a folder with awscli. Dotfiles and projects are in private git repos. That's it.
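
That sync is a one-liner with awscli (bucket name is an assumption):

```bash
# One-way sync of a local folder to S3; add --delete to also
# mirror removals. Bucket and prefix are assumptions.
aws s3 sync ~/important s3://my-backup-bucket/important
```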

If I maintained a server, I would do something more sophisticated, but installation is so dead simple these days that I could get a daily driver in working order very quickly.

[–] potentiallynotfelix@lemmy.fish 2 points 1 month ago

If I feel like it, I might use dd to clone my drive and put it on a hard drive. Usually I don't back up, though.
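
For anyone tempted to try it, the dd clone is a one-liner, but the device names below are examples; double-check them with lsblk first, since dd overwrites without asking:

```bash
# Clone the whole system disk onto a backup disk, block for block.
sudo dd if=/dev/nvme0n1 of=/dev/sdb bs=4M status=progress conv=fsync
```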

[–] gerdesj@lemmy.ml 2 points 1 month ago

You have loads of options, but you also need to start from "what if". Work out how important your data really is. Take another look and ask the kids and others if they give a toss. You might find that no one cares about your photo collection, in which case, if your phone dies, who cares? If you do care, then sync the photos to a PC or laptop.

Perhaps take a look at this: https://www.veeam.com/products/free/linux.html (it's free for a few systems).

[–] spacemanspiffy@lemmy.world 1 points 1 month ago

Dotfiles are handled by GNU Stow and git. I have this on all my devices.
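
The usual Stow pattern, for anyone unfamiliar (the repo layout here is the conventional one, not necessarily this commenter's):

```bash
# One directory per tool inside a dotfiles git repo, symlinked
# into $HOME. Package names are assumptions.
cd ~/dotfiles
stow -t ~ zsh git   # links zsh/.zshrc -> ~/.zshrc, etc.
```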

Projects live in git.

Media is periodically rsynced from my server to an external drive.

I've been meaning to put all my docker-compose files into git as well...

I don't back up too much else.

[–] clif@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

Internal RAID 1 as the first line of defense. Rsync to external drives, at least one of which is always offsite, as the second. Rclone to cloud storage for my most important data as the third.

Backups 2 and 3 are manual, but I have reminders set and do them about once a month. I don't accrue much new data that I can't easily replace, so that's fine for me.
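
The rclone leg of a scheme like this is typically a single command per run (remote name and paths are assumptions):

```bash
# Manual monthly run; "remote" must be configured via `rclone config`.
rclone sync ~/important remote:backup/important --progress
```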

[–] qwerty@discuss.tchncs.de 1 points 1 month ago

Pendrive for the important stuff, paper for the really important stuff and brain for everything else.

[–] TomBombadil@hexbear.net 1 points 1 month ago

My backup is begging my computer to implode so I can experience the sweet relief of getting offline.

But also, I use external disks and make copies of important files I can't recreate. I don't care too much about config, as I'm happy enough to distro hop and set things up anew.

[–] Peasley@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

I built a backup server out of my old desktop, running Ubuntu and ZFS.

I have a dataset for each of my computers, and I back them up to the corresponding datasets in the ZFS pool on the server semi-regularly. The pool has enough disks for some redundancy, so I can handle occasional drive failures. My other computers run arbitrary filesystems (ext4, btrfs, rarely NTFS).

The only problem with my current setup is that if there is file degradation on my workstation that I don't notice, the degraded file might get backed up to the server by mistake and overwrite a non-degraded backup. To avoid this, I generally don't overwrite files when I back up. Since 90% of my data is pictures, which don't change, that's not a big deal.

Someday I'd like to set up Proxmox and virtualize everything, and I'd also like to set up something offsite that I could zfs send to as a second backup.
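
For reference, the zfs send leg might eventually look like this (pool, dataset, snapshot, and host names invented):

```bash
# Snapshot, then replicate incrementally to an offsite machine.
zfs snapshot tank/workstation@2024-10-15
zfs send -i tank/workstation@2024-10-01 tank/workstation@2024-10-15 \
    | ssh offsite zfs receive -F backup/workstation
```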

Timeshift for configs, to a locally attached drive. Home partition to the cloud with rsync.
