I am using duplicati and thinking of switching to Borg. What do you use and why?

[–] furrowsofar@beehaw.org 2 points 1 year ago (1 children)

I am old school. I just use GNU Tar with the pax format and multiple detachable encrypted external hard drives. The reason is that it is simple: a well-known, very common tool with a standard archive format.

[–] GnomeComedy@beehaw.org 2 points 1 year ago* (last edited 1 year ago) (1 children)

I'm curious - how much data are you backing up with that method, and how frequently are you doing your backups? It doesn't sound like it would scale well, but I'm also wondering if maybe this is perfect and I've just been overthinking it.

[–] furrowsofar@beehaw.org 3 points 1 year ago (1 children)

There is no size limit. A lot of these other methods actually use GNU Tar behind the scenes anyway. More than that, GNU Tar has been used for decades for this purpose. Pull out any Unix book from two decades ago and you will see "tar", "cpio", and "dump/restore" as the way. The newer tool out there is pax, and in fact GNU Tar supports the newer "pax" format. Moreover, GNU Tar with the pax format can back up almost the full disk structure, including hard links, ACLs, and extended attributes, which a lot of tools do not do.

It is still useful to archive some things at a lower level, like your partition table and boot blocks, of course. You also have to decide what run level (such as rescue) you want to archive in, and/or what services you should stop or dump separately from the file system, depending on your system. Databases and things like eCryptfs take some special thought (though they do for any tool). It is also good to do test restores to verify your disaster plan.
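As a rough sketch of what such a create command can look like (assuming a GNU Tar built with ACL and xattr support, and a hypothetical external drive mounted at /mnt/backup):

```bash
# Sketch only: full-system archive in pax format onto an external drive.
# /mnt/backup is an assumed mount point; adjust paths for a real system.
# --acls/--xattrs need a tar built with that support; hard links are
# preserved by tar automatically. --one-file-system keeps tar out of
# /proc, /sys, and other mounted filesystems.
sudo tar --create \
    --format=pax \
    --acls --xattrs \
    --one-file-system \
    --exclude=/mnt/backup \
    --file="/mnt/backup/system-$(date +%F).tar" \
    /
```

A trial restore is the mirror image (tar --extract --acls --xattrs as root into a scratch directory), which doubles as the test of the disaster plan mentioned above.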

I use tar on many systems. My workstation is about 1TB of data. A backup takes about 11 hours, though I think it could be faster if I disabled compression (I currently use the standard gzip compression, which is not optimal). I think the process is CPU-bound by the compression at the moment. Going uncompressed or using parallel gzip at level 2 is probably the fastest you can do and should speed things up by 4X or more. I have played with this some for my wife, and her raw backup is a lot faster now. My wife uses USB 3 external drives specifically plugged into USB 3 ports (the ones with the SS symbol and the blue interior) with a USB 3 rated cable. I use 6TB bare SATA drives I insert into a hot-swap enclosure and store in storage boxes. My backup system can theoretically do incrementals too, but it has some issues since I moved to BTRFS, so I do not use that at the moment. I always did before. I have an idea how to fix it, but I need to debug and test incrementals first.
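The parallel-gzip idea can be sketched with pigz on the compressing side of a pipe; this assumes the pigz package is installed and is only an illustration of the speed-up being described:

```bash
# Sketch: write the archive to stdout and compress with parallel gzip
# at level 2, trading a little compression ratio for much higher
# throughput on a multicore machine. Assumes pigz is installed and
# that /mnt/backup is writable by the invoking user.
sudo tar --create --format=pax --acls --xattrs --one-file-system \
    --file=- / | pigz -2 > "/mnt/backup/system-$(date +%F).tar.gz"
```

pigz produces ordinary gzip output, so the result is still readable with plain tar -xzf anywhere.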

How often: I back up monthly. When my incrementals were working I used to do it weekly, or whenever I got nervous. Another option for the BTRFS file systems would be to use their native backup tools. I am not sure though; I like to use generic stuff. There is a lot to be said for generic.
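For reference, GNU Tar's incremental mode hangs off a snapshot file via --listed-incremental. The sketch below is generic (paths are hypothetical), and the --no-check-device note is only a guess at the kind of BTRFS issue being described; device numbers that change between mounts are a documented cause of spurious full re-dumps:

```bash
# Sketch of GNU Tar incrementals. The snapshot (.snar) file records
# state between runs; a first run against a fresh .snar file produces
# a full (level 0) backup.
sudo tar --create --format=pax --acls --xattrs --one-file-system \
    --listed-incremental=/var/lib/backup/home.snar \
    --file="/mnt/backup/home-$(date +%F).tar" \
    /home
# Re-running with the same .snar file archives only changes since the
# previous run. If device numbers are unstable across mounts, adding
# --no-check-device keeps tar from treating everything as changed.
```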

Big downside of tar is the mind-numbing man page. Getting the options correct takes some real thought. You also have to be comfortable with the shell and Bash scripting. Big upside: you can customize exactly what you want.
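That pain is usually tamed by freezing the options into a small script once; here is a hypothetical minimal wrapper, with names and paths invented for illustration:

```bash
#!/usr/bin/env bash
# Hypothetical wrapper: get the tar options right once, reuse forever.
# Run as root; DEST is an assumed mount point for the external drive.
set -euo pipefail

DEST="/mnt/backup"
ARCHIVE="$DEST/system-$(date +%F).tar.gz"

tar --create --format=pax --acls --xattrs --one-file-system \
    --exclude="$DEST" \
    --file=- / | pigz -2 > "$ARCHIVE"

# Cheap sanity check: make sure the archive reads back cleanly.
pigz -dc "$ARCHIVE" | tar --list --file=- > /dev/null
```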

[–] davefischer@beehaw.org 2 points 1 year ago (1 children)

tar dates all the way back to the 70s.

[–] furrowsofar@beehaw.org 3 points 1 year ago (1 children)

Yes, I actually did not know how far back, thanks. Wikipedia seems to say 1979. I know my system admin book dated 1992 talks about it, and it was common then. I think my brother used to use it in the early 1980s for his job, and maybe I did too a few times. Wikipedia says GNU Tar is newer and traces back to 1987. The formats have changed some and there are several. The pax format is much newer; I think it was standardized in 2001, but GNU Tar would have taken time to implement it, and I do not know that date.

People seem to forget that tar worked well back then and still does.

[–] davefischer@beehaw.org 2 points 1 year ago (1 children)

I had the chance to play with late 70s Unix for a bit a few years ago. (Hardware on loan from a museum.) VERY minimal, but still recognizable. (Well, my Unix reflexes are old - I started in the mid 80s.)

[–] furrowsofar@beehaw.org 3 points 1 year ago (1 children)

Interesting. About then I was using a VAX. Somehow I spent most of my time on other stuff until I switched to Linux around 2000.

[–] davefischer@beehaw.org 2 points 1 year ago (1 children)

My first Unix was 4.3BSD on a VAX-11/750. (There was another 11/750 running VMS, but I didn't like that nearly as much.)

[–] furrowsofar@beehaw.org 2 points 1 year ago (1 children)

Yes, VMS. That was what I was using. Unix I did use for something a few times. The university had one of those mini-supercomputers that were a thing for a while.

[–] davefischer@beehaw.org 1 points 1 year ago (2 children)

Oooh - what mini super? Something weird, or just a small vector machine? That was an interesting niche...

[–] furrowsofar@beehaw.org 1 points 1 year ago

Not really sure what it was. Maybe a small vector machine, or maybe a small cluster of them. I frankly do not remember much about it. I had kind of forgotten about VMS until you reminded me. It was a time of much change in the 80s. I started on an IBM 370, then a departmental VAX, then a Xerox Star system (the word-processing workstation the Mac was based on), then we moved to Macs and workstations. I had a MicroVAX I used, and Macs. All very expensive stuff. Personally I had a video terminal at home at the start of the 80s, then a Commodore 64, then in the late 80s I bought a Mac at a huge price, which I used until about 1997. There were some other systems kicking around during that time too that are hard to remember. One was something from Honeywell at a company I worked for during summers.

[–] furrowsofar@beehaw.org 1 points 1 year ago

Actually fun reminiscing a little. I have not thought about this stuff in decades. One thing I always thought was kind of fun: when I started college, terminals were just coming in for students and there were not enough of them. Huge lines. Me, I would go over to the row of empty card punches, punch up a deck for my assignment, walk over to the window, give it to the operator, and have it read. Then I would get in line for a terminal, which by then was often shorter, log in, do any editing and debugging, and run and print my assignment in like 30 minutes. Not sure why others did not do this. It just seemed like the way to go.