this post was submitted on 07 Apr 2024

Linux


Hi! A friend just recommended the backup tool that comes with Ubuntu. I took a look at it and was wondering what you guys include in and exclude from your backups. I just installed the WireGuard VPN and put the config file in the /etc/wireguard folder, where it belongs. I would have to include this folder as well if I want to keep my configs. And I guess many programs do the same, so how do you know what to include, so you can just revert to the last backup if something breaks or you get a new machine? Maybe that's a stupid question, but it has been going through my head for some time now. Thanks a lot!

top 20 comments
[–] ChojinDSL@discuss.tchncs.de 4 points 7 months ago (2 children)

If you don't know, or aren't sure, back up everything if you have the space. Once you've hit a couple of disaster scenarios, it will become apparent what stuff is really important.

Obviously, the stuff you can't recreate otherwise is most important. But apart from that, even the stuff you can recreate from other sources might be worth backing up because of time savings. E.g. faster to restore from backup than to recreate.

[–] kevincox@lemmy.ml 2 points 7 months ago

Yup. Step 1 is backup everything. Step 2 is maybe improve your reproducibility and then remove the things that can be reproduced from the backups.

[–] pbjamm 1 points 7 months ago

Also, while it may be fairly easy to recreate the OS/application install from scratch, that's generally small potatoes storage-wise compared to your music/movies/photos etc. that you for sure want to back up.

[–] Penguincoder 3 points 7 months ago (1 children)

An OS can be restored. Back up your data: /home for sure, and maybe any custom configs in /etc, like your WireGuard configs. So, anything you specifically edited or added in the /etc directory.

[–] everett@lemmy.ml 5 points 7 months ago

Skipping the OS backup is reasonable, but you probably want to at least save a package list. Add something like dpkg -l > ~/packages.txt to your backup script.
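A restorable variant of that idea, assuming a Debian/Ubuntu system (the file name is just an example): dpkg -l gives a human-readable list, while --get-selections pairs with --set-selections for replaying the list on a fresh install.

```shell
# Save the package selection list alongside the backup.
dpkg --get-selections > "$HOME/packages.txt"

# On a fresh install, replay it:
#   sudo dpkg --set-selections < packages.txt
#   sudo apt-get dselect-upgrade
```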

[–] limelight79@lemm.ee 2 points 7 months ago (1 children)

Data and configurations.

If you have the space, backing up software is nice because it's easier to get the system going again, but the data (your files: music, documents, pictures) and system configuration files (/etc for example) are the most critical. If you have databases set up, learn about their dump commands and add those to your backups.
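For example, the usual dump commands look roughly like this (database names and backup paths are assumptions):

```shell
# PostgreSQL: custom-format dump, restorable with pg_restore
pg_dump -Fc mydb > /mnt/backup/mydb.dump

# MySQL/MariaDB: consistent dump without locking InnoDB tables
mysqldump --single-transaction mydb > /mnt/backup/mydb.sql

# SQLite: safe online copy of the database file
sqlite3 /var/lib/app/app.db ".backup /mnt/backup/app.db"
```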

You don't have to use the same method for everything. My pictures are backed up to another drive in a second computer and to Amazon Glacier for $2/month (I'll have to pay to download them if I ever need to, but I'll gladly pay if I'm in that situation; those should only be needed if I have a major house fire or something like that). My weekly backups cover my /home directories, /etc, /root, a database dump, and maybe one or two other important things.

[–] kevincox@lemmy.ml 3 points 7 months ago (1 children)

Really, configuration is best not backed up but recreated from some source of truth like a Git repo. But a backup can serve as a poor man's version control.
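A minimal sketch of that idea, demonstrated on a scratch directory so it runs unprivileged; in practice you'd point it at /etc as root (on Debian/Ubuntu, etckeeper automates exactly this). The file name and contents here are made up.

```shell
# Keep configs in a plain git repo as the source of truth.
CFG=$(mktemp -d)
echo "PersistentKeepalive = 25" > "$CFG/wg0.conf"
git -C "$CFG" init -q
git -C "$CFG" add .
git -C "$CFG" -c user.email=me@example.com -c user.name=me \
    commit -qm "baseline configs"
# After any config change: git add -A && git commit
```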

[–] limelight79@lemm.ee 2 points 7 months ago* (last edited 7 months ago)

An interesting idea, but it might be overkill for a home setup.

[–] avidamoeba@lemmy.ca 2 points 7 months ago* (last edited 7 months ago) (2 children)

If you want to be able to restore the machine completely, with everything installed and configured, then yes, you have to back up everything. There are generally two ways: file-level backup, where you'd use something like rsync, tar, etc., and block-level, where you'd back up the whole partition/disk using something like dd, Clonezilla, etc. The latter is the easiest to restore, but it's a bit of a pain to back up because the system generally has to be offline, booted from an alternative OS. The former is a bit more difficult to restore, but not by much, and it's much easier to back up: you can do it while the system is live. I'd probably try that first. Find documentation on backing up a complete root filesystem with rsync/tar and you're good to go. It's typically a single command which can be run on a schedule.
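The file-level approach boils down to something like this sketch, with the usual excludes for pseudo-filesystems (the destination path is an assumption):

```shell
# Live file-level backup of the root filesystem.
# -a archive mode, -A ACLs, -X extended attributes, -H hard links.
sudo rsync -aAXH \
  --exclude=/dev/*  --exclude=/proc/* --exclude=/sys/* \
  --exclude=/tmp/*  --exclude=/run/*  --exclude=/mnt/* \
  --exclude=/media/* --exclude=/lost+found \
  / /mnt/backup/rootfs/
```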

The built-in GUI backup tool is generally intended for your own user data. To back up other things, it would have to run as root or be given the right capabilities, and that might get more complicated than using straight rsync/tar.

[–] lemmyvore@feddit.nl 2 points 7 months ago (1 children)

You can use Borg for both things you mentioned. It stores deduplicated chunks, so it doesn't care whether you back up files or a block device.

Not sure why you'd have to be offline to do that though.

[–] avidamoeba@lemmy.ca 1 points 7 months ago* (last edited 7 months ago)

Because if you're not offline, something is writing to the filesystem and changing blocks while you're copying. If you're lucky, what you copied is merely outdated. If you're less lucky, it causes fs inconsistency that gets cleaned up by fsck. If you're even less lucky, you end up with silently corrupted files, e.g. a text file with old parts mixed with new. And if you're unluckiest, you hit a vital metadata part of the fs and it can no longer be mounted.

To clarify, the filesystem being block-copied has to be offline or mounted RO, not the whole OS. However if that's the root/home filesystem, then you can't unmount it while the OS is online.

If you don't want to deal with that you need a filesystem or volume manager that supports snapshots, then you can copy the snapshot. E.g. snapshot your root LVM vol, then block-copy the snapshot.
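Sketched with assumed volume group, snapshot size, and destination names:

```shell
# Snapshot the root LV, block-copy the frozen snapshot, then drop it.
sudo lvcreate --size 5G --snapshot --name root-snap /dev/vg0/root
sudo dd if=/dev/vg0/root-snap of=/mnt/backup/root.img bs=4M status=progress
sudo lvremove -y /dev/vg0/root-snap
```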

[–] WbrJr@lemmy.ml 1 points 7 months ago (2 children)

Something I'm always wondering about: setting up Linux until everything runs without problems takes quite some time for me. I've used Linux regularly for about a year and have had to set it up about 4-5 times. It's almost always a pain, and I need to search online for a while until everything works. Does it get easier the more often you do it? Or do you create a setup script that configures everything when you reinstall the system?

[–] avidamoeba@lemmy.ca 1 points 7 months ago* (last edited 7 months ago)

I use config-as-code for some stuff, but in reality there are many manual steps that aren't covered. That's why I run an LVM mirror (RAID1) with two SSDs and keep a full backup. The system hasn't been reinstalled in 10 years.

If you feel the way you do, you should probably just do a full disk backup with clonezilla or dd every X days and be done with it. If X is large, e.g. months, you should also run home dir backup more often. The Ubuntu built-in tool is great for that. Then when something dies, restore the whole OS from the clonezilla/dd backup, boot, then restore the most recent home dir backup, reboot, and you're back. Minimal effort.
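The dd variant of that full-disk backup, with assumed device and destination names; run it from a live USB so the disk isn't mounted while it's copied:

```shell
# Offline full-disk image; restore by swapping if= and of=.
sudo dd if=/dev/sda of=/mnt/backup/disk.img bs=4M conv=fsync status=progress
```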

[–] AnokLola@mastodon.social 1 points 7 months ago (1 children)

@avidamoeba @WbrJr Just install a pre-configured distro like Mint or Fedora and stay away from Arch

[–] WbrJr@lemmy.ml 1 points 7 months ago

I started my journey with Fedora, but got annoyed by things like videos not working. Ubuntu works pretty well for me, and I've had very few issues with it compared to Fedora. And that's what I look for in an OS.

[–] GadgeteerZA@fedia.io 1 points 7 months ago

@WbrJr@lemmy.ml I'm on Manjaro Linux, but the principles are the same. I have an SSD boot drive and a 4TB hard drive for /home data etc. I also have a second 4TB drive for backups:

  1. Timeshift app - takes snapshots of the OS to the backup drive. I keep 4 hourly snapshots, 2 daily ones, and one weekly one. This allows easy rollback from any updates or upgrades that went wrong.
  2. luckyBackup app - does a full daily rsync backup of /home data and configs. There are other rsync apps too, and you can opt for versioned backups if you have space, but usually I've been fine recovering anything I deleted or overwrote by mistake. I do this more for hard drive failure. I also have one additional 1TB drive I keep in a safe; I connect it once a month or so for an offline backup.
[–] Tick_Dracy@lemm.ee 1 points 7 months ago* (last edited 7 months ago) (1 children)

Hijacking this topic: I use this software on Windows, which does incremental backups of the system (the OS alongside documents, downloads, etc.). It can also easily be restored by booting a custom image from a USB and restoring the created image.

Is there anything like this with Linux?

[–] gigatexal@mastodon.social 1 points 7 months ago

@Tick_Dracy @WbrJr ZFS snapshots and boot environments could probably do this. Not sure about the usb thing though. @allanjude (tagging Allan so I don’t besmirch ZFS too much).
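For what it's worth, the ZFS side of that is roughly this (pool/dataset names are assumptions):

```shell
# Take a named snapshot of the root dataset before an upgrade,
# roll back to it if things go wrong, and list what exists.
sudo zfs snapshot rpool/ROOT/ubuntu@pre-upgrade
sudo zfs rollback rpool/ROOT/ubuntu@pre-upgrade
sudo zfs list -t snapshot
```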

[–] cmnybo@discuss.tchncs.de 1 points 7 months ago

I take a btrfs snapshot of my root partition daily so I can easily revert to an older version if I break something or get a bad update. There's nothing on my desktop or laptop root partition that can't be easily replaced, so I don't bother with any backups apart from the snapshots.
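A daily root snapshot like that can be a one-liner (the /.snapshots location is an assumption):

```shell
# Read-only, dated snapshot of the root subvolume.
sudo btrfs subvolume snapshot -r / "/.snapshots/root-$(date +%F)"
# List existing subvolumes/snapshots:
sudo btrfs subvolume list /
```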

On my server, I keep multiple backups of /etc/ since there is a lot of stuff in there that I manually set up.

If you just want to back up the configuration, you can back up the entire /etc/ directory; it will only take a few MB when compressed.
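As a runnable sketch, demonstrated on a scratch directory standing in for /etc (reading the real one needs root):

```shell
# In practice: sudo tar -czf etc-backup.tar.gz -C / etc
ETC=$(mktemp -d)
mkdir -p "$ETC/wireguard"
echo "[Interface]" > "$ETC/wireguard/wg0.conf"
# -C makes the archive paths relative to the directory.
tar -czf "$ETC.tar.gz" -C "$ETC" .
```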