Can you please share your backup strategies for Linux? I’m curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

  • phoenixz@lemmy.ca · 6 days ago

    Main drive is a 1TB super fast M.2 device; backup drive is an 8TB platter drive with btrfs.

    A bunch of scripts I wrote myself copy all the important stuff to the platter drive every night using rsync, then make a snapshot with the current date. Since it’s all copy-on-write, I have daily backups going back about 3 years now. Some extra scripts clean up the older backups, lowering the backup frequency to once a week after a year and once every 4 weeks after 2 years.

    I have similar solutions for my servers, where I rsync the backups over the Internet.

  • BastingChemina@slrpnk.net · 7 days ago

    I have a Synology NAS with all my documents and family photos. I’m using the Synology Drive app on Linux and Synology Photos on Android.

    All of that is backed up to Backblaze.

  • mvirts@lemmy.world · 7 days ago

    Shout out to all the homies with nothing, I’m still waiting to buy a larger disk in hopes of rescuing as much data from a failing 3TB disk as I can. I got some read errors and unplugged it about 3 months ago.

  • Kongar@lemmy.dbzer0.com · 8 days ago

    Synology NAS. I really love that thing. I use their Synology Drive software to back up the Linux home folder, as well as Windows PCs, iPads, iPhones, etc. I use their Photos mobile software to automatically back up phone photos and videos. I also synchronize a few select folders between PCs so certain in-use files are always up to date. I set the NAS to keep 30 old versions of every file. This works great for my college kids: dad has a copy of everything in case they nuke a paper or something (which has happened).

    I stopped cloning drives long ago. Now I just reinstall the OS and packages. With Linux, this is honestly faster than deploying a backup: a single pacman command installs everything I want. Then I just log into things as I open them. Yeah, I might have to futz around with some settings or redownload some big games on Steam, but the eye candy and games can wait; I can be productive pretty quickly after an install.
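    On Arch, that “single pacman command” reinstall is usually done with a saved package list; a sketch (the filename is arbitrary):

```shell
# Before a reinstall: save the list of explicitly installed packages.
pacman -Qqe > pkglist.txt

# After the fresh install: feed the list back in. "-" reads package
# names from stdin; --needed skips anything already installed.
sudo pacman -S --needed - < pkglist.txt
```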

    I DO use btrfs with automatic snapshots (snapper and Btrfs Assistant). This saves me from myself when I bork an update (which I’ve done more than once). If I make a mistake, I just roll back a snapshot and try again without my stupid mistakes. This has saved my install 3 or 4 times now.
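    The rollback flow with snapper looks roughly like this (a sketch; “root” is the usual config name, and the snapshot number is whatever `list` shows for the pre-update state):

```shell
# Find the snapshot taken before the borked update.
snapper -c root list

# Roll the root filesystem back to that snapshot (42 is an example
# number); snapper creates a new writable subvolume from it and sets
# it as the default, which takes effect on the next boot.
sudo snapper -c root rollback 42
reboot
```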

    Lastly, I sneakernet an external hard drive to my office. On it is a manual backup of the NAS, which I do once per month. This protects against catastrophic failures like my house burning down. I might lose a month or so of pictures in the worst-case scenario, but I’d still have my 25+ years of pictures of my kids, wedding videos, etc.

    In the end, the only thing that really matters is not losing my lifetime of family pictures and the good memories they provoke.

    • ddh@lemmy.sdf.org · 7 days ago

      Synology NAS here also, divided into private (family stuff, docker volumes etc) and public (Linux ISOs and anything that can be redownloaded). Both get backed up weekly to an older NAS with Hyper Backup. Private additionally goes onto a LUKS encrypted drive monthly which is spot-checked, taken offsite, and the previous offsite drive brought back. I don’t back up any PC (don’t care, just reinstall) or phones (they are backed up on iCloud).

  • JubilantJaguar@lemmy.world · 8 days ago

    Here’s one that probably nobody else here is doing. The backup goes on my mobile device. Yes, the thing in my pocket.

    • Mount it over SSHFS on the local network
    • Unlock a LUKS container in the form of a 30GB sparse file on the device
    • rsync the files across
    • Lock, unmount

    The backup is incremental but the container file never changes size, no matter what’s in it. Your data is in two places and always under your physical control. But the key is never stored on the remote device, so you could also do this with a VPS.
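    A minimal sketch of the steps above, assuming the device is reachable as `phone` over SSH; the mount points, container path, size, and mapper name are all made up:

```shell
# Mount the phone's storage over SSHFS on the local network.
sshfs phone:/storage /mnt/phone

# One-time setup: a 30GB sparse file (occupies no space until written)
# formatted as a LUKS container:
#   truncate -s 30G /mnt/phone/backup.img
#   cryptsetup luksFormat /mnt/phone/backup.img

# Unlock the container (recent cryptsetup attaches a loop device
# automatically when pointed at a regular file) and mount it.
cryptsetup open /mnt/phone/backup.img pocketbackup
mount /dev/mapper/pocketbackup /mnt/backup

# Incremental sync; only changed files cross the network.
rsync -a --delete "$HOME/important/" /mnt/backup/

# Lock and unmount. The key never touches the remote device.
umount /mnt/backup
cryptsetup close pocketbackup
fusermount -u /mnt/phone
```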

    Highly recommended.

  • sloppy_diffuser@sh.itjust.works · 8 days ago

    I use immutable NixOS installs. Everything needed to redeploy my OS is tracked in git, including most app configurations. The one exception is some GUI apps I’d have to set up manually on reinstall.

    I have a persistence volume for things like:

    • Rollbacks
    • Personal files
    • Git repos
    • Logs
    • Caches / Games

    I have 30 days (or a minimum of the last 5) of system rollbacks using BTRFS volumes.

    The personal files are backed up hourly to a local server, which then backs up nightly to Backblaze B2 using rclone with an encrypted volume and my private keys. The local server has a mishmash of drives in a mirrored LVM setup. While it works well for mixed drives, I’ll warn that I haven’t had a drive failure yet, so I’m not sure how difficult replacing a drive will be.
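    One common way to get client-side encryption with rclone and B2 (a sketch; the remote and bucket names here are invented, not the author’s actual settings) is a `crypt` remote layered over a plain B2 remote:

```shell
# rclone config sections, created interactively with `rclone config`:
#
#   [b2]
#   type = b2
#   account = ...
#   key = ...
#
#   [b2-encrypted]
#   type = crypt
#   remote = b2:my-backup-bucket
#   password = ...
#
# Everything written through "b2-encrypted:" is encrypted locally
# before upload. The nightly job is then a single sync:
rclone sync /srv/backups b2-encrypted: --transfers 8 --fast-list
```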

    My phone uses the same flow with RoundSync (rclone + GUI).

    Git repos are backed up in git.

    Logs aren’t backed up. I just persist them for debugging and don’t want them lost after every reboot.

    Caches/games are persisted but not backed up. NixOS uses symlinks and BTRFS to be immutable; that paradigm doesn’t work well for this case. The one exception is that a couple of game folders are part of my personal files: my WoW plugin folder, EVE Online layouts, etc.

    I used to use Dropbox (with rclone to encrypt). It was $20/mo for 2TB, which is cheaper on paper, but I don’t back up nearly that much. Backblaze started at $1/mo for what I use; I’m now up to $2/mo. It will be a few years before I need to clean up my backups for cost reasons.

    The local server is a PC in a case with 8 drive bays plus some NVMe drives for fast storage. It has a couple of older drives, and for the last couple of years I’ve typically bought a pair of drives on sale (Black Friday, Prime Day, etc.). I have a little over 30TB mirrored, so slightly over 60TB in total. NVMe is not counted in that: one NVMe drive is for the system, and the others are a caching layer (Monero node) or temp storage (transcoding, as it is also my media server).

    I like the case, but if I were to do it again, I’d probably get a rack mountable case.

    • ikidd@lemmy.world · 8 days ago (edited)

      You seem pretty organized in your strategy. I would suggest you just pull a drive in your LVM to check how that goes for you. I’ve had issues in JBOD-style LVM volumes with drive swaps, but YMMV.

      Frankly, I use ZFS now in anything where I would have used LVM before. The feature set is way more robust. Also, offsite ZFS replication to zfs.rent is a good backup of a backup. But Backblaze is pretty solid too.

      • sloppy_diffuser@sh.itjust.works · 8 days ago

        Good call on a simulated failure. When I first set it up, it came down to LVM/BTRFS or ZFS as my top choices. It was a coin toss at the time because I hadn’t built this sort of setup before.

  • SavvyWolf@pawb.social · 8 days ago

    Firstly, for my dotfiles, I use home-manager. I keep the config on my git server and in theory I can pull it down and set up a system the way I like it.

    In terms of backups, I use Pika to back up my home directory to my hard disk every day, so I can, in theory, pull back files I delete.

    I also push a core selection of my files to my server using Pika, just in case my house burns down. Likewise, I pull backups from my server to my desktop (again with Pika) in case Linode starts messing me about.

    I also have a 2TiB SSD I keep in a strongbox, and some cloud storage which I push bigger things to sporadically.

    I also take occasional data exports from online services I use. Because hey, Google or Discord can ban you at any time for no reason. :P

  • neo [he/him]@hexbear.net · 8 days ago

    Pika Backup for /home/ to an external drive. It’s an automatic solution with a simple GUI that serves as a front end to Borg, IIRC. It lets you easily browse and mount old backups. Anything outside of my actual personal files can be recreated or restored trivially, so I don’t care to back it up.

    I also have a manual dump of /etc/, but I change it so infrequently that it doesn’t really need looking after.

  • ikidd@lemmy.world · 8 days ago

    I keep everything on Nextcloud and back that up via Proxmox Backup Server.

    Nuking and paving, then reconfiguring Plasma and installing the NC client, takes me less time than bothering to back anything up directly.

  • hallettj@leminal.space · 8 days ago

    When I researched this previously, I concluded that there are two very good options for regular backups: Borg and Restic. Both are especially efficient because, after the initial run, they back up only what has changed since the last backup. So you get snapshots of your filesystem state at each backup point without using a huge amount of space, and you can mount any snapshot as a virtual directory. After the initial backup, incremental backups take a minute or two.

    I use Borg, and I back up to cloud storage on Borgbase. I use Vorta as a GUI for Borg. I have Vorta start automatically when I start my window manager, and I have it set up for daily backups. I set up the same thing on my kid’s computer.

    I back up my home directory. I have some excluded directories like ~/.cache, and Steam’s data directory. I use Baobab to find large directories that I don’t want backed up.

    I use the “exclude caches” option in Borg’s “create archive” settings. That automatically excludes Rust target/ directories because they follow the Cache Directory Tagging Specification. Not all programming languages’ tooling follows that spec, so I also use directory-name pattern excludes. For example, I have an exclude pattern for .*/node_modules/.*
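    Put together, the equivalent Borg invocation might look like this (a sketch; the repository URL and archive name are examples, not the author’s actual settings):

```shell
# Daily archive of $HOME with cache and node_modules excludes.
# --exclude-caches skips any directory containing a CACHEDIR.TAG file
# (e.g. Rust target/ dirs); the re: prefix marks a regex pattern.
borg create \
  --exclude-caches \
  --exclude 're:.*/node_modules/.*' \
  --exclude "$HOME/.cache" \
  ssh://user@borgbase.example/repo::'home-{now:%Y-%m-%d}' \
  "$HOME"

# Any archive can later be mounted as a virtual directory:
#   borg mount ssh://user@borgbase.example/repo::home-2024-01-01 /mnt/restore
```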

    I use NixOS, and I keep my system config in a git repo so I don’t need backups for anything outside my home directory.

  • Minty95@lemm.ee · 8 days ago

    Timeshift for the system; it works perfectly. If you screw up the system (a bad update, for instance), just start it and you’ll be back up and running in less than ten minutes. Simple cron backups for data, documents, etc., just in case you delete a folder, document, or image.

  • smeg@feddit.uk · 8 days ago

    The important stuff is in cloud storage using Cryptomator (I’m hoping that rclone will make sync simple). I should probably set up Timeshift in case things do go wrong.

      • smeg@feddit.uk · 8 days ago

        Ooh, that’s interesting to know! Though I do make use of Cryptomator on my phone too; is rclone on Android in a usable state?

  • Random Dent@lemmy.ml · 8 days ago

    Currently I use Borg Backup with Vorta as a GUI. I don’t really do anything automated/scheduled; I just back it up manually to an external SSD every few days or so. I pretty much do my whole /home folder, except for a couple of subfolders that aren’t really necessary (and Videos, which I back up separately).

    I do eventually want to upgrade to a NAS, but I’m waiting until we move to start setting that up. Also, I don’t really have an off-site plan yet, which I know is bad, but I need to figure that out.