Can you please share your backup strategies for Linux? I'm curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

[–] seaQueue@lemmy.world 3 points 2 weeks ago* (last edited 2 weeks ago)

I leverage btrfs or ZFS snapshots. I take rolling system-level snapshots on a schedule (daily, weekly, monthly, and separately before any package upgrades or installs) and user-data snapshots every couple of hours. Then I use btrbk to sync those snapshots to an external drive at least once a week. When I have all of my networking gear and home services set up, I also sync all of this to storage on my NAS. Any hosts on the network keep rolling snapshots stored on the NAS as well.

Important data also gets shoveled into a B2 bucket and/or Google drive if I need to be able to access it from a phone.

I keep snapshots small by splitting data up into well-defined subvolumes; anything that can be reacquired from the cloud (downloads, package caches, Steam libraries, movies, music, etc.) isn't included in the backup strategy. If I download something that's hard to find or important, I move it out of downloads and into a location that is covered by my backups.
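A rough sketch of the snapshot-and-sync step, plus one way to do the B2 shove (subvolume layout, btrbk config, and rclone remote are all illustrative):

```
# take a read-only snapshot of the home subvolume
sudo btrfs subvolume snapshot -r /home /.snapshots/home-$(date +%F)

# sync snapshots to the external drive per /etc/btrbk/btrbk.conf
sudo btrbk run

# ship important data to the B2 bucket (rclone is one option for this part)
rclone sync ~/important b2remote:my-backup-bucket/important
```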

[–] vortexal@lemmy.ml 2 points 2 weeks ago (1 children)

The only thing I use as a backup is a live CD image written to a USB thumb drive.

I used to use Timeshift, but the one time I needed it, it didn't work for some reason. It also made my PC temporarily unusable while it was creating a backup, so I didn't re-enable it when I had to reinstall Linux Mint.

[–] Teppichbrand@feddit.org 2 points 2 weeks ago

Same, Timeshift let me down the one time I needed it. I still use it though, and I'm afraid to upgrade Mint because I don't want to set up my system again if the upgrade fails to keep my configuration and Timeshift fails to take me back.

[–] drwho@beehaw.org 2 points 2 weeks ago

All of my servers make local dumps of their databases and config files to directories owned by unprivileged users. This includes file paths, permissions, and ownerships (so I know how to put them back).

My primary research server at home uses rsync to pull copies of those local backups from my servers.

My primary research server uses Restic to make a daily incremental backup to Backblaze's B2 service.
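A minimal sketch of that pipeline (database, hostnames, and repo names illustrative; pg_dump stands in for whatever dump tool each server actually uses):

```
# on each server: dump the database into an unprivileged user's directory
pg_dump mydb | gzip > /home/backup/dumps/mydb-$(date +%F).sql.gz

# on the research server: pull the local backups over rsync
rsync -a --delete backup@server1:/home/backup/ /srv/backups/server1/

# daily incremental push to Backblaze B2
# (B2 credentials and RESTIC_PASSWORD assumed in the environment)
restic -r b2:my-bucket:research backup /srv/backups
```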

[–] krakenfury@lemmy.sdf.org 2 points 2 weeks ago

I sync important files to S3 from a folder with awscli. Dotfiles and projects are in private git repos. That's it.
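Roughly (bucket name illustrative):

```
# one-way sync of the important folder to S3
aws s3 sync ~/important s3://my-backup-bucket/important
```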

If I maintained a server, I would do something more sophisticated, but installation is so dead simple these days that I could get a daily driver in working order very quickly.

[–] xlash123@sh.itjust.works 2 points 2 weeks ago

For my home server, I use Restic and a cron job to take weekly snapshots of all my services. They then get synced to a Backblaze B2 bucket (at $6/TB/mo). It's pretty neat: it only saves the difference between the previous and current snapshot, removes older snapshots, and encrypts everything.
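A sketch of the weekly job (paths, bucket, and retention illustrative; B2 credentials and RESTIC_PASSWORD assumed in the environment):

```
# /etc/cron.d/backup -- every Sunday at 03:00
0 3 * * 0  root  /usr/local/bin/backup.sh

# backup.sh: incremental snapshot to B2, then prune old ones
restic -r b2:my-bucket:services backup /srv/services
restic -r b2:my-bucket:services forget --keep-weekly 8 --keep-monthly 6 --prune
```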

[–] nichtburningturtle@feddit.org 2 points 2 weeks ago

I have my important folders synced to my Nextcloud and create nightly snapshots of that to a different drive using borg.

One thing I still need to do is offsite encrypted backups using rsync.
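The nightly job looks roughly like this (repo path and retention illustrative):

```
# snapshot the synced Nextcloud folder to the second drive
borg create --stats /mnt/backupdrive/borg::nextcloud-{now:%Y-%m-%d} ~/Nextcloud

# keep a week of dailies and a month of weeklies
borg prune --keep-daily 7 --keep-weekly 4 /mnt/backupdrive/borg
```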

[–] GustavoM@lemmy.world 2 points 2 weeks ago

.dotfiles on github

Big/critical files on an external HD

simple as

[–] potentiallynotfelix@lemmy.fish 2 points 2 weeks ago

If I feel like it, I might use dd to clone my drive and put it on a hard drive. Usually I don't back up, though.
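Something like this, for the record (device names illustrative; double-check them with lsblk first, since dd will happily overwrite the wrong disk):

```
sudo dd if=/dev/nvme0n1 of=/mnt/external/disk.img bs=4M status=progress conv=fsync
```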

[–] savvywolf@pawb.social 2 points 2 weeks ago

Firstly, for my dotfiles, I use home-manager. I keep the config on my git server, and in theory I can pull it down and set up a system the way I like it.
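In practice that's something like this (repo URL and flake output illustrative; assumes a flake-based setup):

```
git clone https://git.example.com/me/dotfiles ~/dotfiles
home-manager switch --flake ~/dotfiles#myuser
```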

In terms of backups, I use Pika to back up my home directory to my hard disk every day, so I can, in theory, pull back files I delete.

I also push a core selection of my files to my server using Pika, just in case my house burns down. Likewise, I pull backups from my server to my desktop (again with Pika) in case Linode starts messing me about.

I also have a 2 TiB SSD I keep in a strongbox, and some cloud storage which I push bigger things to sporadically.

I also take occasional data exports from online services I use. Because hey, Google or Discord can ban you at any time for no reason. :P

[–] traches@sh.itjust.works 2 points 2 weeks ago* (last edited 2 weeks ago)

Software & Services:

Destinations:

  • Local Raspberry Pi with an external HDD, running a restic REST server (see the sketch below)
  • RAID 1 NAS at my parents' house, connected via Tailscale, also running restic REST
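A minimal sketch of the REST server end (path and port illustrative):

```
# on the Pi / NAS: serve a restic repository over HTTP
rest-server --path /srv/restic-repo --listen :8000
```

Clients then back up with restic -r rest:http://backup-host:8000/.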

I've been meaning to set up a drive rotation for the local backup so I always have one offline in case of ransomware, but I haven't gotten to it.

Edit: For the backup set I back up pretty much everything. I'm not paying per gig, though.

[–] shadowtofu@discuss.tchncs.de 2 points 2 weeks ago

I use Syncthing to sync almost everything across my computer, laptop (occasional usage), server (RAID 1), old laptop (powered up once every month or so), and a few other devices (which only get a small subset of my data, though). On the computer, laptop, and server, I have btrfs snapshots (snapper). Overall, this works very well; I always have 4+ copies of my data in 2+ geographical locations.
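The snapper side is just this (config name illustrative):

```
# register /home with snapper; snapshots then run on its timer
sudo snapper -c home create-config /home
sudo snapper -c home create --description "manual checkpoint"
```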

[–] Minty95@lemm.ee 2 points 2 weeks ago* (last edited 2 weeks ago)

Timeshift for the system; it works perfectly. If you screw up the system (a bad update, for instance), just start it and you'll be back up and running in less than ten minutes. Simple cron backups for data, documents, etc., just in case you delete a folder, document, or image. Both of these go to a second internal HD.
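For example (paths illustrative; Timeshift's own schedule is normally set in its GUI):

```
# on-demand Timeshift snapshot before a risky update
sudo timeshift --create --comments "before update"

# crontab entry: nightly copy of documents to the second internal HD
0 2 * * * rsync -a ~/Documents /mnt/second-hd/backups/
```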

[–] MonkderVierte@lemmy.ml 2 points 2 weeks ago* (last edited 2 weeks ago)

Constant work in progress.

[–] capital@lemmy.world 2 points 2 weeks ago* (last edited 2 weeks ago)

restic -> Wasabi, automated with a shell script and cron. It uses an include list to tell it which paths to back up.

The script has Pushover credentials to send me backup alerts. It parses the restic log to tell me how much was backed up, what was removed, the success/failure of the backup, and the current repo size.

To be added: a periodic restore of a random file, whose hash will be compared to the current version of the file (it will happen right after the backup, so it's unlikely to have changed in my workload). The restored file will then be deleted, and an alert sent letting me know how the restore test went.
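The shape of the script, minus the log parsing (bucket and Pushover keys illustrative; restic's S3 credentials and RESTIC_PASSWORD assumed in the environment):

```
#!/bin/sh
# Wasabi is S3-compatible, so restic talks to it via the s3 backend
restic -r s3:https://s3.wasabisys.com/my-backups backup --files-from /etc/restic/includes.txt

# alert via Pushover with the exit status of the backup
curl -s -F "token=APP_TOKEN" -F "user=USER_KEY" \
     -F "message=backup finished with status $?" \
     https://api.pushover.net/1/messages.json
```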

[–] gerdesj@lemmy.ml 2 points 2 weeks ago

You have loads of options, but you also need to start from ... "what if". Work out how important your data really is. Take another look and ask the kids and others if they give a toss. You might find that no one cares about your photo collection, in which case, if your phone dies, who cares? If you do care, then sync them to a PC or laptop.

Perhaps take a look at this - https://www.veeam.com/products/free/linux.html - it's free for a few systems.

[–] spacemanspiffy@lemmy.world 1 points 2 weeks ago

Dotfiles are handled by GNU Stow and git. I have this on all my devices.
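For anyone curious, the stow part is a one-liner (package names illustrative):

```
# each tool's dotfiles live in their own subdirectory; stow symlinks them into $HOME
cd ~/dotfiles && stow vim zsh git
```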

Projects live in git.

Media is periodically rsynced from my server to an external drive.

Been meaning to put all my docker-composes into git as well...

I don't back up too much else.

Timeshift for configs to a locally attached drive. Home partition to cloud with rsync.

[–] qwerty@discuss.tchncs.de 1 points 2 weeks ago

Pendrive for the important stuff, paper for the really important stuff and brain for everything else.

[–] fmstrat@lemmy.nowsci.com 1 points 2 weeks ago

All important files go in /data.

/data is ZFS, snapshotted and sent to the NAS regularly.
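Roughly (pool/dataset names and dates illustrative):

```
# snapshot /data, then send the increment to the NAS
sudo zfs snapshot tank/data@2024-10-15
sudo zfs send -i tank/data@2024-10-14 tank/data@2024-10-15 | ssh nas sudo zfs recv backup/data
```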

Every time I change a setting, it gets added to a dconf script. Every time I install software, I write a script.
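E.g. (key and value illustrative):

```
# recorded in my settings script after changing the GNOME theme
dconf write /org/gnome/desktop/interface/color-scheme "'prefer-dark'"
```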

Dotfiles git repo for home directory.

With that, I can spin up a fresh machine in minutes with scripts.

[–] clif@lemmy.world 1 points 2 weeks ago* (last edited 2 weeks ago)

Internal RAID 1 as the first line of defense. Rsync to external drives, where at least one is always offsite, as the second. Rclone to cloud storage for my most important data as the third.

Backups 2 and 3 are manual, but I have reminders set and do them about once a month. I don't accrue much new data that I can't easily replace, so that's fine for me.
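The third line is basically this (remote name illustrative):

```
rclone sync /data/important remote:important-backup --progress
```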

[–] Veraxis@lemmy.world 1 points 2 weeks ago

For system files/configuration on my machines, Timeshift is set to run once a week.

For family photos and shared files, I built a pair of SFTP servers from old HP thin-client PCs at two different geographic locations; they automatically sync to each other once a day via a cron job using vsftpd and lftp. Each one has both an NVMe and a SATA SSD running in a software RAID 1 configuration.
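The daily sync job is roughly this (hosts and paths illustrative; key-based auth assumed):

```
# push anything newer to the other site's vsftpd over SFTP
lftp -u syncuser, sftp://other-site -e "mirror -R --only-newer /srv/photos /srv/photos; quit"
```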

For any other files, there's a second local server, also using vsftpd, with two SSDs in USB enclosures. I back them up manually using rsync on an irregular basis.
