Same idea here. I know I really ought to have physical backups, but an image backup would be larger than my external hard drive(s), and I haven't taken the time to look into anything else.
I'm in the same boat. I don't have a physical backup in the form of an external hard drive or a backup server, so this question made me stop and think for a second why I don't. But I really don't have anything I can't replace that isn't stored in the cloud.
That's what I use for backups of my business accounting software, but I'm seriously considering changing since losing some sales tax files from Drive. It never really dawned on me that three months' worth of records could just vanish, but they did. Thankfully I had paper copies and was able to recreate the spreadsheets, but it was quite a bit of work.
Bitbucket and GitHub for code
I'm assuming you use Bitbucket for free private repos and GitHub for free public repos? If so, I used to do the same, now I'm just using GitLab since it has free public and private repos.
btrbk. It uses btrfs' snapshot features to copy incremental backups to another location on a given schedule.
I use a Raspberry Pi 1 with a 1 TB USB HDD as a "backup server". It used to live off-site in a friend's basement; now I have to find a better place for it than my desk. Having my Nextcloud, PC, laptop, and backups all in one flat is not a very comfortable thought.
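For anyone wondering what that looks like in practice, a btrbk setup is mostly one config file plus a cron job or systemd timer running btrbk run. A rough sketch, assuming a btrfs pool mounted at /mnt/btr_pool and a Pi reachable over SSH as backup-pi (paths, hostname and retention values are all made up):

    # /etc/btrbk/btrbk.conf (illustrative only)
    snapshot_preserve_min   2d
    snapshot_preserve       14d
    target_preserve_min     no
    target_preserve         20d 10w *m

    volume /mnt/btr_pool
      snapshot_dir _btrbk_snap
      subvolume home
        target ssh://backup-pi/mnt/backup/btrbk

btrbk then snapshots the subvolume locally and uses btrfs send/receive to ship only the incremental differences to the target.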
The RPi idea is interesting! Right now I'm just using an HDD for backup and it isn't off-site.
Off-site backup once saved me in the aftermath of a very nasty theft. Things have changed since then and I ought to find a way to do it again.
Yeah, it's a nice and cheap solution for DIY off-site backups. The only problems I have with it:
it's slow as crap, especially with btrfs on the SD card; running pacman -Syu means 10-30 minutes of lock-up
it doesn't have any out-of-band management. If it dies or you shut it down, someone needs to unplug it and plug it back in.
As long as it's not lying around in some datacenter though, it's just fine.
This is what I use. I'm a professional video editor, so my backups are huge and I can't really use a cloud service for them. But I keep all the footage I have unless the contract specifically asks me to delete it after the job is finished. So I have loads of HDDs from over the years as the technology keeps changing.
Synology NAS with two HDDs in a RAID 1 configuration. I once helped spec a similar setup for work and went with a 12-HDD unit with RAID 6, which allowed for two simultaneous HDD failures before loss of data.
I'll second Synology. I have a 4-bay DS918+ with 2x8 TB and 2x2 TB, single redundancy. The 2x8 TB pair hosts all my raw media files and other housekeeping-type stuff, then the 2x2 TB is split between Time Machine backups, Surveillance Station backups, and DS Photo syncs. I also have certain files backed up to a 25 GB box.com account I got years ago when they were offering free pro accounts for a day (found in /r/freebies). There are other files backed up to my Google Drive as well.
I follow the 3-2-1 rule.
Three copies of anything I want backed up (KeePass database, encrypted archive of ~/)
Two of them on different media
One of the previous two stored off-site (a VPS, in my case).
Been doing that for over a decade, have yet to lose a single bit of data despite multiple full system/disk failures.
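For anyone wanting to copy this scheme, the off-site leg doesn't need dedicated software; an encrypted tarball pushed over SSH already satisfies it. A minimal sketch, assuming an external drive mounted at /mnt/external and a VPS reachable as backup@vps.example.com (both made up):

    out=/mnt/external/home-$(date +%F).tar.gz.gpg
    tar -czf - "$HOME" | gpg --symmetric --cipher-algo AES256 -o "$out"
    scp "$out" backup@vps.example.com:backups/

That gives the original in ~, a second copy on different media, and a third copy off-site.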
For file duplication I use Syncthing:
Syncthing is installed on all devices
Folders are synced between desktops, laptop, (android) phones
Some folders are "masters", meaning that they don't accept changes made by the other side
A "central" desktop keeps a copy of all important folders from each device
Syncthing is configured to keep deleted files for XXX days
Syncing only happens when devices are connected to my local network (using ethernet or wifi)
Now, this is not a proper backup. If a file is modified, it replaces the original file on the other side too, but if my phone goes for a swim, I have at least two copies in two different places.
I only have around 150GB of data that I consider to be really important. I make a monthly encrypted backup to an external drive and to Backblaze B2.
I already have two local (raid) copies of media content (again, using Syncthing), but from time to time I also do a manual backup to an external disk. I only have around 1TB of this stuff and I could always download everything again if something destroys my house.
I could improve this setup with real time backup by using something like Google Drive, Dropbox, etc, but for my usage it's not something I really need.
tl;dr:
Syncthing to sync content between devices;
Important stuff is backed up every month to an external drive and cloud;
Media content on two raid machines, manual backup to external drive every 3 months or so;
Not perfect, but I've been using this for 3-4 years and never had any issues.
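In case it's useful to anyone: the commenter doesn't say which tools they use, but a monthly "encrypt and push to the external drive and B2" job can be done with just gpg and rclone. A sketch, assuming an rclone remote already configured under the name b2, with made-up paths:

    archive=/mnt/external/important-$(date +%Y-%m).tar.gz.gpg
    tar -czf - ~/important | gpg --symmetric -o "$archive"
    rclone copy "$archive" b2:my-backup-bucket/monthly/

gpg --symmetric keeps it to a single passphrase, and the same archive ends up on the external drive and in the B2 bucket.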
I had a go using Syncthing a few years ago. I got it working but it wasn't straightforward and some stuff refused to sync. Keep meaning to give it another go. I guess it's much better now?
I used to use HDDs, but I was lazy and just backed everything up. Now I'm having to look back over it and it's the most frustrating thing imaginable.
I've now switched to using Google Drive. I save all the handmade files onto it, as well as images. I don't need a backup of a game or software installation, nor do I need a downloads folder that is full of just drivers and random crap. I also regularly delete stuff in Downloads and Documents, keeping things clean and obviously backing up the important stuff. I also have a Plex server where I'm dumping the TV shows and movies I've torrented and want to keep.
I don’t have (and avoid having) too many super important files. Most of my important code is on GitHub or GitLab, I have an external drive with terabytes of family photos, and I’ll upload the occasional important file to Google Drive (usually GPG encrypted if sensitive).
Laptop: Arq over SFTP to a server/NAS in my house. Backblaze, $5/m for unlimited storage.
Server/NAS: Duplicacy (the free and open source command line version — there's also a paid GUI) backing up to a Google Drive business account. $10/m for unlimited storage.
Arq is old-fashioned paid software. It's about $50 and works with a number of cloud providers, if you're into that. I like Arq, but I've had a couple of issues related to it being, I believe, a one-person operation. Once they announced that there was a major bug in the garbage collection process that might delete data that you still need. More recently I ran into a bug where canceling a restore midway through can leave partial files on disk which it won't try to overwrite the next time you back up. I'm considering Duplicacy as an alternative.
Duplicacy has been super solid, just uses a lot of memory during multi-TB backups. I've hit funny issues with Google Drive, too. One of my chunks got marked as malware and I had to patch Duplicacy to send the right API parameter to ignore that. Another time, one of my chunks just couldn't be downloaded and then suddenly could be a few days later.
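For reference, the free CLI version's day-to-day workflow is tiny. Roughly like this, with a made-up snapshot ID and Google Drive folder (gcd:// is Duplicacy's Google Drive storage prefix):

    cd /path/to/data
    duplicacy init my-nas gcd://duplicacy-backups   # one-time: tie this directory to the storage
    duplicacy backup -stats                         # incremental, deduplicated snapshot
    duplicacy prune -keep 7:30 -keep 1:7            # optional: thin out older snapshots

Treat the exact flags as a sketch rather than gospel; the retention numbers here are arbitrary.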
The poor security is definitely worrisome, but in my case it's a non-issue. My main goal for backing up files is to make sure family photos are backed up off-site somewhere. Although it wouldn't be good if the files were somehow compromised, there's really nothing there that would be problematic for me.
Financial documents and other sensitive documents are stored in separate encrypted containers on my machine for peace of mind. My assumption is that these types of files are just as much at risk on my computer as they would be somewhere else. For things that I want to have an extra layer of security, I use this method.
Does it work for me? Yes. Will it work for others? Maybe. Is it an excellent option? No, but it's good enough for my case. Is the price right? Very much.
Edit: By the way if you have any suggestions that are user friendly and cheap that are better than Backblaze's offering, I'm all ears.
I should clarify that the financial documents and other sensitive documents are not stored unencrypted on the hard drive, and the encrypted containers are what get backed up. I don't just keep them on my machine, but I also don't put my faith in Backblaze's security measures keeping them secure.
Even if there is a breach on their end and the encrypted containers get out somehow, at least there is a good layer of protection on them. On my machine I mount and access the files as needed but they aren't needed frequently so it ends up being a good system for me.
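The comments don't say which container format is involved, but for anyone on Linux who wants the same "only ciphertext ever leaves the machine" setup, a LUKS file container is one way to do it (size and paths made up):

    fallocate -l 2G ~/vault.img                  # file that will hold the container
    sudo cryptsetup luksFormat ~/vault.img       # encrypt it; asks for a passphrase
    sudo cryptsetup open ~/vault.img vault       # unlock when the files are needed
    sudo mkfs.ext4 /dev/mapper/vault             # first time only
    sudo mount /dev/mapper/vault /mnt/vault

The backup tool only ever sees ~/vault.img, so Backblaze (or any other service) just gets an opaque encrypted blob.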
I have a recovery disk of my Windows system in case it needs restoring, and I back up sensitive files on an external hard drive that I keep in a high-security vault (also known as "the bottom of an IKEA storage box filled with unused cables and other computer hardware"). Most everything I have on my devices I don't really care if I lose or not. Nothing is really unique or important enough for that to matter.
External HDDs primarily, though I also like to copy smaller documents on to memory sticks for easier access. I tend to have multiple local backups as they make the most sense to me.
I clone my laptop's drive to a little USB Samsung T2 often, and the laptop does a recurring backup to my Synology which syncs it to a Synology at my dad's house (as well as having its own external USB backups). I don't back my gaming desktop up because... meh.
The only things I'd be really pissed about losing are my photos and 1Password vault, and those have their own associated cloud services as well.
I have several servers in the cloud; I have a folder synced with one of them, and the rest of them just sync to each other. For that I use Syncthing. Someone could very well use Resilio instead; it is not open source but has an encrypted folders option, whereas with Syncthing you would need to encrypt files on disk a bit differently.
Syncthing coordinated via a server I rent (documents and business files encrypted), distributing certain files to all my machines and backing up phone pics etc
Backblaze (10TB or so) for the main desktop
Multiple HDDs locally for important files like photography, etc.
Very little of my data is meaningful to me. I keep a few local copies on every hard drive just because they're so small. I probably have some copies around on USB sticks or Google drive as well but it's not something that I regularly bother with.
Hell even if I lost all of my data and backups it really wouldn't cause me any issues, at most I lose a couple of funny pics that I look at once a year when I decide to poke around some old folders, and then forget about until next time.
Syncthing to distribute often accessed files over all devices where I need them in a pseudo backup fashion.
A local copy on a btrfs multi-device array.
restic in combination with rclone to dump everything else I need backed up, but don't need to access on many devices or regularly, onto mega.nz.
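restic can use rclone as a backend directly, so the mega.nz part can look as simple as this; it assumes an rclone remote already configured under the name mega, and the repo path is made up:

    export RESTIC_REPOSITORY=rclone:mega:restic-backups
    restic init              # once, to create the encrypted repository
    restic backup ~/archive  # deduplicated, encrypted snapshots
    restic snapshots         # see what's in there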
For important, mostly administrative, files I use Google Drive. Though I've been thinking of a non-cloud reliant solution I'm a little worried that the risk of failure of such a solution outweighs the risk of a cloud data breach.
Actually, the Pi has a watchdog timer. You can use that to reboot the Pi if it gets locked up for too long.
Huh, yeah, that's true. Gotta take a closer look at that when I've got time. An alternative would be to craft something out of a microcontroller with a network connection that just drives the Pi's GPIO and serial pins; then you'd even have console access.
I believe it's fully integrated into systemd nowadays, which could make it simpler if you run a distro with it. (I did it manually, and needed half a day to figure out all the little incompatibilities between the various versions of the Pi.)
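If anyone wants to try the systemd route: as far as I know it comes down to enabling the hardware watchdog in /etc/systemd/system.conf and rebooting, after which systemd keeps petting the Pi's watchdog and the board resets itself if userspace locks up. A sketch, with an arbitrary timeout:

    # /etc/systemd/system.conf
    RuntimeWatchdogSec=15

No extra daemon needed, assuming the Pi's watchdog driver (bcm2835_wdt) is loaded, which it usually is on stock Pi kernels.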
Squashfs. It's a filesystem, usually built into Linux (what I use every day), that is read-only and supports some pretty decent compression (usually about 36% of the original size for me, but experiences differ; it uses zlib under the covers).
The heart of my backup is basically:
My laptop compiles all my documents and important information into a squashfs file every ten minutes or so. Then, when I connect my laptop up to an external drive, those backups are immediately stored on it (edit: multiple backups, not a single one).
You can also mount your backup without fear of breaking it, so you can go fishing for old files.
Squashfs also allows you to test integrity, and bit-to-bit comparisons rather easily, to ensure you can still restore from backup.
Speaking of restores, it's a simple dd command (probably run from a live disc), and I have my drive back in known working order.
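For anyone wanting to copy this scheme, the moving parts are all stock squashfs-tools; the ten-minute trigger would just be a cron entry or systemd timer around the first command. A rough sketch with made-up paths:

    mksquashfs ~/Documents /backups/docs-$(date +%Y%m%d-%H%M).sqfs -comp gzip
    sudo mount -o loop,ro /backups/docs-20250101-1200.sqfs /mnt/old-docs   # fish for old files safely
    diff -r /mnt/old-docs ~/Documents                                      # bit-for-bit comparison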
Did you hear the drama over them a week or three ago?
Warrant canary disappeared, prices changed...
Lemme look it up in r/privacy.
EDIT: Hmm, it appears my information was outdated.
They didn't sign a canary; then, when they got called out on it, they said they were changing from warrant canaries to transparency reports; then, after getting pressured and having a thousand people on Twitter say they were cancelling their subscriptions, they signed a canary.
I only back up stuff in the Documents folder (I should back up my whole /home folder... anyway) to my desktop (with Nextcloud), which is always on at home, and sometimes to my HDD, but as I don't have anything to do that in the background, I sometimes forget :(
What I like about Nextcloud (I think Syncthing has the same) is file versioning: it stores something like 10 previous copies of a file, so if I mess up a file I'm working on and save it, I can recover the previous version.
Guessing from /home, you're on Linux... Maybe it's time to add an entry to cron?
Hmm, the Nextcloud client does its thing, but I only choose to sync /home/myself/Documents. Changing it is quite easy, but uploading all my downloads is a PITA when I'm on an ADSL connection (the server/desktop at home has a 100/100 fiber connection).
About syncing to the external HDD, my main worry is what happens if I need to suddenly disconnect the HDD in the middle of a synchronization.
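Picking up the cron suggestion above: a single crontab entry that only fires when the external drive is actually mounted covers the "I sometimes forget" part, and since rsync only re-copies what's missing or changed on the next run, yanking the drive mid-sync shouldn't leave you worse off than before (paths are made up):

    # crontab -e
    0 2 * * * [ -d /run/media/myself/backup ] && rsync -a --delete /home/myself/Documents/ /run/media/myself/backup/Documents/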
Dropbox and a RAID setup. Also multiple copies of files on different computers.
I don't, unfortunately. Even if I could afford a backup service, my bandwidth is shit and uploading would cause literally all other web traffic to grind to a halt until the backup finished. Even physical storage would be okay, but then I would have to rely on my lazy, forgetful ass to actually plug the thing in and run a backup every now and then.
An out-of-date backup is better than no backup though. You could keep bigger files like photos on an external HDD and use a cloud service like Dropbox for important documents (those typically take up very little space anyway). If you care about any of your files at all, consider creating a backup.
Yeah, get a WD drive or something with included software and just leave it plugged in, or plug it in every weekend and have it run a backup.
Right now Google Drive but I've been wanting to switch to a NAS backup.
Syncthing ftw.
On my Mac I'm using Backblaze, which I've found to be a pretty good service. I'd love to use it on my PC, if only they had a Linux client :(