Discussion Backup Strategy and Hardware (Which Storage Media Do YOU Support?)

MalVeauX

Senior member
Dec 19, 2008
653
176
116
Hey all,

I think we all consider redundancy and all that for our data to an extent. But now it's 2020. It's a great year to really stop and think about backup solutions. Actual backup solutions, not just minor redundancy, but true archival-level backup solutions that could survive a 2020 catastrophe. For example, I'm in Florida, with two potential hurricanes bearing down on my coastline at the same time. This is a great year to take a hard look at my current backup strategy and question what it could actually survive.

So what's your backup solution(s)?

Hard drive? Second hard drive? Third hard drive? (Taking the multiple-physical-copies approach, and refreshing every 3+ years?)
Redundancy across several hard drives? Multiple locations (including off-site, in a fireproof, waterproof container)? (Plus refreshing every 3+ years?)
Optical media (such as M-disc class engraving)?
Flash/SSD options for short term and travel? (Refreshing within the year?)
Cloud storage (if so, how much capacity, and are you storing RAW files or just compressed display files)?
Tape? Does this still exist for today's capacity needs?

How often do you refresh your data?
Do you test your backups?

Do you think your current system will survive a physical catastrophe (fire, flood)?
And if you had to retrieve data from your off-site backup (if you have one), will you get back ALL your data? Or will you only get back some? And how long will it take to get it all back (if cloud based)?

Very best,
 
  • Like
Reactions: VirtualLarry

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
This thread has potential. I may add my thoughts in a bit, however, that may take a while.

Edit: OK, I have multiple NAS units set up, and I generally upgrade to larger capacity (more bays, larger drives) as I can afford it, and then just de-commission the older NAS servers as a sort of "legacy backup". I also make "cold backups" to external HDDs of the important stuff on the NAS unit(s).

As far as disaster-recovery precautions, well, I don't have anything THAT important on my NAS, really, maybe a couple of things, but I would not want those in cloud storage either. I guess I could mail some DVDs or BD-Rs to people that I know, for safe-keeping. Haven't gotten that far, though. Mostly just old computer code that I've written in the past, which would be useful for a resume portfolio or nostalgia. (Most of it is for DOS real mode, which wouldn't work on modern PCs anymore anyways.)
 
Last edited:
  • Like
Reactions: MalVeauX

MalVeauX

Senior member
Dec 19, 2008
653
176
116
Heya,

I'm thinking up front it's important to distinguish between simply having uptime hardware and solutions versus backups at the archival level. I don't think all data deserves archival-class treatment, only data that will need to be available in the long run. So archival-class media, a true backup solution, is likely rather different for most people compared to what they use simply to keep good uptime on their active data. For example, some software today may not be at all useful to have a backup of in 10 years, but your family photos or archived photos for your research (think microscopy, astro, etc.) may be something you wish to survive 20~30 years, or more, without constant refreshing onto newer media every few years. Or insert any example of data you think is important enough to keep long term and that might actually need to be accessed by someone one day.

I also think anything that is a true archival-class backup needs to be so simple that nearly anyone can recover it (unless it's sensitive, of course). Meaning, I think the medium and file system should be relevant and common enough that future hardware can handle it. For example, today's hardware can still commonly read a CD-R from the late 1990's with your JPGs and TXT files on it, no problem, despite drives changing from IDE to SATA and going through several revisions. The optical platform never went away, even though its popularity shifted to different media (DVD-R and BD-R, etc.). So extrapolate another 20~30 years from DVD-R and BD-R to whatever the future holds; I would assume optical drives will still be relevant, even if for nothing more than archival purposes.

So for current active data sets, I'm thinking modern HDD or SSD in a system with redundancy is the first line of defense for uptime. This could be in the same system, or two separate systems: a living set of drives in your primary machine that is also copied onto a separate system, such as an external drive enclosure or network attached storage, with redundancy of its own (mirroring, or for some, various parity schemes). This allows the lowest cost-per-capacity drives to be used at modern speeds, because redundancy gives you that little bit of room to have a fault or failure and still have a living data set at your immediate disposal. Some may use RAID. Others may favor the ZFS file system, or something else entirely. I think the less exclusive the hardware needed, the better, and a file system that does its own data checking and validation (like ZFS) is also good. The argument for HDD vs SSD is mostly about cost and capacity: the HDD is still the cheapest per capacity but has a shorter lifespan and needs recycling more often; the SSD has a higher cost per capacity but lasts longer in a working environment and needs less frequent recycling, though it ultimately still needs to be refreshed eventually (10 years being a likely maximum).

The second set of data, separate from the primary active data above, is likely still not a backup, but rather something that supports uptime via redundancy in the form of separate physical copies on separate media. Some do this on flash or solid state media, but that is only good for short-term, low-capacity needs. Some do it with HDD or SSD in hot or cold storage depending on their setup, and that is still only good for the short term. The most common high-capacity option is an external HDD accessed via eSATA, USB or network. The most common low-capacity option is the flash memory sticks everyone has laying around. And then there are the massive-capacity needs with redundancy as another option, such as a second NAS with mirroring or parity built in, giving generous support of uptime by having two completely separate systems with the same data, each with its own independent redundancy. While this is still not a backup, it is a nominal uptime strategy for someone at home at least, and still affordable enough to do.

Everyone gets to that point, I think. Figuring out if external HDDs are OK for your needs. Or if some flash sticks handle most of your small needs. Or maybe a little bit of cloud use for stuff that's not high capacity and/or super sensitive. For some, the living data is not redundant in the primary system, and the external HDD or NAS is the secondary set of data, with or without redundancy, which still makes a redundant setup overall. It's at the third level that we start to enter the idea of a true backup or archival medium and system: greater redundancy, not just uptime. I.e., can it survive 20 years in a vault somewhere and still be usable on future hardware, in containers and file systems that will not require specialists or VMs just to access them?

I don't think HDD, SSD (at least not yet), flash, or cloud services really do this justice, even with redundancy. Everything has a compromise in some way. So it's important to figure out if your data is worth the cost, compromise and complexity, and what system will work for your needs.

For example, when I think of a backup, I think of the following:

Will the medium itself survive 20~30 years in cold storage?
Will there be hardware to access the medium in 20~30 years, or more?
Could this system be refreshed and migrated to new media in 20+ years?
Will the file system and/or containers be universally accessible in 20~30 years without legacy hardware/software?
Can someone other than yourself retrieve the data without advanced knowledge of all of this hardware/software (i.e., if you stroke out, can your family get the data)?
Will encryption be needed, and is it worth risking total loss if you become unavailable or the key is lost in 20~30 years?
Will the medium survive common risks (flash house fire, water damage, crush damage, etc.)?
Will the system be subject to ransom online (where you really need it encrypted to avoid host snooping; and will that service even survive 20~30 years anyway)?

For anecdotal argument, here's my experience with this:

I have no surviving portable media like floppies or Zip disks, etc., from the 90's.
I have no surviving HDDs from the 90's (IDE/SCSI connections, etc.; yet notice the power connector is still available today, over 30 years later!).
I have no flash media surviving from the early 2000's (though USB has retained compatibility for its entire existence).
All the data from the 90's that I've kept has migrated over time to new hard drives (I physically moved it to new drives).
I have CD-Rs I burned in the late 90's that are still accessible, and today's modern hardware and OS read them fine without anything legacy!
I have DVD-Rs I burned in the late 90's and early 2000's that are still accessible, and again today's hardware reads them without anything legacy!

So I'm thinking, from an archival-class, true-backup point of view, optical is likely the way to go for my purposes: backing up data that I want accessible by a third party in 20~30 years on common hardware, without any special software, in common containers and file systems. And today that's likely BD-R centered in some way.

The idea being: I back up important data, like images (RAW, JPG, etc.), research images, plain text documents of critical information, legal stuff in PDF or TXT or DOC, onto optical media with a simple structure (UDF, ISO, FAT-based or EXT-based, etc.), using folder names and file names that don't need metadata just to know what you're looking at. That gets tucked into a safe that can survive a flash fire and some flooding, or even stored off-site to increase redundancy. Then in 20~30 years, when I stroke out or have a heart attack, my kids or grandkids, or another relative or even a friend, can be handed a box of optical discs, and without even knowing what they are, they can look at them, figure out it's an optical disc, insert it into some common reader of their day, and it simply spins up and works. Just like I can access a CD-R from the 90's today on common modern hardware, 30 years later.
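For instance, here's a rough sketch of the disc-mastering step I have in mind (assuming a Linux box with genisoimage available; the paths, volume label and checksum pass are just placeholders for illustration, not a recommendation of specific tools):

Code:
# Sketch: build a simple, metadata-free archive image for BD-R burning.
# Assumes genisoimage is installed; paths and labels below are placeholders.
import hashlib
import pathlib
import subprocess

SOURCE = pathlib.Path("/data/archive/photos_2020")   # plain folders/files, nothing exotic
IMAGE = pathlib.Path("/data/staging/photos_2020.iso")

def write_checksums(root: pathlib.Path) -> None:
    """Drop a SHA256SUMS file alongside the data so a future reader can verify it."""
    lines = []
    for f in sorted(root.rglob("*")):
        if f.is_file() and f.name != "SHA256SUMS":
            digest = hashlib.sha256(f.read_bytes()).hexdigest()
            lines.append(f"{digest}  {f.relative_to(root)}")
    (root / "SHA256SUMS").write_text("\n".join(lines) + "\n")

def build_image(root: pathlib.Path, image: pathlib.Path) -> None:
    """Master a plain ISO9660 + UDF image with long filenames preserved."""
    subprocess.run(
        ["genisoimage", "-o", str(image), "-R", "-J", "-udf",
         "-V", "PHOTOS_2020", str(root)],
        check=True,
    )

write_checksums(SOURCE)
build_image(SOURCE, IMAGE)
# Burn the .iso afterwards with whatever burner you already use.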

Thoughts?

Alternatives?

How would you approach this concept?

Very best,
 
Last edited:
  • Like
Reactions: VirtualLarry

Muadib

Lifer
May 30, 2000
17,914
838
126
You can still use tape if you want. Look up LTO tape drives. Let me warn you first: it's not cheap.

I have about 500GB of data that I deem critical. I have it backed up to my two NAS units and to my Google Drive. I have a backup external drive that backs up my newest NAS. Both that drive and the Google Drive data are encrypted. I have stuff that I don't call critical but would hate to lose. That's around 100GB of pics & music, which is also stored on both my NAS units. I was looking into tape, but I'm still stunned by the cost. I may still look into an older LTO drive, as it would go well with my NAS unit since they use SAS, and they make a SAS card for my NAS.
 
  • Like
Reactions: MalVeauX

MalVeauX

Senior member
Dec 19, 2008
653
176
116
Muadib said:
You can still use tape if you want. Look up LTO tape drives. Let me warn you first: it's not cheap.

I have about 500GB of data that I deem critical. I have it backed up to my two NAS units and to my Google Drive. I have a backup external drive that backs up my newest NAS. Both that drive and the Google Drive data are encrypted. I have stuff that I don't call critical but would hate to lose. That's around 100GB of pics & music, which is also stored on both my NAS units. I was looking into tape, but I'm still stunned by the cost. I may still look into an older LTO drive, as it would go well with my NAS unit since they use SAS, and they make a SAS card for my NAS.

Thanks; I was just reading about LTO tape actually. Definitely not an inexpensive solution for an individual at home, much more of a business-level approach just from a cost perspective. It also needs fairly strict climate control to achieve any sort of longevity, which is still good for its capacity (not much else achieves those capacities without being short-lived), but the medium itself is not very durable relative to the environment (it's not happy when it's really hot and humid, for example). Plus the hardware to read/write. Plus storage. Plus, you have to have the knowledge to operate the equipment and load/unload the cartridges. Not something a family member would likely be thrilled to attempt if you keel over and they inherit the data (images and music, for example).

I think between two individual NAS's, each with its own redundancy, data loss is greatly minimized, but that requires constant upkeep and refreshing over the years. Archival-level backup with media like tape and optical seems to be the go-to for big industry: tape being a lot more scalable, with significant associated cost; optical being a lot more realistic for someone without an enterprise budget.

For example, 500GB of critical data could be stored on optical media for less than $60 USD, with a $65~100 writer, without requiring strict climate conditions, and it physically takes up little space. This is the sort of thing I'm always considering. My critical data is not high capacity. My living data is a little over 12TB at this moment, but the only data in that entire capacity that is critical to never lose is less than 1TB. So the cost of rather robust backups in the form of optical media is achievable without hitting quad digits.
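Roughly, the math I'm doing in my head (disc capacity and prices here are ballpark assumptions, not quotes):

Code:
# Ballpark cost of putting ~500GB of critical data on BD-R.
# Capacities and prices are assumptions for illustration only.
import math

data_gb = 500
disc_gb = 25           # standard single-layer BD-R
price_per_disc = 1.50  # typical spindle pricing, give or take

discs = math.ceil(data_gb / disc_gb)
print(discs, "discs, about $", discs * price_per_disc)
# -> 20 discs, roughly $30 in media, before the one-time cost of a writer.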

I wonder how well steel tape would hold up to heat and humidity if the vault they're stored in experienced a flash fire or was opened up via a storm with rain (hurricane, etc).

Very best,
 

killster1

Banned
Mar 15, 2007
6,208
475
126
wow i didnt read all of your reply :) butttttttttttttttttt, what safe do you have? mine withstands flood and fire (at least 6 feet of flooding and 60 mins of fire?). i burn to bluray and offline hard drives kept off site, and also burn to bluray and a wd red 8tb kept in my safe at the house. i would like to also incorporate online cloud storage, but what to choose to put on there and how to encrypt it first, veracrypt or a simple 7zip? used to have two nas that would backup to each other at different houses, but i do it all by hand now instead of switching drives after they filled up.
 
  • Like
Reactions: MalVeauX

MalVeauX

Senior member
Dec 19, 2008
653
176
116
killster1 said:
wow i didnt read all of your reply :) butttttttttttttttttt, what safe do you have? mine withstands flood and fire (at least 6 feet of flooding and 60 mins of fire?). i burn to bluray and offline hard drives kept off site, and also burn to bluray and a wd red 8tb kept in my safe at the house. i would like to also incorporate online cloud storage, but what to choose to put on there and how to encrypt it first, veracrypt or a simple 7zip? used to have two nas that would backup to each other at different houses, but i do it all by hand now instead of switching drives after they filled up.

Interesting. So you employ BD-R, cold-storage HDD and potentially cloud? Uptime should be good, and it has a level of redundancy. I'm curious though: how often do you refresh that 8TB HDD? Do you simply replace it every few years with a new one and re-write the data to it? How much capacity do you think you'd need in a cloud?

I'm interested in learning more about common safes. Not some enterprise-class thing or huge heavy thing that will survive a volcano or something. Mostly just: can a basic, affordable home safe be reliable enough for the media inside to survive a flash fire (maybe not a 6-hour total burn-down, but a flash fire that lasts less than an hour), which is way more common than total meltdowns where the whole house burns to the ground? Or, realistic to my area, seasonal hurricanes that rip the roof open, or a tree falls and you get water damage for a few days, so high humidity, temperature fluctuation, etc. That destroys media more often than a little fire might, and it's a threat every year, realistically, to me on the coast of Florida.

I was just looking at some real world tests on some common, inexpensive safes. This one was interesting:


Throws it in the pool, builds a fire around it and lets it go all the way. Destroys the safe, but he still opens it up and the papers are fine. I'm not sure how optical media would behave, as it takes a lot of heat to get paper to ignite (480F or so). I'm not sure what media would survive that and not melt or become unreadable, in terms of optical or tape. The M-disc material supposedly has a melting point somewhere between 200C and 1000C; 200C is 392F, so if the paper didn't ignite, maybe the optical media could survive. I need to find real-world tests to see what will allow optical disc media to survive a fire in a safe, basically. Water damage is easy, as the media will survive as long as it's in a case. But fire is a different matter, or indirectly the heat associated with it, as prolonged heat will eventually take out everything in the safe.

Very best,
 

killster1

Banned
Mar 15, 2007
6,208
475
126
not every bit of data is backed up the same for me, but the important stuff is. really i dont know what i would decide as far as online cloud.

well the 8tb already has a dupe of it offsite and also 2x bluray backups of it. i dont refresh it or even use it. it very well could be dead the first time i try to use it.


i have one of these in both houses. yea i know someone could cut through it most likely pretty easy but to pry the door off would be a task. and cutting through it would most likely stir the neighbors, along with the alarm etc.
 
  • Like
Reactions: MalVeauX

mikeford

Diamond Member
Jan 27, 2001
5,666
157
106
What do you all think about creating some kind of generic, PAR-type recovery file? The idea being limited to loss of some reasonable number of data sectors; then store the recovery data in several ways and places. If it's something like 10% of the original files, "the cloud" even seems somewhat reasonable to me. RAID modes seem like too much overhead and not enough flexibility in storing the recovery data.

Keep a few same-model, same-brand, same-version drives to allow for circuit-board swaps if that is the failure.

Make sure all your users know what to do and not do when a problem arises with data.

Put your NAS units on a UPS and enable auto shutdown before the UPS runs out of juice.
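If you're on a DIY box rather than a NAS with this built in, something like this rough sketch is the idea (assuming a NUT setup where the upsc tool reports UPS status; the UPS name, polling interval and shutdown command are placeholders):

Code:
# Sketch: poll a NUT-managed UPS and shut the box down when the battery gets low.
# Assumes the NUT client tools are installed and "myups@localhost" is a configured
# UPS name -- both are placeholders. Most commercial NAS units do this natively.
import subprocess
import time

UPS = "myups@localhost"

def ups_status() -> str:
    out = subprocess.run(["upsc", UPS, "ups.status"],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()   # e.g. "OL" on line power, "OB LB" on battery + low battery

while True:
    status = ups_status()
    if "OB" in status and "LB" in status:   # on battery AND low battery
        subprocess.run(["shutdown", "-h", "now"], check=False)
        break
    time.sleep(30)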
 
  • Like
Reactions: MalVeauX

mikeford

Diamond Member
Jan 27, 2001
5,666
157
106
Watch some YouTube videos of people cutting a safe in half with a circular saw (aka a Skilsaw) using a diamond blade: less than 5 minutes, and not all that much noise.
 
  • Like
Reactions: MalVeauX

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
mikeford said:
What do you all think about creating some kind of generic, PAR-type recovery file?
Most definitely a good idea. I've done that with one of my backups of an old computer. I normally back up to 4GB-sized files, so I created a PAR2 recovery set of an appropriate size (like 10-20% PAR2 chunks? I forget what the default is). Then I back up a certain number of 4GB files to BD-R, along with a certain number of the PAR2 chunks, and spread them out so that losing one disc might still allow recovery of the rest. You should probably calculate that all out, and not "wing it".
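For what it's worth, a rough sketch of that step with par2cmdline (the redundancy figure, paths and the per-disc math are just example values to illustrate the calculation, not my exact settings):

Code:
# Sketch: create PAR2 recovery data for a set of 4GB backup slices before
# burning them to BD-R. Assumes par2cmdline is installed; the redundancy
# percentage and paths are example values only.
import glob
import subprocess

SLICES = sorted(glob.glob("/backups/old-pc/backup.*.img"))  # the 4GB pieces
REDUNDANCY = 10   # percent of recovery data relative to the source

# One single-layer BD-R holds ~25GB, i.e. about six 4GB slices per disc. If a
# whole disc is lost, the recovery data must cover at least that many slices,
# so in practice you want redundancy >= (slices per disc / total slices) * 100,
# and the recovery files themselves spread across different discs.
subprocess.run(
    ["par2", "create", f"-r{REDUNDANCY}", "/backups/old-pc/backup.par2"] + SLICES,
    check=True,
)
# Later: "par2 verify backup.par2" checks integrity, "par2 repair" rebuilds.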
 
  • Like
Reactions: MalVeauX

killster1

Banned
Mar 15, 2007
6,208
475
126
mikeford said:
Watch some YouTube videos of people cutting a safe in half with a circular saw (aka a Skilsaw) using a diamond blade: less than 5 minutes, and not all that much noise.
well there are alarms, doubly backed up, so they would have to break into both houses at once, and it sure sounds noisy to me. maybe you can link the video? the one i saw was super loud. not that it matters how loud; most people are not going to sit around for 5 mins sawing through safes to get my family photos when there are big screens and pcs everywhere to steal.

but i do agree, PAR files are a super cool method i didnt really consider, since i thought you had to rar the files first. guess you can par regular files.
 
  • Like
Reactions: MalVeauX

MalVeauX

Senior member
Dec 19, 2008
653
176
116
killster1 said:
not every bit of data is backed up the same for me, but the important stuff is. really i dont know what i would decide as far as online cloud.

well the 8tb already has a dupe of it offsite and also 2x bluray backups of it. i dont refresh it or even use it. it very well could be dead the first time i try to use it.

i have one of these in both houses. yea i know someone could cut through it most likely pretty easy but to pry the door off would be a task. and cutting through it would most likely stir the neighbors, along with the alarm etc.

That's true, it's hard to gauge a backup if it's not tested. So, as you pointed out, it could be dead. Your BD-Rs are likely fine though.

I'm not too terribly worried about physical theft of the data via stealing a safe. My data would be useless to them. And if that's my backup (archival-level media), then I still have at least two physical copies of all that stuff, so it's only the inconvenience and anxiety of theft and loss; the data itself is not lost.

mikeford said:
What do you all think about creating some kind of generic, PAR-type recovery file? The idea being limited to loss of some reasonable number of data sectors; then store the recovery data in several ways and places. If it's something like 10% of the original files, "the cloud" even seems somewhat reasonable to me. RAID modes seem like too much overhead and not enough flexibility in storing the recovery data.

Keep a few same-model, same-brand, same-version drives to allow for circuit-board swaps if that is the failure.

Make sure all your users know what to do and not do when a problem arises with data.

Put your NAS units on a UPS and enable auto shutdown before the UPS runs out of juice.

VirtualLarry said:
Most definitely a good idea. I've done that with one of my backups of an old computer. I normally back up to 4GB-sized files, so I created a PAR2 recovery set of an appropriate size (like 10-20% PAR2 chunks? I forget what the default is). Then I back up a certain number of 4GB files to BD-R, along with a certain number of the PAR2 chunks, and spread them out so that losing one disc might still allow recovery of the rest. You should probably calculate that all out, and not "wing it".

PARing can work, of course, but there are already lots of PAR or parity-based systems that we can and do employ. I'm more concerned at this point with the idea of a backup medium that will survive time. As you pointed out, using parity to rebuild data if one piece of the pool is lost is common these days on arrays. A ZFS pool with 2 parity drives, for example, would allow even more loss and still be able to rebuild. The problem is the drives need to be refreshed every couple of years, so it's not a backup, just an uptime solution.

Very best,
 
  • Like
Reactions: killster1

killster1

Banned
Mar 15, 2007
6,208
475
126
Main issue with a safe is that the thief doesn't know the contents only have value to you, and assumes it's full of cash etc.
Well, as I said, it's backed up many times in different locations; the point is flood and fire proofing, but I really would like to see how long it takes to saw through a 1000-pound safe. It's not a storage locker like the ones I watched them saw through. 1700F for one hour fire rating.

On another note, I heard Google business accounts gave unlimited cloud storage; maybe that has changed?? Guess I could live with 4TB of cloud. Right now I am paying for a stupid OneDrive where they deleted the data because I didn't log in for so long. Been too lazy to ask them to try and retrieve it; since I never lost the data and don't really need it, I should just cancel.
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
I have one external hard drive in an enclosure attached to my PC for the first copy. It gets updated when I add files to my PC. The whole drive is encrypted with BitLocker (Windows 10 Pro).

I also keep an additional encrypted hard drive in our safe as the second local copy. It gets updated a few times a year (unless I add a significant amount of files that need to be backed up).

Lastly, I use cloud storage as the last step, in case I would lose my local drives due to weather, fire, or theft. I encrypt the files before they get uploaded, and then the cloud service encrypts them as well.
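Since 7-Zip was mentioned earlier as one way to do the client-side encryption, here's a minimal sketch of that pre-upload step (assuming the 7z command-line tool; the paths and the way the passphrase is obtained are placeholders only, not my actual setup):

Code:
# Sketch: encrypt a folder client-side with 7-Zip before handing it to any
# cloud sync client. Assumes the "7z" CLI is installed; paths and passphrase
# handling are placeholders -- don't hard-code real passwords.
import getpass
import subprocess

source = "/data/documents"
archive = "/data/staging/documents.7z"
passphrase = getpass.getpass("Archive passphrase: ")

subprocess.run(
    ["7z", "a", "-t7z", f"-p{passphrase}", "-mhe=on", archive, source],
    check=True,
)
# -mhe=on also encrypts the file names inside the archive, so the cloud
# provider only sees an opaque blob. Upload the .7z with whatever sync
# client you already use.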
 
  • Like
Reactions: MalVeauX

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
I back up important stuff (family photos and videos, documents, a downloaded software archive, personal project notes, as well as source code and assets for old mod projects, basically stuff that I can't replace) to an external disk weekly. There are two large disks and they rotate. I should shuttle them back and forth to a remote location, but I don't have a good remote location and realistically I'm too lazy to do that level of work every week. I have physical power shut-offs for the not-in-use disk, and I swap those every week after briefly checking the backup log.

There is less important stuff, like movies, which is split between the two disks because there isn't enough space to hold two full copies of both archives, but it's still backed up every other week. This stuff counts as "would be very annoying and time-consuming to reacquire". My plan is that if the important stuff grows too large, I will stop backing the movies up to make space for it instead.

It's all stored on an Unraid server, which has a parity disk so a single drive failure can be endured. I'm not sure I will upgrade it to dual parity, but it's an option. I prefer Unraid because my strategy is "get full use out of the full life of a bunch of garbo disks I have lying around". I spin the disks down to save power (I believe the damage this supposedly causes is overblown, and I intend to use the disks until they are worn out anyway).

The Unraid server also runs my VMs. They sit on a Btrfs disk outside the array. Every weekend a script shuts down all the VMs so they are in a clean state, then creates a read-only snapshot of the VM location before starting them all back up. Backup runs off the read-only snapshot, and downtime is only a few minutes or so... not that anyone would actually notice.
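For anyone curious, the general shape of such a weekend job looks roughly like this sketch (assuming libvirt's virsh and a Btrfs subvolume; the VM names, paths and timeout are placeholders, not my actual setup):

Code:
# Sketch: quiesce libvirt VMs, take a read-only Btrfs snapshot of their storage,
# then start them again. Assumes virsh and btrfs-progs are available; the VM
# names, paths and timeout are placeholders for illustration.
import subprocess
import time

VMS = ["fileserver", "win10-test"]          # placeholder domain names
VM_VOLUME = "/mnt/vmstore"                  # Btrfs subvolume holding the images
SNAP_DIR = f"/mnt/vmstore/.snapshots/weekly-{int(time.time())}"

def run(*cmd):
    subprocess.run(cmd, check=True)

# 1) Ask each VM to shut down cleanly, then wait until it reports "shut off".
for vm in VMS:
    run("virsh", "shutdown", vm)
for vm in VMS:
    for _ in range(60):                     # up to ~5 minutes per VM
        state = subprocess.run(["virsh", "domstate", vm],
                               capture_output=True, text=True).stdout.strip()
        if state == "shut off":
            break
        time.sleep(5)

# 2) Read-only snapshot of the whole VM volume while everything is quiesced.
run("btrfs", "subvolume", "snapshot", "-r", VM_VOLUME, SNAP_DIR)

# 3) Bring the VMs back; the backup job can then copy from SNAP_DIR at leisure.
for vm in VMS:
    run("virsh", "start", vm)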

I do need to set up a bitrot detection scheme; I have some ideas for it. I have tested the backups... as part of a mistake I made during an upgrade, so not how you're supposed to test them lol.
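The simplest scheme I can think of is a checksum manifest that gets re-verified periodically; a bare-bones sketch (paths are placeholders, and dedicated tools or a checksumming filesystem do this job better):

Code:
# Sketch: a bare-bones bitrot check -- hash every file once, store the digests,
# and re-run later to list files whose contents changed or went missing (it
# can't tell bitrot from intentional edits). Paths are placeholders.
import hashlib
import json
import pathlib
import sys

ROOT = pathlib.Path("/mnt/user/important")
MANIFEST = pathlib.Path("/mnt/user/important.manifest.json")

def hash_file(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(root: pathlib.Path) -> dict:
    return {str(p.relative_to(root)): hash_file(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

if MANIFEST.exists():
    old = json.loads(MANIFEST.read_text())
    new = scan(ROOT)
    changed = [name for name, digest in old.items()
               if name in new and new[name] != digest]
    missing = [name for name in old if name not in new]
    print("changed:", changed or "none")
    print("missing:", missing or "none")
    sys.exit(1 if changed or missing else 0)
else:
    MANIFEST.write_text(json.dumps(scan(ROOT), indent=2))
    print("manifest written")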
 
  • Like
Reactions: MalVeauX

MalVeauX

Senior member
Dec 19, 2008
653
176
116
What are you all using to do your backups? Manual? Or some software that schedules & syncs multiple sources, both local and over the network?

Very best,
 

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
I've been using Bvckup 2 in a VM. I wish I had a Linux tool that worked the same way, but I haven't found one. I'm not super happy with this setup, but I think I'm just going to tighten it down some for now since, despite being weird, it has worked pretty reliably.
 
  • Like
Reactions: MalVeauX

piokos

Senior member
Nov 2, 2018
554
206
86
Currently I use a pretty simple system, designed from a risk perspective (I guess that's what working in risk management does to your brain).

The risk approach means you don't start from how often you want to back up or how important the data is, but from the situation in which a particular backup would have to be recovered.
For me all data is important. If something isn't important, I just delete it.
However, this is where reality kicks in: I use the cloud as my main backup and I can't afford to keep everything there. So as of today my photos and videos are only partially backed up to the cloud (the last 5 years or so).

This simplifies my situation since I only consider 2 scenarios:
1) primary copy lost - when just a single PC / server is affected (corrupted files, electronics failure, PC lost/stolen outside the house etc.)
2) all copies at home lost (theft, flood, fire etc.)

The important part is that I don't treat any copy at home as a backup. They don't provide any additional protection.

Solutions:
1) I don't store unique data on PCs or small home servers. Everything is synced to the NAS as soon as a device (PCs, smartphones, cameras) connects to the home network.
2) my NAS is synced to the cloud, and I regularly update the offsite copy.
That's it. Super simple. No second drives hidden behind the toilet. :)

Since some data is not kept in the cloud (and realistically never will be), I'm exposed to situations where the offline backup - the only complete copy - becomes unavailable. I'm looking for a place for a second copy.
On the other hand, since I live in an area that is not exposed to large natural disasters, keeping the offline storage in the same city is OK.
 
  • Like
Reactions: MalVeauX

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
I've often thought of taking a single large cheap hard disk with a Raspberry Pi, plopping it down at my parents' house and then setting up an rsync schedule. My upload bandwidth is terrible though. And spreading my stuff out increases my workload to maintain it. That said, cloud services seem to shut down and change terms often, making them not really set-and-forget affairs either.
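The nightly push itself would be simple enough, something like this sketch run from cron (the host name, paths and bandwidth cap are placeholders, not an actual setup):

Code:
# Sketch: push the important directories to a Raspberry Pi over ssh with rsync,
# throttled so it doesn't saturate a weak upload link. Host, paths and the
# bandwidth cap are placeholders; run it from cron or a systemd timer.
import subprocess

SOURCES = ["/mnt/user/photos/", "/mnt/user/documents/"]
DEST = "pi@parents-house:/mnt/backup/"
BWLIMIT_KBPS = 400   # leave headroom on a slow upload

for src in SOURCES:
    subprocess.run(
        ["rsync", "-az", "--delete", f"--bwlimit={BWLIMIT_KBPS}",
         "-e", "ssh", src, DEST],
        check=True,
    )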
 
  • Like
Reactions: MalVeauX

piokos

Senior member
Nov 2, 2018
554
206
86
Interesting; thanks!

What do you use to sync all your devices to the NAS (software)?
On Linux machines I use Synology Drive Client (since I use a Synology NAS). Other NAS makers offer similar software.
I don't know any independent solutions for DIY NASes well enough to recommend.

On Windows - depending on file location - it could be either:
Windows -> OneDrive -> NAS (most files in C:\Users, current projects etc.)
or
Windows -> NAS -> OneDrive (photos, media, Downloads etc.)

I do this because it instantly syncs files between laptop and smartphone when I'm out.
Pushing everything through NAS is possible as well, but it's quite a bit slower and I'd have to be on VPN almost all the time.

A big chunk of my data is photos and videos. And since I use Adobe Lightroom, all photos have to go through a Windows PC.
I use a dedicated "temp" directory where I dump files from all sources (cameras, smartphones). It's synced between my Windows PC and OneDrive.
Once in a while I import these files into Lightroom (they're renamed and moved to a proper photo directory structure). Afterwards they're treated like any other file on the PC (Windows -> NAS [-> OneDrive]).
 
  • Like
Reactions: MalVeauX

piokos

Senior member
Nov 2, 2018
554
206
86
PingSpike said:
I've often thought of taking a single large cheap hard disk with a Raspberry Pi, plopping it down at my parents' house and then setting up an rsync schedule. My upload bandwidth is terrible though.
You're doing that many file operations? You could always limit this to the most important stuff.

The stuff I really need backed up - personal documents, photos, my coding projects - isn't changing that fast. In the case of my projects, surely not as fast as I would want. :D

Of course I don't back up my whole drives. Windows, programs etc. are excluded - I do a monthly partition image instead.
I also don't back up stuff that can be easily downloaded from the net.
For example: I keep local repos of many large projects from GitHub, and these are excluded from all of my backups and cloud syncs - partly because of frequent changes, but primarily because the sheer number of files can really burden file indexing.
PingSpike said:
And spreading my stuff out increases my workload to maintain it.
Depends on the solution. I'm considering just buying a simple Synology for my mother (like the cheapest DS120j) and reserving some space on it for Synology Hyper Backup.
Hyper Backup can target rsync servers as well, but I have no place to keep one. :/
PingSpike said:
That said, cloud services seem to shut down and change terms often, making them not really set-and-forget affairs either.
I'm not sure which cloud services you've used. Maybe smaller (cheaper) companies do that. AWS, Google and Azure (and big dedicated solutions like Backblaze) probably aren't going anywhere. Surely they'll last longer than any DIY server / NAS you may set up on premises. :)
 
  • Like
Reactions: MalVeauX