Question: How many backups do you keep?

Nov 17, 2019
One? Two? Three? More?

I'm talking weekly backups of the full drive. My weekly saved my butt royally, but in looking at the drive they're stored on, I see them back to October. Is that doing anything other than wasting space on the storage drive?
 

Insert_Nickname

Diamond Member
May 6, 2012
4-5 depending on how you count. For irreplaceable data I do the following:

There is the primary in use file. A copy on an external drive. Another on a generally powered-off and disconnected different external drive. A backup on my NAS*, and finally a burned copy on blu-ray stored at another location.

First protects against random drive failure. Second protects against power issues, ransomware etc. Third is for bit-rot protection, don't want data corruption issues creeping up. Last is for general safety against burglary, natural disasters and such.

I generally don't bother with OS backups, since I find it easier to just reinstall the OS if anything happens. My data is important, not the OS.

*custom built with ZFS for bit-rot protection.
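
For the bit-rot part, a minimal sketch of what the periodic check on a ZFS NAS can look like (the pool name "tank" is just a placeholder):
Code:
# Walk every block in the pool, verify checksums and repair from redundancy where possible.
zpool scrub tank

# Review the result (checksum errors, repaired bytes) afterwards.
zpool status -v tank

# A typical cron entry to run the scrub monthly (1st of the month, 03:00):
# 0 3 1 * * /sbin/zpool scrub tank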
 

fcorbelli

Junior Member
May 26, 2021
github.com
Almost forever.
Using the open-source, multiplatform zpaqfranz (my fork) or zpaq (the original) you can keep the data forever.
In this example, a file server from 2018 to 2021 (~500,000 files, ~500 GB) is packed into ~877 GB:

Code:
zpaqfranz v51.27-experimental snapshot archiver, compiled May 26 2021
fserver_condivisioni.zpaq:
Block          1 K       10.526 (block/s)
(...)
Block        102 K        3.584 (block/s)
1042 versions, 1.538.727 files, 15.716.105 fragments, 877.457.003.477 bytes (817.20 GB)
Long filenames (>255)     4.526

Version(s) enumerator
-------------------------------------------------------------------------
< Ver  > <  date  > < time >  < added > <removed>    <    bytes added   >
-------------------------------------------------------------------------
00000001 2018-01-09 16:56:02  +00308608 -00000000 ->      229.882.913.501
00000002 2018-01-09 18:06:28  +00007039 -00000340 ->           47.356.864
00000003 2018-01-10 15:06:25  +00007731 -00000159 ->            7.314.709
(...)
00001039 2021-05-02 17:17:42  +00030599 -00031135 ->       12.657.155.316
00001040 2021-05-03 17:14:03  +00000960 -00000095 ->          398.358.496
00001041 2021-05-04 17:13:40  +00000605 -00000004 ->           95.909.988
00001042 2021-05-05 17:15:13  +00000579 -00000008 ->           82.487.415

It works more or less like snapshots (if you know ZFS) or Time Machine.

The data is simply never deleted, always added.

Normally, every three years or so, you rename the backup file (without deleting it!) so that restores stay fast; the next run then starts a fresh archive.
If you are curious you can find the executables for Windows 32/64 and the source on GitHub / SourceForge.

I normally use it on FreeBSD machines, so it definitely compiles there too.
Sometimes I also check the port on Linux, non-Intel QNAP (!) and ESXi.
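
As a rough sketch of the routine behind a listing like the one above (the archive name is taken from the listing; the data path is invented, and the same commands exist in the original zpaq):
Code:
# Daily run: append today's state of the share to the archive.
# Only new or changed blocks are stored; each run becomes one more "version".
zpaqfranz a /backup/fserver_condivisioni.zpaq /data/condivisioni

# List the versions accumulated so far.
zpaqfranz l /backup/fserver_condivisioni.zpaq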
 

fcorbelli

Junior Member
May 26, 2021
github.com
The same goes for virtual machines.
In this "real world" example, a Windows 2008 R2 MS-SQL server's ~350 GB .vmdk, copied daily for about 4 months, fits in 189 GB:
Code:
root@f-server:/copia1/copiepaq/spaz2020 # zpaqfranz i 15-05-2021-macchinavirtuale.zpaq
zpaqfranz v51.27-experimental snapshot archiver, compiled May 26 2021
15-05-2021-macchinavirtuale.zpaq:
126 versions, 290 files, 15.495.782 fragments, 189.762.363.335 bytes (176.73 GB)

Version(s) enumerator
-------------------------------------------------------------------------
< Ver  > <  date  > < time >  < added > <removed>    <    bytes added   >
-------------------------------------------------------------------------
00000001 2021-01-04 16:38:37  +00000011 -00000000 ->       45.149.547.720
00000002 2021-01-05 18:49:27  +00000003 -00000001 ->          278.216.143
00000003 2021-01-06 18:39:19  +00000002 -00000000 ->          216.609.487
00000004 2021-01-07 18:41:00  +00000002 -00000000 ->          233.273.768
00000005 2021-01-08 18:49:33  +00000002 -00000000 ->          218.684.612
00000006 2021-01-09 18:45:58  +00000002 -00000000 ->          212.519.557
00000007 2021-01-10 18:44:58  +00000002 -00000000 ->          210.964.209
00000008 2021-01-11 18:47:06  +00000002 -00000000 ->          969.157.474
00000009 2021-01-12 18:52:24  +00000002 -00000000 ->        1.016.607.767
00000010 2021-01-13 18:43:08  +00000002 -00000000 ->          610.777.735
00000011 2021-01-14 18:49:49  +00000002 -00000000 ->        2.658.699.818
00000012 2021-01-15 19:25:33  +00000009 -00000000 ->          641.989.918
00000013 2021-01-16 18:36:37  +00000002 -00000000 ->          285.394.708
00000014 2021-01-18 19:22:55  +00000002 -00000000 ->          519.225.307
00000015 2021-01-19 18:33:37  +00000002 -00000000 ->          523.761.748
00000016 2021-01-20 18:40:36  +00000002 -00000000 ->          427.725.303
00000017 2021-01-21 19:40:04  +00000002 -00000000 ->          429.508.570
00000018 2021-01-22 18:46:11  +00000009 -00000002 ->        1.438.297.091
00000019 2021-01-23 18:45:25  +00000002 -00000000 ->          133.205.506
00000020 2021-01-24 18:40:46  +00000002 -00000000 ->           98.337.838
00000021 2021-01-25 18:32:02  +00000002 -00000000 ->          514.671.565
00000022 2021-01-26 19:00:33  +00000002 -00000000 ->          475.025.170
00000023 2021-01-27 18:51:46  +00000002 -00000000 ->          573.309.368
00000024 2021-01-28 18:47:29  +00000002 -00000000 ->        1.254.912.246
00000025 2021-01-29 19:05:03  +00000002 -00000000 ->       14.064.142.238
00000026 2021-01-30 18:39:50  +00000002 -00000000 ->          137.102.715
00000027 2021-01-31 18:39:36  +00000002 -00000000 ->           78.702.323
00000028 2021-02-04 18:52:09  +00000009 -00000000 ->        1.367.201.111
00000029 2021-02-05 19:45:07  +00000002 -00000000 ->        2.439.368.844
00000030 2021-02-06 18:42:43  +00000002 -00000000 ->          185.851.686
00000031 2021-02-07 18:40:17  +00000002 -00000000 ->           98.279.603
00000032 2021-02-08 18:24:12  +00000002 -00000000 ->          658.679.189
00000033 2021-02-09 19:28:10  +00000002 -00000000 ->          724.191.963
00000034 2021-02-10 18:43:14  +00000002 -00000000 ->          387.756.262
00000035 2021-02-11 18:41:59  +00000002 -00000000 ->          926.983.150
00000036 2021-02-12 18:46:21  +00000002 -00000000 ->        2.338.200.084
00000037 2021-02-13 18:40:29  +00000002 -00000000 ->          128.782.533
00000038 2021-02-14 18:40:50  +00000002 -00000000 ->           94.599.691
00000039 2021-02-15 18:43:13  +00000002 -00000000 ->          662.639.195
00000040 2021-02-16 19:04:57  +00000002 -00000000 ->        4.064.211.173
00000041 2021-02-17 18:44:56  +00000002 -00000000 ->          783.775.666
00000042 2021-02-18 18:47:52  +00000002 -00000000 ->          651.089.077
00000043 2021-02-19 19:14:12  +00000002 -00000000 ->        1.691.581.136
00000044 2021-02-20 18:44:25  +00000002 -00000000 ->          143.413.834
00000045 2021-02-21 18:42:21  +00000002 -00000000 ->          115.394.237
00000046 2021-02-22 19:04:43  +00000009 -00000000 ->          604.687.090
00000047 2021-02-23 18:45:54  +00000002 -00000000 ->          459.653.388
00000048 2021-02-24 18:47:39  +00000002 -00000000 ->          462.166.546
00000049 2021-02-25 19:30:45  +00000002 -00000000 ->          638.437.403
00000050 2021-02-26 18:52:36  +00000002 -00000000 ->          400.949.563
00000051 2021-02-27 18:45:10  +00000002 -00000000 ->          144.771.866
00000052 2021-02-28 18:45:16  +00000002 -00000000 ->          251.842.831
00000053 2021-03-01 18:30:39  +00000002 -00000000 ->          453.201.723
00000054 2021-03-02 18:48:35  +00000002 -00000000 ->          573.961.549
00000055 2021-03-03 18:49:16  +00000002 -00000000 ->          469.802.426
00000056 2021-03-04 18:47:54  +00000002 -00000000 ->          467.278.684
00000057 2021-03-05 18:49:34  +00000002 -00000000 ->          378.171.663
00000058 2021-03-06 18:47:25  +00000002 -00000000 ->          141.843.431
00000059 2021-03-07 18:45:35  +00000002 -00000000 ->          114.217.568
00000060 2021-03-08 18:31:26  +00000002 -00000000 ->          470.715.529
00000061 2021-03-09 18:47:08  +00000002 -00000000 ->          688.527.419
00000062 2021-03-10 18:50:46  +00000002 -00000000 ->          851.620.501
00000063 2021-03-11 19:43:29  +00000002 -00000000 ->          538.428.648
00000064 2021-03-12 18:59:46  +00000002 -00000000 ->          624.828.008
00000065 2021-03-13 19:09:20  +00000002 -00000000 ->           70.332.788
00000066 2021-03-14 18:46:05  +00000002 -00000000 ->          142.919.035
00000067 2021-03-15 18:32:20  +00000002 -00000000 ->          518.975.358
00000068 2021-03-16 18:48:06  +00000002 -00000000 ->        3.586.397.423
00000069 2021-03-17 18:47:18  +00000002 -00000000 ->          415.249.843
00000070 2021-03-18 18:49:29  +00000002 -00000000 ->          661.228.265
00000071 2021-03-19 19:18:41  +00000002 -00000000 ->          591.821.209
00000072 2021-03-20 18:44:15  +00000002 -00000000 ->          156.256.438
00000073 2021-03-21 18:45:47  +00000002 -00000000 ->          109.317.547
00000074 2021-03-22 18:35:16  +00000002 -00000000 ->          523.361.320
00000075 2021-03-23 18:48:49  +00000002 -00000000 ->        8.152.305.730
00000076 2021-03-24 18:50:08  +00000002 -00000000 ->          674.542.534
00000077 2021-03-25 18:51:13  +00000002 -00000000 ->          562.785.574
00000078 2021-03-26 18:58:07  +00000002 -00000000 ->          725.138.887
00000079 2021-03-27 18:47:52  +00000002 -00000000 ->          158.700.870
00000080 2021-03-28 17:48:06  +00000002 -00000000 ->          137.939.498
00000081 2021-03-29 17:31:02  +00000002 -00000000 ->          739.337.623
00000082 2021-03-30 18:14:29  +00000002 -00000000 ->        7.465.722.204
00000083 2021-03-31 17:51:05  +00000002 -00000000 ->        1.267.026.725
00000084 2021-04-01 17:48:14  +00000002 -00000000 ->        1.604.504.042
00000085 2021-04-02 18:04:16  +00000002 -00000000 ->        1.517.751.104
00000086 2021-04-03 17:53:54  +00000002 -00000000 ->          180.345.315
00000087 2021-04-04 17:45:45  +00000002 -00000000 ->          338.825.767
00000088 2021-04-05 17:26:59  +00000002 -00000000 ->          325.496.165
00000089 2021-04-06 17:49:08  +00000002 -00000000 ->          660.531.157
00000090 2021-04-07 17:54:01  +00000002 -00000000 ->        1.012.921.296
00000091 2021-04-08 17:49:42  +00000002 -00000000 ->        1.214.261.842
00000092 2021-04-09 17:47:26  +00000002 -00000000 ->          653.887.678
00000093 2021-04-10 17:45:29  +00000002 -00000000 ->          271.828.291
00000094 2021-04-11 17:47:35  +00000002 -00000000 ->          465.468.135
00000095 2021-04-12 17:34:34  +00000002 -00000000 ->          963.529.795
00000096 2021-04-13 17:47:20  +00000002 -00000000 ->          682.595.486
00000097 2021-04-14 17:48:31  +00000002 -00000000 ->          611.244.113
00000098 2021-04-15 18:36:23  +00000002 -00000000 ->          691.029.161
00000099 2021-04-16 17:49:44  +00000002 -00000000 ->          798.070.847
00000100 2021-04-17 17:46:23  +00000002 -00000000 ->          159.940.908
00000101 2021-04-18 17:47:06  +00000002 -00000000 ->          357.788.997
00000102 2021-04-19 17:32:03  +00000002 -00000000 ->          702.327.107
00000103 2021-04-20 17:50:54  +00000002 -00000000 ->        1.159.537.962
00000104 2021-04-21 18:02:51  +00000002 -00000000 ->          523.750.617
00000105 2021-04-22 17:49:19  +00000002 -00000000 ->          894.026.152
00000106 2021-04-23 17:51:17  +00000002 -00000000 ->          930.936.188
00000107 2021-04-24 17:46:42  +00000002 -00000000 ->          324.102.044
00000108 2021-04-25 17:46:15  +00000002 -00000000 ->          136.123.197
00000109 2021-04-26 17:33:23  +00000002 -00000000 ->        1.468.958.592
00000110 2021-04-27 17:52:47  +00000002 -00000000 ->        1.039.538.615
00000111 2021-04-28 17:56:09  +00000002 -00000000 ->          715.357.937
00000112 2021-04-29 17:52:15  +00000002 -00000000 ->        2.827.978.717
00000113 2021-04-30 17:54:00  +00000002 -00000000 ->        1.169.973.522
00000114 2021-05-01 17:47:43  +00000002 -00000000 ->          287.543.553
00000115 2021-05-02 18:23:51  +00000002 -00000000 ->          292.162.248
00000116 2021-05-03 17:32:23  +00000002 -00000000 ->        2.070.325.286
00000117 2021-05-04 17:50:21  +00000002 -00000000 ->        2.329.208.471
00000118 2021-05-05 17:51:21  +00000002 -00000000 ->        7.364.626.031
00000119 2021-05-06 19:23:52  +00000002 -00000000 ->        8.777.478.818
00000120 2021-05-08 17:49:17  +00000002 -00000000 ->        1.976.302.505
00000121 2021-05-09 17:48:32  +00000002 -00000000 ->          739.944.367
00000122 2021-05-10 17:32:06  +00000002 -00000000 ->          766.287.183
00000123 2021-05-11 17:50:21  +00000002 -00000000 ->          843.823.137
00000124 2021-05-12 17:49:16  +00000002 -00000000 ->        1.269.770.842
00000125 2021-05-13 18:29:39  +00000002 -00000000 ->       11.400.291.552
00000126 2021-05-14 18:04:38  +00000002 -00000000 ->        1.015.135.548

67.135 seconds (all OK)

So the answer is: use the right tool!
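
As a usage note (the target directory is invented for the example), restoring the .vmdk exactly as it was on a given day just means picking the version number from the listing and extracting with -until:
Code:
# Roll the archive back to version 100 (2021-04-17 in the listing above) and
# extract it into a scratch directory, leaving the live files untouched.
zpaqfranz x 15-05-2021-macchinavirtuale.zpaq -until 100 -to /restore/test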
 

fcorbelli

Junior Member
May 26, 2021
github.com
Insert_Nickname said:
4-5 depending on how you count. For irreplaceable data I do the following:

There is the primary in use file. A copy on an external drive. Another on a generally powered-off and disconnected different external drive. A backup on my NAS*, and finally a burned copy on blu-ray stored at another location.

First protects against random drive failure. Second protects against power issues, ransomware etc. Third is for bit-rot protection, don't want data corruption issues creeping up. Last is for general safety against burglary, natural disasters and such.

I generally don't bother with OS backups, since I find it easier to just reinstall the OS if anything happens. My data is important, not the OS.

*custom built with ZFS for bit-rot protection.
Using even "cheap" cloud services (like Dropbox), and zpaq, it's easy to keep a copy of data forever.
Typically, if an ssh server is available, you can make a copy overnight (zpaq'ed and rsync'ed) or every hour (a ZFS-synced replica).
But this requires FreeBSD on both source and target.
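
A rough sketch of both variants, with host names, dataset names and snapshot names invented for the example (and assuming a previous snapshot already exists on both sides for the incremental send):
Code:
# Overnight: push the (append-only) zpaq archive to a remote box over ssh.
# rsync --append only transfers the newly appended tail when the existing part is unchanged.
rsync --append -av /backup/fserver_condivisioni.zpaq user@offsite:/backup/

# Hourly: incremental ZFS replication between two FreeBSD machines.
zfs snapshot tank/data@now
zfs send -i tank/data@prev tank/data@now | ssh user@offsite zfs receive -F backup/data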
 

Insert_Nickname

Diamond Member
May 6, 2012
fcorbelli said:
Using even "cheap" cloud services (like Dropbox), and zpaq, it's easy to keep a copy of data forever.

I do hope this doesn't come across as too strong.

Firstly, "the cloud" doesn't exist. It's just a fancy name for someone else's computer(s). With all the implications thereof. Secondly, those cloud services have a distressing habit of shutting down without much warning.

Why should I trust "the cloud" with my irreplaceable data?
 

fcorbelli

Junior Member
May 26, 2021
github.com
Insert_Nickname said:
I do hope this doesn't come across as too strong.

Firstly, "the cloud" doesn't exist. It's just a fancy name for someone else's computer(s). With all the implications thereof. Secondly, those cloud services have a distressing habit of shutting down without much warning.

Why should I trust "the cloud" with my irreplaceable data?
If your data is important but not critical (i.e. you don't want to or cannot spend money), you can keep it, always encrypted, for example on Dropbox.

If, on the other hand, you want to and can spend (not a given in a home environment), you can rent a machine (so it becomes YOUR machine, at least as far as management goes) from OVH or Hetzner, for example, and store your irreplaceable data there, fully encrypted.

If you are fairly "paranoid" you can rent two (it cannot be ruled out that one will be destroyed, as happened to some OVH customers) or more, in different countries.
I do this every day, but the rental costs add up.

A "reliable" copying mechanism (with 7 copies, of which at least 5 are checked every day) can be found on the FreeBSD forum, with the related details.
I'm not sure if ZFS machines are very popular on this forum, but I could be wrong.

The key element is how much you can/want to spend.

PS: I administer a hundred physical and virtual UNIX servers almost all over the world, so I think I understand quite well what the "cloud" is.
 

fcorbelli

Junior Member
May 26, 2021
github.com
The original poster asked:
One? Two? Three? More?

I'm talking weekly backups of the full drive. My weekly saved my butt royally, but in looking at the drive they're stored on, I see them back to October. Is that doing anything other than wasting space on the storage drive?
It depends on the filesystem.
Are we talking about a Windows machine?
Or a more general situation?
Do you want to store DATA only, or SOFTWARE too?
For data (on Windows) I suggest something like this:
Code:
zpaqfranz a d:\mybackup.zpaq c:\data c:\pippo c:\photo
In this example everything goes (forever) into d:\mybackup.zpaq.

If you want to make an IMAGE of a Windows drive I suggest
http://www.drivesnapshot.de/en/index.htm (something like Ghost) with differential backups and pruning (keep only the last X).

If you want to make a copy (of the data) you can use an image, for example an encrypted container (TrueCrypt or VeraCrypt), to mount on Windows, Linux or whatever you want.
You write the data onto it.
Then you make a copy of it (unmounted) with zpaq / zpaqfranz.
This way you can be sure you have an identical (sector-by-sector) backup while still keeping it in a single file.

In this case the use of a container has a double benefit: it is always encrypted (which is good), and therefore its copy will also be encrypted.
=====
Real-world example (Windows).
Create a (let's say) X GB TC file to store your most sensitive data, and mount it to use it in Windows normally (it's practically identical, only slower).
When you want to make a backup you unmount it, then:
Code:
zpaqfranz a d:\mycontainerbackup.zpaq c:\data\mycontainer.tc -m 0
(-m 0 = no compression, which is useless on an encrypted container)

Repeat as often as you want, and you will get (inside d:\mycontainerbackup.zpaq) as many "snapshots" of your container as you like, ready to be restored, exact down to the single sector.

In this example, a 30 GB container with 63 copies fits in about 40 GB:
Code:
zpaqfranz v51.27-experimental snapshot archiver, compiled May 26 2021
provona.zpaq: 
Block          1 K       30.303 (block/s)
Block          2 K       12.658 (block/s)
63 versions, 1.617 files, 601.840 fragments, 42.649.102.406 bytes (39.72 GB)
Non-latin (UTF-8)             5

Version(s) enumerator
-------------------------------------------------------------------------
< Ver  > <  date  > < time >  < added > <removed>    <    bytes added   >
-------------------------------------------------------------------------
00000001 2020-11-27 09:37:36  +00000278 -00000000 ->       31.226.311.999
00000002 2020-11-28 09:34:09  +00000010 -00000002 ->           13.696.770
00000003 2020-11-30 08:46:36  +00000029 -00000005 ->          137.395.696
00000004 2020-11-30 08:56:16  +00000004 -00000000 ->            7.276.531
00000005 2020-11-30 09:13:47  +00000003 -00000000 ->            4.687.403
00000006 2020-11-30 09:20:15  +00000003 -00000000 ->            4.687.514
00000007 2020-11-30 09:21:15  +00000003 -00000000 ->            1.056.495
00000008 2020-11-30 09:23:38  +00000003 -00000000 ->            4.590.014
00000009 2020-11-30 09:25:46  +00000003 -00000000 ->            2.366.787
00000010 2020-11-30 09:27:56  +00000003 -00000000 ->            4.599.789
00000011 2020-11-30 09:32:13  +00000003 -00000000 ->            2.990.775
00000012 2020-11-30 09:33:14  +00000003 -00000000 ->            1.389.823
00000013 2020-11-30 09:35:00  +00000003 -00000000 ->            2.487.966
00000014 2020-11-30 09:36:33  +00000003 -00000000 ->            3.160.000
00000015 2020-11-30 09:38:57  +00000003 -00000000 ->            4.688.022
00000016 2020-11-30 10:03:43  +00000003 -00000000 ->            1.081.892
00000017 2020-11-30 10:04:41  +00000001 -00000000 ->        1.915.078.952
00000018 2020-11-30 10:17:50  +00000001 -00000000 ->               71.130
00000019 2020-11-30 10:21:04  +00000004 -00000000 ->            8.566.138
00000020 2020-11-30 10:27:08  +00000004 -00000000 ->            2.089.289
00000021 2020-11-30 10:35:13  +00000003 -00000000 ->            2.998.117
00000022 2020-11-30 10:37:12  +00000001 -00000000 ->            2.980.974
00000023 2020-11-30 16:21:42  +00000001 -00000000 ->          116.858.335
00000024 2020-12-02 18:00:47  +00000004 -00000001 ->           16.412.348
00000025 2020-12-03 17:58:58  +00000001 -00000000 ->           10.361.612
00000026 2020-12-05 14:23:06  +00000005 -00000000 ->           17.143.226
00000027 2020-12-05 17:29:47  +00000007 -00000000 ->           18.312.906
00000028 2020-12-06 10:39:08  +00000001 -00000000 ->            4.397.887
00000029 2020-12-10 09:02:38  +00000009 -00000005 ->           15.648.768
00000030 2020-12-12 10:28:07  +00000012 -00000000 ->            7.900.375
00000031 2020-12-15 10:08:35  +00000018 -00000000 ->           33.656.848
00000032 2020-12-17 08:49:23  +00000011 -00000001 ->           11.624.550
00000033 2020-12-19 10:12:13  +00000008 -00000000 ->           10.495.324
00000034 2020-12-27 12:55:52  +00000020 -00000000 ->          206.523.162
00000035 2021-01-01 12:25:45  +00000042 -00000001 ->           20.545.680
00000036 2021-01-07 18:30:39  +00000016 -00000001 ->        1.734.563.911
00000037 2021-01-14 12:38:35  +00000013 -00000000 ->          913.554.890
00000038 2021-01-19 09:50:01  +00000017 -00000000 ->          263.208.515
00000039 2021-01-20 18:27:06  +00000018 -00000000 ->           97.674.450
00000040 2021-01-24 14:56:44  +00000006 -00000000 ->            9.138.038
00000041 2021-01-28 13:43:43  +00000010 -00000009 ->          759.198.976
00000042 2021-01-28 13:51:15  +00000001 -00000000 ->            1.790.167
00000043 2021-01-30 13:41:37  +00000006 -00000000 ->            4.761.376
00000044 2021-01-31 17:41:20  +00000001 -00000000 ->            5.743.409
00000045 2021-02-07 15:20:45  +00000684 -00000000 ->          122.078.186
00000046 2021-02-09 10:40:34  +00000001 -00000000 ->            9.795.700
00000047 2021-02-10 18:09:25  +00000006 -00000001 ->          411.219.990
00000048 2021-02-22 14:03:21  +00000007 -00000000 ->        3.393.672.599
00000049 2021-02-26 11:13:55  +00000005 -00000000 ->            3.641.307
00000050 2021-03-05 14:21:27  +00000003 -00000000 ->           11.262.424
00000051 2021-03-05 14:42:13  +00000001 -00000000 ->            1.789.030
00000052 2021-03-22 07:51:53  +00000006 -00000001 ->          145.845.029
00000053 2021-03-30 09:35:10  +00000023 -00000000 ->            5.638.615
00000054 2021-04-11 08:43:23  +00000265 -00000229 ->          283.907.475
00000055 2021-04-11 18:54:52  +00000002 -00000000 ->            1.789.379
00000056 2021-04-12 09:55:21  +00000001 -00000000 ->            1.788.483
00000057 2021-04-12 12:32:17  +00000001 -00000000 ->            1.788.483
00000058 2021-04-17 12:53:56  +00000009 -00000001 ->            2.757.086
00000059 2021-04-17 13:05:21  +00000001 -00000000 ->           74.456.918
00000060 2021-04-17 13:22:28  +00000001 -00000000 ->            2.670.731
00000061 2021-04-30 09:16:44  +00000001 -00000000 ->          170.130.434
00000062 2021-05-03 14:59:42  +00000001 -00000000 ->           82.176.252
00000063 2021-05-24 12:08:22  +00000001 -00000000 ->          288.927.456

On every run ONLY the changed "sectors" (in fact zpaq blocks of the container; the full explanation would be long) are stored, using the minimum possible amount of media space.

So you don't have to worry about copying "everything" (robocopy, rsync, 7z, rar, whatever), because "everything" is already in the container file.

Of course you can also use VHDs, but in my opinion it's better to keep everything encrypted.
 

Insert_Nickname

Diamond Member
May 6, 2012
fcorbelli said:
If, on the other hand, you want to and can spend (not a given in a home environment), you can rent a machine (so it becomes YOUR machine, at least as far as management goes) from OVH or Hetzner, for example, and store your irreplaceable data there, fully encrypted.

Uhm. That's not foolproof either:

https://www.theregister.com/2021/03/10/ovh_strasbourg_fire/

(I actually know a guy whose business was almost ruined by that fire; losing a few private backups is small potatoes)

I'm glad you found a solution which works for you. It doesn't for me.
 

fcorbelli

Junior Member
May 26, 2021
github.com
Insert_Nickname said:
Uhm. That's not foolproof either:

https://www.theregister.com/2021/03/10/ovh_strasbourg_fire/

(I actually know a guy whose business was almost ruined by that fire; losing a few private backups is small potatoes)

I'm glad you found a solution which works for you. It doesn't for me.
If the guy wants to make a disaster recovery plan, that can be done very easily.
And yes, I have a solution for keeping backups forever.

As I said above: if you are fairly "paranoid" you can rent two (it cannot be ruled out that one will be destroyed, as happened to some OVH customers) or more, in different countries.
I do this every day, but the rental costs add up.
 

Ajay

Lifer
Jan 8, 2001
I keep 3 full backups per drive. Unfortunately, they are currently residing on my main system since my NAS went down. I need to setup another soon. I don't keep any data off premise atm.
 
Nov 17, 2019
Maybe I phrased it wrong since some of you seem to be typing in Swahili.

I use Acronis and it's scheduled to do a full back up of each of two drives once a week. I'm not really seeing a reason to keep more than three or four weekly backups for each drive.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
One thing I will say after getting smacked in the mouth with ransomware (on my primary home file server) is that at some point you should have cold backups.

As in, ones you do every so often and just keep aside. Like an external HD you hook up and back up to while you are working some day, then put away again, monthly or so. If your backups get corrupted or encrypted they are not useful.

My bacon was SAVED since I had a Windows backup set up to a dedicated drive that was therefore not shown in the normal view of the filesystems. I got a full backup at 1 am that was done minutes later; the encryption/shredding of my files started at 2 am. I had OneDrive set up, but I was gone for the weekend, so it got about 99% of my files before I pulled the plug (literally). OneDrive restore was useless because the attack renamed all the files, so all the OG files at that point just blipped out of existence (something about volume shadow copies and Windows).

I felt so lucky and was in shock as I managed to retrieve hundreds of gigabytes of bespoke data I thought lost.

Of course my Crashplan plan had expired just before this. I miss home Crashplan a lot.
 

Golgatha

Lifer
Jul 18, 2003
2 backups synced every few days, mirrored from my file server (9TB) to my desktop (12TB). Cold storage backup (12TB USB) made monthly of everything on the file server. Swap cold storage backup with an identical (12TB USB) external and make another monthly copy of the file server within a day or two. One cold storage is stored offsite. So, two working copies and two roughly month-old, cold storage copies. Also, I upload a copy of my most sensitive data to Google Drive. All backups are encrypted using TrueCrypt and data being uploaded to Google Drive is in a TrueCrypt file container.
 

BonzaiDuck

Lifer
Jun 30, 2004
On my Number One system I let my scheduled definition file for Macrium Reflect Workstation do one full backup monthly, a differential backup every Monday and an incremental backup every work-day. I may have scheduled old backups to be kept for two or three months, but I configured the program to delete old backups -- full, differential and incremental -- when there's less than X hundred gigabytes of free space remaining on the backup drive.

I also have individual programs, like Quicken, which create their own backups. Add MS Outlook to that list. Those backups are made to my home server upstairs, determined by the program -- I don't remember setting any parameters, but backups occur every couple days.

Other household systems are backed up by the server automatically with Windows Server 2012 R2 Essentials.

My document archive, film and recordings, music and a library of general software install programs are on the server. The important files are backed up from the server to its local backup disk with automated SyncBack SE, and the same inventory of important files is duplicated with folder duplication using StableBit DrivePool.

I don't like to lose stuff . . . .
 

fcorbelli

Junior Member
May 26, 2021
github.com
Ajay said:
I keep 3 full backups per drive. Unfortunately, they are currently residing on my main system since my NAS went down. I need to setup another soon. I don't keep any data off premise atm.
So your "story" is three copies, let's say three days, and it's all about data from a Windows machine.
If I understand correctly, I suggest you try zpaq, so as to have virtually unlimited backups (let's say realistically 1,000-2,000).
Free (opensource software) and tested for years

The original poster said:
Maybe I phrased it wrong since some of you seem to be typing in Swahili.

I use Acronis and it's scheduled to do a full back up of each of two drives once a week. I'm not really seeing a reason to keep more than three or four weekly backups for each drive.
In this case you can lose all the data accumulated in a week, and you cannot restore any deleted or overwritten (perhaps by mistake) documents older than about a month.
Of course it all depends on the importance of the data itself, but you can do better.


blckgrffn said:
One thing I will say after getting smacked in the mouth with ransomware (on my primary home file server) is that at some point you should have cold backups.
...
You can address this need, as I tried to explain, with mechanisms that are immune to ransomware and either free (up to a few GB, Dropbox for example) or paid (up to a few terabytes, let's say 120 euros / year).

Golgatha said:
2 backups synced every few days,... make another monthly copy of the file server within a day or two. One cold storage is stored offsite. So, two working copies and two roughly month-old, cold storage copies. Also, I upload a copy of my most sensitive data to Google Drive. All backups are encrypted using TrueCrypt and data being uploaded to Google Drive is in a TrueCrypt file container.
A month-old copy is just about useless in many situations.
You can send your <2GB TrueCrypt container wherever you want, for example to an always-connected NAS (cheap), or even via ssh (this requires a VDSL or better internet connection).
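
For example (host names and paths invented), after unmounting the container it can be pushed to a NAS share or a remote ssh box with nothing more than rsync or scp:
Code:
# To an always-on NAS share mounted locally:
rsync -av /data/mycontainer.tc /mnt/nas/backup/

# Or straight to a remote machine over ssh:
scp /data/mycontainer.tc user@remote:/backup/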


BonzaiDuck said:
On my Number One system I let my scheduled definition file for Macrium Reflect Workstation do one full backup monthly, a differential backup every Monday and an incremental backup every work-day. ...
I don't like to lose stuff . . . .
I don't see any backup verification policy there.
It's not so much the backups that need testing as the restores, in realistic scenarios.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
fcorbelli said:
You can address this need, as I tried to explain, with mechanisms that are immune to ransomware and either free (up to a few GB, Dropbox for example) or paid (up to a few terabytes, let's say 120 euros / year).

I have about TB I want to protect and drives I already own for other reasons. It's effectively free and more immune to more sophisticated attacks that work to corrupt data over a very long period of time.

I really swap my drives once a year. You might consider this a worthless interval, but I have years of family photos and other similar, non-changing files interspersed with my changing files.

It's also incredibly simple and requires no scripting, no third party tools and has darn near unlimited bandwidth because I have a recovery drive in my hand before the fire department is finished hosing down the wreckage of my home.

It's great you have a system that works for you. I've got a system that works for me and is superior considering the details of my use case. As all DR plans should consider.
 

BonzaiDuck

Lifer
Jun 30, 2004
fcorbelli said:
So your "story" is three copies, let's say three days, and it's all about data from a Windows machine.
If I understand correctly, I suggest you try zpaq, so as to have virtually unlimited backups (let's say realistically 1,000-2,000).
Free (opensource software) and tested for years

In this case you can lose all the data accumulated in a week, and you cannot restore any deleted or overwritten (perhaps by mistake) documents older than about a month.
Of course it all depends on the importance of the data itself, but you can do better.

You can address this need, as I tried to explain, with mechanisms that are immune to ransomware and either free (up to a few GB, Dropbox for example) or paid (up to a few terabytes, let's say 120 euros / year).

A month-old copy is just about useless in many situations.
You can send your <2GB TrueCrypt container wherever you want, for example to an always-connected NAS (cheap), or even via ssh (this requires a VDSL or better internet connection).

I don't see any backup verification policy there.
It's not so much the backups that need testing as the restores, in realistic scenarios.
I don't make "routine" tests, but everything has been tested through real events. I've had to restore two systems from the server backups, and it all works just fine. I've had to make at least three restorations for my Super-Dooper-Best desktop system, and it also works perfectly. Those latter restorations were necessary back in 2017 when I was configuring and then correcting a dual-boot system that would get borked by milestone Windows feature upgrades. And -- if I recall -- they may have been limited to boot-volumes, but if you can restore a boot-volume, you can easily restore a data disk.

Anything else -- our tablets and phones for instance -- have both cloud backup and limited file backup on my Super system (not the upstairs server). I don't use "the cloud" unless it is integral to the device and OS, such as my Android tablets. In this latter case, it's almost automatic when you set up your accounts, and it allows information sharing about apps and files across multiple tablets.

Macrium has backup templates, which address the day and time, the frequency of the backups, and whether they are full, differential or incremental. So you can define total backup management in profiles, including the management of space on the target backup disk. Then you schedule the backup according to a particular profile and don't bother with it anymore until you get a notice of backup failure. I may have that happen once every one or two years, and it always seems to be a problem of available storage on the backup disk. Mostly, Macrium manages the space on the backup disk without problem or intervention. The fixes are simple: you just offload the oldest full backup, its differential and incremental files which have the same backup ID string, and check that success occurred the following day.
 

Golgatha

Lifer
Jul 18, 2003
fcorbelli said:
A month-old copy is just about useless in many situations.
You can send your <2GB TrueCrypt container wherever you want, for example to an always-connected NAS (cheap), or even via ssh (this requires a VDSL or better internet connection).

For me it works. The cold storage has audio, video, home movies, pictures, game installers (GoG.com, etc), full drive images, driver backups, etc. Basically stuff that doesn't change much in a month's time. I have a 4TB drive image (R:\ for Restore) in each of the desktops for more timely backups and immediate restore operations (I use Macrium Reflect). The offsite cold storage (12TB USB) is for a "my house burned down" scenario. The onsite cold storage (12TB USB) is stored in a water and fire proof lockbox as well to try and keep it safe in the event of a fire or basement flood.

Regarding the whole disk backups, I think something a lot of people miss is testing those backups.
 

fcorbelli

Junior Member
May 26, 2021
github.com
BonzaiDuck said:
I don't make "routine" tests, but everything has been tested through real events...
Therefore, you are not sure that your data has not been changed, corrupted or lost.
You "hope" everything is OK, but you don't know it.
Of course it is a mechanism that is fine for a home user but, personally, I do not recommend it.
Any unverified backup is a risk.
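
For what it's worth, a minimal sketch of what verification can look like with this kind of archive (archive name reused from the earlier example; -test is the original zpaq switch that decompresses in memory and checks the stored hashes without writing anything):
Code:
# Internal consistency check: decompress everything in memory and verify the stored
# fragment hashes, without writing anything to disk.
zpaq x /backup/fserver_condivisioni.zpaq -test

# The stronger test is a full restore to scratch storage followed by a compare against
# the live data (exact restored paths depend on how the files were added):
zpaqfranz x /backup/fserver_condivisioni.zpaq -to /mnt/scratch/verify
diff -rq /data/condivisioni /mnt/scratch/verify/data/condivisioni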
 

fcorbelli

Junior Member
May 26, 2021
github.com
Golgatha said:
For me it works...

Regarding the whole disk backups, I think something a lot of people miss is testing those backups.
It works for you... but are you sure you cannot do something better?
As explained before, testing backups is essential.

In fact, I'm a storage manager and disaster recovery architect, so I'm more used to protecting corporate rather than home data.
But not being sure that your backups are good is always a "black hole".
 

fcorbelli

Junior Member
May 26, 2021
github.com
Turning back to the initial question: how many versions do I keep, and for how long?
With UNIX systems (not Windows filesystems): forever (zfs send).

The key point is to have a deduplication mechanism capable of minimizing the space used, and thus stretching retention to the maximum (virtually to infinity).
It is not "magic"; it is simply the elimination of the unchanged portions of the data (which, in general, are the majority).

I personally use zpaq and srep, especially the former (much more convenient).

Windows? Forever (data), about 3 months (daily image backups).
 

Red Squirrel

No Lifer
May 24, 2003
www.anyf.ca
I'd say around 3 copies. My backup setup is a bit of a hodge-podge and I've been wanting to make it better.

Basically I have local backups on each server that simply mirror stuff to a local backup folder with rsync. So that's 1.

Then the backup server grabs those backups and puts them in a separate area on that server, so that's 2.

And then finally I have some removable backup jobs I run once in a while that grab the data from the backups and put it onto individual drives. I will then have multiple copies of the same backups. So I'll count that as 3.

For things like code, I'll have the jobs set up to back up into a date-based folder like year/month/* to get versioning, but this is kind of wasteful of disk space.

Been wanting to come up with something better that can span jobs across multiple drives as well as keep versioning. Right now I need to subdivide the removable jobs to ensure they don't just fill the drives, but if I made something smarter it could keep track of each individual file, do versioning etc... and essentially just tell me which drive I need to insert. Kinda like tape-based solutions. It would then do incremental backups by only backing up the files that changed, but there would also be some rules I can set to keep at least N copies of a given folder. I have a bunch of small drives I don't use, so it would be perfect for a solution like this where they are essentially treated like tapes.
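
For illustration (paths invented), the date-based job described above boils down to something like the sketch below; rsync's --link-dest option is one stock way to keep that style of versioning without paying full disk space every run, since unchanged files are hard-linked against the previous snapshot instead of copied again:
Code:
#!/bin/sh
# Hypothetical date-based, versioned rsync job for a code directory.
SRC=/home/me/code
DEST=/backups/code/$(date +%Y/%m/%d)
PREV=/backups/code/latest

mkdir -p "$DEST"
# Unchanged files are hard-linked from the previous run; only changed files use new space.
rsync -a --link-dest="$PREV" "$SRC/" "$DEST/"

# Point "latest" at this run for the next invocation.
rm -f "$PREV"
ln -s "$DEST" "$PREV"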

All of my live data, including the backups, is also on RAID, so that adds a bit of extra fault tolerance. It's not a backup in itself but it adds an extra layer of safety against drive failures and the downtime caused by them. The removable backups are on individual drives though. Every now and then I look into tape, then realize it's not worth it. Very expensive.
 

fcorbelli

Junior Member
May 26, 2021
github.com
Red Squirrel said:
I'd say around 3 copies. My backup setup is a bit of a hodge-podge and I've been wanting to make it better.

Basically I have local backups on each server that simply mirror stuff to a local backup folder with rsync. So that's 1.

Then the backup server grabs those backups and puts them in a separate area on that server, so that's 2.

And then finally I have some removable backup jobs I run once in a while that grab the data from the backups and put it onto individual drives. I will then have multiple copies of the same backups. So I'll count that as 3.
...
If I understand right, you have 1.5 copies (the first two are the same, maybe; "maybe" because they are not checked).
In this case you cannot restore anything older than the third, and you will have a gap between copy 1 and copy 3.
Here you could really use ZPAQ technology to get:

- almost-forever retention of data
- copy 1 kept for speed of recovery (and made with different software, which is always good)
- a very fast copy 2
- a very fast copy 3, or whatever
- much, much more advanced checking of the copies (even the first, rsync'ed one)
- virtually no management complexity (if you can use robocopy or rsync, you can use zpaq too)
- easy encryption (optional)
- the ability to copy only modified data to removable media (USB stick), keeping the main stock (USB HDD) elsewhere
- if you use Linux or UNIX (BSD) you can do the same things
- if you have really, really important data, and you can sacrifice cheap SSDs (the cheaper, the better), you can have 100% confidence in the backups you make, by restoring and comparing them. That is the enterprise-level restore policy
- zero cost (open source)


What is ZPAQ?

It is NOT my software; I am NOT the original author.

So I take no credit, and even less money.

In a nutshell, it is an archiver (similar to 7z or RAR, just to give an idea) in which data is added in deduplicated form (like ZFS snapshots, or the Mac's Time Machine).

Since the developer (ex-Dell) has retired and no longer maintains it, I have developed my own fork (still open source, still on GitHub) which adds features, mainly around copy verification and ZFS/UNIX support (and much more) :)

It is certainly more evolved, "cooler", but it essentially does the same things as the original version (precisely to maintain file-format compatibility).

So my advice is: try it, the original 7.15 or mine, it doesn't matter.

They are free!
 

BonzaiDuck

Lifer
Jun 30, 2004
fcorbelli said:
Therefore, you are not sure that your data has not been changed, corrupted or lost.
You "hope" everything is OK, but you don't know it.
Of course it is a mechanism that is fine for a home user but, personally, I do not recommend it.
Any unverified backup is a risk.
If it's a matter of verification, the software does that.

I agree with you in concept. I was simply not in the habit of going through unnecessary restoration cycles just to "prove" that my backup system is working properly. But the real-world situations that compelled me to make restorations proved out.

One could ask "what if they didn't?" Given that I have older off-line backups of data, I would still lose some data with the dated backups. But they did "prove out" -- for each of the few different backup means and methods I regularly use.