Use dedicated software that can create verifiable, versioned backup files, like Veeam or Macrium, or the newer generation like Duplicacy, Arq, or Borg. All of them have integrity verification built in.
if you use real backups, and not just simple copies, then your backup software has a verify function. For simple copies you should use hash files, or something that can build a hash database and verify against it. Btw, you should already be using hash checking for live data anyway. For archiving you can create WinRAR archives with a 10% recovery record, so they can self-verify and self-repair easily.
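For the simple-copies case, here is a minimal sketch in Python of what "build a hash database and verify against it" means. The function names and the use of SHA-256 are my own choices for illustration, not a specific tool's API:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash the file in chunks so large files don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_hash_db(root: Path) -> dict[str, str]:
    """Map each file's relative path under root to its SHA-256 digest."""
    return {str(p.relative_to(root)): sha256_of(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def verify(root: Path, db: dict[str, str]) -> list[str]:
    """Return the relative paths whose current hash no longer matches the db."""
    return [rel for rel, digest in db.items()
            if sha256_of(root / rel) != digest]
```

In practice you would save the database to a file and re-run the verify step periodically; tools like hashdeep or plain `sha256sum -c` do the same job.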
I have daily image backups of the OS drives with a history of 60-90 days.
my Windows is also stable, except when the SSD kills itself upon a sudden power loss…
we don’t back up. when our HDD goes wrong, we just post on r/DataHoarder
.rar with a 10% recovery record
use a USB 3 dock. USB 2 is slow.
please ask this every day
you should start creating your backup plan if you don’t have one yet.
it does not change the data transfer speed. It just enables the write cache, which is a small(ish) memory buffer that holds the data while it is written to the disk in the background.
Everything takes the same time to write to the drive; the only difference is that the software writing to the disk will think the data is written, while in reality it is still only in memory and the disk is still working in the background. That data will be lost if you disconnect the drive without properly ejecting it, if Windows crashes, on a sudden power loss, or if anything else goes wrong.
It has some effect on small parallel random writes, but it doesn’t do much for sequential performance with large files. You can easily compare the settings with your real-life usage, but don’t expect much difference.
cool, looks like it was a “real” error then. For me, random errors appear and disappear from time to time; that’s why I wrote about surprises. It takes some learning, but Victoria is pretty good for fiddling with these errors once you finally understand what it does.
you can run a full read scan with HD Sentinel, Victoria, or anything else you like. That will read each sector and also show which file sits on that sector. On an SSD everything works magically differently though, so expect some surprises.
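The idea of a full read scan can be sketched at the file level in Python: read everything block by block and record the offsets where a read fails. The block size and error handling here are my assumptions for illustration, not how HD Sentinel or Victoria actually work internally:

```python
from pathlib import Path

BLOCK = 1 << 20  # scan in 1 MiB chunks

def read_scan(path: Path) -> list[int]:
    """Read the whole file and return byte offsets where a read failed."""
    bad = []
    with open(path, "rb") as f:
        offset = 0
        while True:
            try:
                chunk = f.read(BLOCK)
            except OSError:
                bad.append(offset)        # unreadable region, e.g. a bad sector
                f.seek(offset + BLOCK)    # skip past it and keep scanning
                chunk = b"\0"             # non-empty placeholder so we continue
            if not chunk:
                break
            offset = f.tell()
    return bad
```

A real surface scan works on the raw device below the filesystem, which is what lets those tools map a failing sector back to the file that occupies it.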
I don’t understand why everybody is fanatical about RAID in a home environment… Until you have a bulletproof backup, you shouldn’t even think about it.