
Re: Backup--OT



Really? I haven't seen incorrect copying for decades.

Then you probably haven't been looking for it. Incorrect copying is hard to spot when the OS or program does not report it. You just assume that your 100GB of user files is perfectly copied in the absence of any notice to the contrary. Yet I have found over the years that the occasional file gets degraded to the point of unusability. Obviously, I am only able to check a tiny fraction of my files.
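For what it's worth, here is a minimal sketch of the kind of check that catches this: record a checksum of every file at backup time and re-run the check later against the copy (or the originals). It's Python, the paths are hypothetical, and it is only meant to show the idea, not to be a finished tool.

import hashlib, json, pathlib

def sha256_of(path, chunk=1024 * 1024):
    # Hash a file in chunks so large files don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def build_manifest(root, manifest="manifest.json"):
    # Record a checksum for every file under root.
    root = pathlib.Path(root)
    entries = {str(p.relative_to(root)): sha256_of(p)
               for p in root.rglob("*") if p.is_file()}
    pathlib.Path(manifest).write_text(json.dumps(entries, indent=2))

def verify_manifest(root, manifest="manifest.json"):
    # Re-hash everything and report files that vanished or no longer match.
    root = pathlib.Path(root)
    for rel, expected in json.loads(pathlib.Path(manifest).read_text()).items():
        p = root / rel
        if not p.exists():
            print("MISSING:", rel)
        elif sha256_of(p) != expected:
            print("CORRUPT:", rel)

# build_manifest("D:/UserFiles")        # at backup time
# verify_manifest("E:/BackupCopy")      # against the copy, or later against the originals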

Here's some general information on the problem:

http://www.techsupportalert.com/Brilliant-File-Integrity-Verification-Utility

and here's a highly rated disk duplicator with some form of verification; however, it is several times more expensive than the one you are using, and I have not yet discovered which verification method (hash or bit-by-bit) it uses:

https://www.diskology.com/djstandard.html

The problem with disk verification is that it is slow; and every technique used to speed it up also compromises its accuracy.
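To make the hash vs. bit-by-bit distinction concrete, here is a rough sketch of the exhaustive approach; the file names are placeholders. Reading every byte of both copies is exactly what makes it slow, and any shortcut that samples only part of the data can miss damage.

def identical(src_path, dup_path, chunk=4 * 1024 * 1024):
    # Compare two copies chunk by chunk; stop at the first mismatch.
    with open(src_path, "rb") as a, open(dup_path, "rb") as b:
        while True:
            x, y = a.read(chunk), b.read(chunk)
            if x != y:
                return False
            if not x:          # both streams ended together with no mismatch
                return True

# identical("source.img", "duplicate.img")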

He recommends RAID. I was wondering if that wouldn't be a better solution than the disk duplicator. But the RAID enclosure would have to connect by some means to the computer, and wouldn't that slow things down?

When you say RAID here, you probably mean RAID 1 (two disks that simultaneously mirror one another). That is the safest though the least efficient (in terms of disk space) form of RAID.

There is no requirement whatever for a RAID enclosure. In a desktop, you simply use a RAID controller, and many motherboards have one built in. There are a few laptops with space for two drives and with hardware RAID capability, but these are growing scarce. For a laptop that will take two disk drives but does not have a hardware RAID controller, either MacOS or Windows will do software RAID 1.

The performance hit with hardware RAID can be negligible.

Running a system from an external drive without loss of speed would require a lot of high-end hardware and specialist intervention. I would think a Thunderbolt connexion would help.

RAID 1 itself is not a perfect guarantee against data corruption - - see the Wikipedia RAID article which raises a number of important issues. (And note that while the simultaneous aging of mechanical hard drives is raised, the simultaneous aging of SSDs is just as likely to bring problems.)

In sum, I feel that any collection of personal data at any time will inevitably reflect some data corruption. I think the best defense is to hold time capsules: periodic backups on M-Disc DVDs or Blu-rays. If a file is found to be corrupt in 2014, it may be possible to find an uncorrupted version on, for example, a 2012 M-Disc. (Or even, if you're very lucky, on conventional optical media.)

The problem of detecting which files, in a lifetime's aggregation of user files, are corrupt seems nearly insoluble. That's why I find the time-capsule approach attractive.
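As a rough illustration only (the folder names are made up), here is how one might search a shelf of dated capsules, each carrying a checksum manifest like the one sketched above, for the newest copy of a file that still matches the checksum recorded when it was burned:

import hashlib, json, pathlib

def sha256_of(path, chunk=1024 * 1024):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def newest_intact_copy(capsule_root, rel_path):
    # Capsule folders named like 2012-06, 2013-06, 2014-06 sort oldest to newest.
    for capsule in sorted(pathlib.Path(capsule_root).iterdir(), reverse=True):
        manifest = capsule / "manifest.json"
        candidate = capsule / rel_path
        if not (manifest.exists() and candidate.exists()):
            continue
        expected = json.loads(manifest.read_text()).get(rel_path)
        if expected and sha256_of(candidate) == expected:
            return candidate       # newest capsule whose copy is still good
    return None

# newest_intact_copy("F:/Capsules", "Documents/thesis.doc")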


--Harry


For anyone who wants further fuel for storage paranoia, Robin Harris's articles on NTFS corruption and SSD corruption are good starting points:

http://www.zdnet.com/blog/storage/how-ssds-can-hose-your-data/1423

http://www.zdnet.com/blog/storage/how-microsoft-puts-your-data-at-risk/169

(Further searches will reveal how Apple missed the boat on supporting a reliable file system; see also his pieces on M-Disc, which apparently finally has a '1000-year' Blu-ray - - topics discussed on this list last year.)

At 20/04/2014 19:24, you wrote:

That's interesting - - I didn't think you'd be easy to convince. And as I say, you've gotten me to doubt. I am assuming that there is some kind of verify mechanism?

Just that I put the backup drive in the machine and it boots.


Regarding ShadowProtect, I just now had a need: I accidentally downloaded something from Linkbury, which is one of the most annoying things ever. I had backed up an hour ago, so I thought it would be a cinch to restore from ShadowProtect. But, on my new Dell system with the 8 partitions and heaven only knows what sort of BIOS, I simply cannot get the recovery environment to work. It will boot, via a USB external CD, but then it hangs. There are many ways around this: I could just do it all on another system, but I am disturbed that I cannot do it on the host system. This is the first time I have had a problem with ShadowProtect. Harry, you brought it upon me!

The Shadow knows!


At 18/04/2014 17:53, you wrote:
Bill,

You've convinced me not to use any software solution. The best I can think of in using the disk duplicator is to get or fabricate some kind of cable that would allow me to keep the SSD *outside* of the computer. I've searched for these, but I don't think they exist. The idea would be to plug one end of the cable inside the laptop where the SSD connects and the other end of the cable to the SSD. Its purpose would only be to avoid having to unscrew the little door on the Thinkpad inside of which the SSD normally lives. Ideally the cable extender would be a Y-shaped one, so that the SSD would always be plugged into both the computer and the drive duplicator.

As to the one I use, it's available for $51 from newegg:

http://www.newegg.com/Product/Product.aspx?gclid=CJPqpNy_6r0CFQ2hOgoda2oAJw&Item=N82E16817422030&nm_mc=KNC-GoogleAdwords&cm_mmc=KNC-GoogleAdwords-_-pla-_-Hard+Drive+Enclosures-_-N82E16817422030&ef_id=U0QBfgAABVdzqSDN:20140418165301:s

Regards,
Harry

Hi Harry,

1. The files that ShadowProtect creates are not the image itself. (You will often choose to use some degree of compression.) They are the data SP uses to create the image on the target media. I don't know how to explain it any better than that - - it is confusing. I am not confident that I am using the vocabulary correctly.

2. So: let's say you have a hard drive that has failed, and you need to restore your system to a new hard drive from the SP backup you made an hour before the crash.

You need the SP program to take your backup data and write the image onto the new hard drive. I have done this in two ways.

a. by booting the SP disc in my target computer's CD drive (on a new system you could just as well boot from a USB stick), and restoring to the new hard drive from an external drive holding the SP data

or

b. by using a different computer that has SP installed. On this computer, I will have two external drives. Drive 1 is the SP backup; Drive 2 is the blank you want to image. In this case, I tell the SP program to restore Drive 2 from Drive 1. (Physically speaking, I do not place Drive 2, the new hard drive, into an actual enclosure; rather, I use Newertech's Universal Drive Adapter which connects, without any enclosure or fuss, any drive to a USB 2/3 port.)

You've forced me to think about this deeply unpleasant subject again, and I offer these observations:

1. SP seems to work. By contrast, I have not been happy with more or less recent versions of Ghost and Acronis. I was particularly upset when Ghost lost the ability to live clone one hard drive to another - - perhaps that has been fixed? Apart from that, both Ghost and Acronis have always been buggy for me.

2. There definitely are gotchas in SP. One of its characteristics is that some partitions will only back up in a full backup; they will not work in an incremental backup job. Therefore, if you don't want to see a 'backup failed' message, you have to remove those partitions from the incremental backup job. I don't like this.

3. More and more, I can see the point in using a hardware-only solution such as you have, and would be grateful if you pointed me to the website for the product. I do strenuously object to the requirement that you have to remove the source drive from the computer each time you want to do a backup, but if this is the price one has to pay for total duplication security, I might be willing to do it on an infrequent basis.

4. In addition to SP, I use http://www.memopal.com/ for continuous-to-cloud backup of data files. I have found this works very well so far. One thing I like about this system is that it backs up as soon as the file is saved to host disk, and saves multiple versions. This feature has saved my butt on many occasions. However, it would be tiresome to have to do a full restore of data files from this service.
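I have no idea how Memopal implements this internally, but as a sketch of the general "copy on every save, keep dated versions" mechanism, something along these lines captures the idea. It assumes the third-party Python watchdog package, and the watched and backup folders are hypothetical.

import shutil, time, pathlib
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

WATCHED = pathlib.Path("C:/Users/me/Documents")   # hypothetical source folder
BACKUP = pathlib.Path("D:/ContinuousBackup")      # hypothetical backup folder

class VersionedCopy(FileSystemEventHandler):
    def on_modified(self, event):
        # Ignore folder events; copy each changed file to a timestamped name.
        if event.is_directory:
            return
        src = pathlib.Path(event.src_path)
        rel = src.relative_to(WATCHED)
        stamp = time.strftime("%Y%m%d-%H%M%S")
        dest = BACKUP / rel.parent / f"{src.stem}.{stamp}{src.suffix}"
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)        # every save lands as a new, dated copy

observer = Observer()
observer.schedule(VersionedCopy(), str(WATCHED), recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()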

5. Participating in this thread and facing my current problems with SP and my 7-partition 'plain vanilla' new system, I realize that, in order to avoid error messages, I need to schedule two separate backups.

a. A non-incremental weekly backup of the entire hard drive, including all the invisible partitions.

b. An incremental hourly backup of the only partition (i.e. 'c') which actually has any changed data.

I have not done this yet, so, with my current schedule, SP will report success for my weekly full backups but will report failure for the incremental backups. It is important to note that the incremental backups have not actually failed. If I go into the 'details' tab, I will discover that the incremental backup of my 'c' volume - - which is the only volume on which any data has changed - - has been successful. The parts of the job that reportedly 'fail' are some of the hidden partitions, and the error message is always the same: incrementals not supported on this volume.

I hope this has been helpful. Showing is better than telling, so I suggest downloading a trial copy from the storagecraft website.

Again, this only follows my own experience. I started using SP about five or six years ago, on v. 3, after reading Ed Mendelson's review in PC Mag. Ed is not just any magazine reviewer. This polymath is also Trilling Professor at Columbia and Auden's executor and, of course, a notorious WP/DOS maven. I believe he is the most knowledgeable and incorruptible of all the computer magazine writers. https://en.wikipedia.org/wiki/Edward_Mendelson

At 18/04/2014 03:22, you wrote:
Bill,

This sounds great. But I'm confused about one thing. If the image file is bootable, what are you doing with the CD? You say, "in case of disaster." Meaning what? That your regular hard drive won't boot? I don't fully trust booting from a CD (I have had problems: the Lenovo Thinkpad I have doesn't have an internal CD drive, so I have to use an external USB CD drive and change the boot order, but it doesn't always work--something about the unreliability of non-powered USB CD drives, I think).

So why can't you boot from the bootable image? And how specifically would you go from invoking SP on another computer to getting a drive to insert in the now defunct computer?

Thanks so much,
Harry

Harry, I am totally with you in the idea that the only Windows backup worth having is a complete bootable one. However, I do it differently: automated and in software. I use ShadowProtect. This creates a bootable image file (I split it into 640MB segments) that is incrementable and automatable (I do a full backup every week; an incremental every 3 hours).

So how do you get a bootable drive out of this? In case of disaster, you boot up with a ShadowProtect CD (or simply invoke ShadowProtect on another computer) and restore your image to your target hard drive.

The result will be a perfect duplicate.

I have done this several times with complete success. ShadowProtect is the only such system that has ever worked for me. (Acronis is not a patch on it.) I have complete confidence in it.

Are there any gotchas? I would say that you have to watch the numerous hidden partitions (more and more!) that modern computers are beginning to have. (For example, there are seven partitions on my plain vanilla Dell XPS 15 late 2013. Why so many? Heaven knows.) Once or twice, I have lost one of these partitions, but it has not affected me in any visible way.

SSDs are great but they can benefit from maintenance and they do in the end fail.

Why not use both? Continue to use the hardware solution but only once every couple of months? Meanwhile use the software solution regularly. You will then have the convenience of up-to-the-minute backups (as long as the backup drive is connected of course).

These images are complete, total duplicates of your hard drive state. There is no need to worry about losing anything of value.

NB: This is just one user's experience over the past few years.


At 17/04/2014 22:11, you wrote:
I have a technical question that maybe one or more of you would be so kind as to help with.

I've been bitten by disaster too many times not to be very concerned with backup. I have now what is the ideal solution, except for one flaw: I take the SSD drive out of my Lenovo Thinkpad (fairly easy to do, but requires unscrewing one screw), and put it with the backup drive into a toaster-like drive-duplicator from Aluratek that duplicates the SSD, sector by sector, onto the backup drive. The result is a completely substitutable, bootable dupe of my SSD (which I then replace in my Thinkpad).

The only problem is that I can't do this, of course, as a scheduled task.

I am doing physical drive duplication because via software, you can't produce a bootable drive. But is making a clone image good enough? I have tried Acronis, EaseUS, and Carbonite for making "images," but they aren't bootable. As I understand it (through a glass, darkly), you boot your system some other way, then "restore" the image. It's all smoke and mirrors to me. I don't trust "booting some other way," even though the Thinkpad has a system recovery partition on the main drive (i.e., my SSD). So, am I being a scaredy-cat? Should I rely on images and just "get over it" re my bafflement at what the restore process is? Would the end result be not just the return of my data files but of all my OS settings, including the registry?

A final thought: is the image, like a virtual machine, just one file that you only need a running computer to activate?

Thanks for the hand-holding.