On Feb 22, 2008, at 2:52 AM, Gregg Gorrie wrote:

> Let's say you had a single 12 GB file that held the complete contents of a
> one hour DV tape. One flaw in that data file and everything is gone.

That's true, but it shouldn't be. It has always bothered me that we just accept that any file-read failure results in complete file loss. The only reason it's a complete loss is that the software just isn't written to work around the error. There's absolutely no reason computer software shouldn't just show garbage for the bits that don't read properly (whether it's video or text) instead of assuming a total loss.

As an example, I've had Stuffit Expander unarchive an archive containing dozens of files and hit an error after retrieving the bulk of them. Instead of telling me the archive could not be fully retrieved, it erases all the files it has successfully unarchived and then throws up the error. Fortunately, other apps aren't so brain-dead in their approach and leave the files that could be retrieved in place.

A single read error shouldn't mean a complete failure to read any part of the file, given a good OS/software combination.

/rant

-Mike
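
P.S. For the curious, here's a minimal sketch in Python of the kind of thing I mean (the helper name read_with_holes and the block size are just my own choices for illustration): read the file block by block, and when the OS refuses to hand a block over, substitute zeros and keep going, so one bad sector costs you one block, not the whole file.

    import os

    BLOCK_SIZE = 4096  # recovery granularity; an unreadable block loses at most this much

    def read_with_holes(path):
        """Read a file block by block; substitute zero bytes for any block
        that fails with an I/O error instead of giving up on the whole file."""
        size = os.path.getsize(path)
        data = bytearray()
        bad = []  # offsets of blocks that could not be read
        with open(path, "rb", buffering=0) as f:
            offset = 0
            while offset < size:
                want = min(BLOCK_SIZE, size - offset)
                try:
                    f.seek(offset)
                    chunk = f.read(want)
                    if not chunk:           # unexpected EOF counts as unreadable
                        raise OSError
                    data += chunk
                    offset += len(chunk)
                except OSError:
                    data += b"\x00" * want  # the "garbage" placeholder
                    bad.append(offset)
                    offset += want          # skip past the bad block, keep going
        return bytes(data), bad

Whether you fill the hole with zeros, a gray video frame, or a replacement character is an app-level decision; the point is the same either way: degrade, don't destroy.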