Here is a better link with the originals and a link to katrillions of raw images.
And that's your answer: a filesystem like FAT32 or ISOFS that's likely to still be implemented in future OSes, plus recovery files that let you rebuild anything that suffers from bit rot.
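A minimal sketch of the recovery-file idea, assuming one XOR parity block over fixed-size blocks plus per-block hashes to locate the damage; that is enough to rebuild any single rotted block. (Real archival tools such as PAR2 use Reed-Solomon codes, which can repair several blocks at once; BLOCK_SIZE, make_recovery, and repair here are invented for the example.)

    import hashlib

    BLOCK_SIZE = 4096  # hypothetical block size for this sketch

    def make_recovery(data):
        """Split data into blocks; return the blocks, per-block hashes,
        and one XOR parity block."""
        blocks = [data[i:i + BLOCK_SIZE].ljust(BLOCK_SIZE, b"\0")
                  for i in range(0, len(data), BLOCK_SIZE)]
        hashes = [hashlib.sha256(b).digest() for b in blocks]
        parity = bytes(BLOCK_SIZE)
        for b in blocks:
            parity = bytes(x ^ y for x, y in zip(parity, b))
        return blocks, hashes, parity

    def repair(blocks, hashes, parity):
        """Rebuild the one block whose hash no longer matches by XORing
        the parity block with every healthy block."""
        bad = [i for i, b in enumerate(blocks)
               if hashlib.sha256(b).digest() != hashes[i]]
        if len(bad) != 1:
            raise ValueError("can rebuild exactly one bad block, found %d" % len(bad))
        rebuilt = parity
        for i, b in enumerate(blocks):
            if i != bad[0]:
                rebuilt = bytes(x ^ y for x, y in zip(rebuilt, b))
        blocks[bad[0]] = rebuilt
        return blocks

Store the hashes and the parity block next to the data on the FAT32/ISOFS medium, and decades from now a few lines in whatever language is current can verify and repair the archive.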
That does NOT eliminate the need for good code design. Actually, a good design phase is absolutely crucial to this approach, since otherwise your juniors have to do the designing. That would be
... let's say sub-optimal.
Even more important than good code design is good data format design. No amount of good code can compensate for the defects in a bad format.
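For instance, a format whose header leads with a magic number, an explicit version, and a payload checksum stays identifiable, verifiable, and evolvable long after the code that wrote it is gone. A hypothetical sketch (the EXFM magic and the field layout are invented for the example):

    import struct
    import zlib

    MAGIC = b"EXFM"                    # hypothetical format identifier
    HEADER = struct.Struct(">4sHII")   # magic, version, payload length, CRC32

    def pack_record(version, payload):
        """Prefix the payload so any future reader can identify the
        format, pick the right parser for the version, and detect rot."""
        return HEADER.pack(MAGIC, version, len(payload),
                           zlib.crc32(payload)) + payload

    def unpack_record(blob):
        magic, version, length, crc = HEADER.unpack_from(blob)
        if magic != MAGIC:
            raise ValueError("not an EXFM record")
        payload = blob[HEADER.size:HEADER.size + length]
        if len(payload) != length or zlib.crc32(payload) != crc:
            raise ValueError("record truncated or corrupted")
        return version, payload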
I mean, why can't we keep unproductive people around? What's the limiting factor?
Without patents we would have trade secrets.
Then outlaw trade secrets by requiring that the schematic diagram, source code, recipe, etc., of anything sold to the public be made public.
What, in your opinion, makes NTFS a pain in your ass?
For example, performance with very large files.
Subject: Re: [Bug-ddrescue] ddrescue. NTFS-3g eating 100%. Solved by switching to ext3
I had ddrescue imaging to a file on an NTFS partition (mounted with ntfs-3g) on a USB drive on an Ubuntu 9.04 LiveCD. It was going slower and slower, although the number of errors was not increasing. I tried all the ddrescue options I could find, but nothing helped; it had been running for 5 days already and was slowing down so much that it would never end. By the time I stopped it, it had copied 112 GB out of 223 GB.
Then I noticed that ntfs-3g was eating 100% CPU. So I created an ext3 partition on the same USB drive, copied my image file and log there, and restarted ddrescue.
Boy! It finished in a couple of hours!
The best book on programming for the layman is "Alice in Wonderland"; but that's because it's the best book on anything for the layman.