One can whine and wax poetic all one wants, but since we don't have a good archival format, the practical solution today is continual refresh of data: periodically copying data onto fresh, technologically up-to-date media. It's not sexy, but it does address three of the four points at the end of the linked piece (end-to-end data integrity, format migration, and secondary media formats). The unaddressed point, access audit trails, makes no sense given the premise stated at the beginning of the piece that "No matter what anyone tells you, there is data that does not need to be on primary storage".
Yes, this is expensive. Yes, it would be nicer (cheaper) if a one-time single format could address the archive problem.
P.S. There is also this gem from the piece:
creation of a collision-proof hash
Of course the whole point of a hash is a mapping from a high-cardinality space to a low-cardinality space, and thus collisions are always a possibility. A good hashing function minimizes collisions by distributing the resulting hashes uniformly, but once the number of distinct source documents exceeds the cardinality of the hash space, a collision is guaranteed by the pigeonhole principle, and the birthday bound makes one likely far sooner, after roughly the square root of that many documents.
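This is easy to see empirically. The sketch below uses a deliberately tiny hash (SHA-256 truncated to 16 bits, so a space of 65,536 values, purely for illustration, not any real-world scheme) and feeds it sequentially numbered synthetic "documents" until two of them collide. Pigeonhole guarantees a collision within 65,537 inputs; the birthday bound predicts one after only a few hundred.

```python
import hashlib

def tiny_hash(data: bytes) -> int:
    # Truncate SHA-256 to its first 2 bytes: a 16-bit, 65,536-value hash space.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

def find_collision():
    # Hash synthetic documents until two distinct inputs share a hash.
    seen = {}  # hash value -> first input that produced it
    i = 0
    while True:
        doc = f"document-{i}".encode()
        h = tiny_hash(doc)
        if h in seen:
            return seen[h], doc, i + 1  # colliding pair, inputs consumed
        seen[h] = doc
        i += 1

first, second, tried = find_collision()
print(f"collision after {tried} inputs: {first!r} vs {second!r}")
```

The same logic applies unchanged to a 256-bit hash; the numbers just grow to the point where a collision is astronomically unlikely, not impossible.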