To me, an archival system is write-once because, from personal experience, I've lost as much stuff I wanted to keep by accidentally overwriting it as by any other means. The main factor is pure longevity, of course, but other factors are ruggedness (could it withstand moisture? a small fire? getting knocked off a shelf in an earthquake?), compactness (DVDs and tape are a lot better than punched cards or floppy disks), and cost. I'd say it doesn't have to be particularly fast, but it should allow random access, which rules out tape.
The way the medium is written doesn't have to be the way it's read. Trying to think how I might do it if I had the scientific/engineering chops, I conceptually start with old-fashioned photographic film. The negative is exposed to light when the picture is taken, and it remains very fragile until it gets chemically fixed. After that it can safely be 'read' (exposed to light) while an indefinite number of positives are made. In a hypothetical computer data archiving system, the fixing operation could, for example, be a chemical reaction triggered immediately after, or while, the data is set; the trigger might be mechanical, or heat, UV radiation, a magnetic/electric charge, or something exotic I haven't thought of. Meanwhile, some other effect (mechanical, electrical, photonic, ...) does the actual writing: causing chemicals to react, as in photography; making tiny nanoparticles accumulate in a region or disperse; or perhaps just rotating molecules/particles slightly one way or the other with a magnetic field or polarised light. What matters is that the end state is stable and non-destructively detectable.
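Purely as a software analogy (all the names here are made up, nothing to do with any real storage API), the write-then-fix lifecycle I'm imagining looks something like this:

```python
# Toy model of a write-once cell: mutable while "unfixed",
# then permanently read-only after a one-way fix() step,
# like a photographic negative before and after chemical fixing.

class ArchivalCell:
    def __init__(self):
        self._value = None
        self._fixed = False  # False = fragile, like an undeveloped negative

    def write(self, value):
        if self._fixed:
            raise PermissionError("cell is fixed; it can never be rewritten")
        self._value = value  # fragile state until fixed

    def fix(self):
        # one-way transition: stands in for the heat/UV/chemical trigger
        self._fixed = True

    def read(self):
        # non-destructive read, allowed any number of times
        return self._value

cell = ArchivalCell()
cell.write("important data")
cell.fix()
print(cell.read())  # reading still works after fixing; writing no longer does
```

The point of the model is just that "write" and "fix" are separate operations, and only the fixed state needs to be durable.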
It doesn't really seem like it should be all that hard to do, so what's the problem? Not a big enough market? Not glamorous enough? Are the current solutions just considered good enough? Or is it actually a much tougher problem than I imagine?