Solutions that depend upon space travel etc. seem both expensive and contingent on future technology not somehow making recovery too inexpensive. Ditto other high-tech solutions.
I have a notion of a different strategy, but cannot figure out all the necessary details. Suppose we could devise a strong encryption scheme that is unlikely to be broken within the time period of interest. (This is uncertain and questionable!) That encryption should be arranged to depend upon a _long_ key; assume for discussion a concatenation of a large number of numbers that _cannot_ be known before the target date. How do you define, years in advance, a large set of numbers that will magically appear at some specific future time? Here are two suggestions of indeterminate brittleness (where "brittleness" means the probability that the depended-upon machinery will no longer exist).
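To make the shape of the idea concrete, here is a minimal Python sketch of the key-derivation step. SHA-256 stands in for a real KDF, and the sample numbers are hypothetical; the point is only that the key is a deterministic function of values nobody can know in advance, and that the joint entropy of thousands of individually guessable readings is what supplies the strength.

    import hashlib

    def derive_key(future_numbers):
        # Canonical serialization matters: the encryptor and the vault
        # must encode the numbers identically or the keys won't match.
        blob = ",".join(str(n) for n in future_numbers).encode("utf-8")
        # SHA-256 as a stand-in KDF: condenses the long concatenation
        # into a fixed-size key for a conventional symmetric cipher.
        return hashlib.sha256(blob).digest()

    # Hypothetical: four max/min readings. A real key needs thousands,
    # since each reading carries only a few bits of entropy.
    key = derive_key([71, 48, 85, 62])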
Pick some large number of U.S. and world cities -- perhaps in the thousands -- and on the magic date concatenate the ordered set of max/min temperatures reported by some identifiable set of weather-reporting entities. Provide fallbacks (default values?) for cities that no longer exist, are no longer reported on, or whatever. Specify fallbacks for reporting organizations that disappear. The intent of the fallback definition is to provide algorithmic keys regardless of what has happened to the data-generating organizations over time.
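A sketch of how such a dataset definition might be written down, assuming a hypothetical report_max_min(city, source) lookup that returns None once a source is gone. The fallback logic is the important part: it is what keeps the digits computable after reporting entities disappear.

    # Agreed-upon default for any city or source that has vanished;
    # frozen at sealing time so both sides compute the same digits.
    DEFAULT_READING = (0, 0)

    CITY_LIST = [              # ordered and frozen at encryption time
        ("New York", "NWS"),
        ("London", "MetOffice"),
        # ... thousands more entries
    ]

    def weather_digits(report_max_min):
        digits = []
        for city, source in CITY_LIST:
            reading = report_max_min(city, source)  # None if source gone
            if reading is None:
                reading = DEFAULT_READING           # deterministic fallback
            digits.extend(reading)
        return digits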
Obviously, this computation becomes more brittle the longer civilization runs. One would not want to depend upon temperatures reported in the NY Times, because the NYT might not be around in another century, or might not bother reporting weather once that data is more available on whatever has replaced the web. But it ought to be possible, with enough careful thinking, to devise a dataset definition that could be interpreted unambiguously after reasonable lengths of time.
As backup, several such dataset definitions should be defined. For example, use the stock market: the first N digits of the closing prices of a large number of stocks (or their well-defined successors), with defaults to ignore stocks that no longer exist. The stock market might not exist in 100 years, nor might NOAA, but enough well-defined fallbacks could be specified. It doesn't much matter if any particular fallback becomes ill-defined; just fall back to the next one. Nor does it matter much if this cascade through different collections of time-dependent data branches or requires expensive multiple tries. The principle is that legitimate decryption remains much, much less expensive than a brute-force attack.
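One way to express the fallback chain, as a sketch: each dataset definition is a function that either yields candidate digit sequences or signals that its underlying institution is gone, and the vault simply walks down the ordered list. DataSourceGone and the definitions themselves are hypothetical names, not any real API.

    class DataSourceGone(Exception):
        """The institution behind a dataset no longer exists."""

    def candidate_digit_sets(dataset_definitions):
        # Walk the ordered fallback chain: weather, then stocks, then
        # whatever else was specified at sealing time.
        for definition in dataset_definitions:
            try:
                yield from definition()   # may yield several readings
            except DataSourceGone:
                continue                  # this source died; try the next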
So, on the target release date, the vault machine goes out on the internet (or is "manually" fed the necessary set of numbers, since whatever has replaced the internet won't be accessible to even 25-year-old systems), and if the thousands of collected digits reproduce the key, the payload decrypts. It is almost certain that any data-disambiguation algorithm will become ambiguous over time. But if the ambiguities don't branch into too many separate paths, each can be followed to see if any one works. Assume that processor time is very, very inexpensive.
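A sketch of that branching search, assuming the vault stores a hash of the plaintext at sealing time so it can recognize a successful decryption; decrypt() is a stand-in for whatever symmetric cipher is actually used, and derive_key is the key-derivation function sketched earlier.

    import hashlib
    from itertools import product

    def try_release(ciphertext, plaintext_hash, ambiguous_readings,
                    derive_key, decrypt):
        # Each source may resolve to several plausible values after
        # decades of drift; enumerate every combination. Tractable only
        # if the ambiguities don't branch too widely -- hence the
        # assumption that processor time is very cheap.
        for combo in product(*ambiguous_readings):
            digits = [d for reading in combo for d in reading]
            payload = decrypt(ciphertext, derive_key(digits))
            if hashlib.sha256(payload).hexdigest() == plaintext_hash:
                return payload            # this branch worked
        return None                       # all branches exhausted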
This sort of solution presumes that the vault machine can determine the time, so it can't be tricked into believing the release date has arrived. Some sort of high-capacity power backup and wipe-on-intrusion machinery is required. Technical details are left to my SlashDot colleagues. Identifying enough likely-to-survive data sources over 25, 50, or 100 years is a very interesting techno-sociological problem!