I'd come up with some way of hinting at and explaining the encoding. Here's an idea:
The surface starts with a visible circle, 1 mm in diameter. The next circle is a bit smaller, the next smaller still, and so on until the size of a single data bit is reached. This would draw somebody examining the device into trying to see where the detail ends.
Next to this there's a visible, etched ASCII table, with the binary representation for each letter, and an example text that's unlikely to be lost to time, with its binary version.
At the real bit size, there is a progression: 0, 1, 10, 11... This illustrates how data is encoded. At this point, the etched alphabet should make sense.
Next there is a diagram showing how the data is organized in blocks.
Then there's a diagram highlighting the location of error correction data, and the way it's calculated.
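The post doesn't pick a particular error-correcting code; as one concrete possibility the diagram could illustrate, here is a Hamming(7,4) sketch, which protects every 4 data bits with 3 parity bits and can correct any single flipped bit in the block:

```python
def hamming74_encode(d):
    # d = [d1, d2, d3, d4]; parity bits sit at positions 1, 2, 4.
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    # Recompute the parities; the syndrome is the 1-based position
    # of a single flipped bit (0 means no error detected).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1  # correct the flipped bit
    return [c[2], c[4], c[5], c[6]]
```

The appeal for this purpose is that the parity rules are simple enough to state in a small diagram, yet the correction procedure is fully mechanical.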
Then there are more diagrams of the logical structure -- a simple filesystem, maybe just a tar file, with one file after another.
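A tar-like "one file after another" layout could be even simpler than tar itself. Here is a minimal sketch (the length-prefixed record format and big-endian field sizes are inventions of this example, not a proposal from the post):

```python
import struct

def pack(files):
    # files: list of (name, data) pairs. Each record is a length-prefixed
    # name followed by length-prefixed data, concatenated in order.
    blob = b""
    for name, data in files:
        n = name.encode("ascii")
        blob += struct.pack(">H", len(n)) + n      # 2-byte name length
        blob += struct.pack(">I", len(data)) + data  # 4-byte data length
    return blob

def unpack(blob):
    files, i = [], 0
    while i < len(blob):
        (nlen,) = struct.unpack_from(">H", blob, i); i += 2
        name = blob[i:i + nlen].decode("ascii"); i += nlen
        (dlen,) = struct.unpack_from(">I", blob, i); i += 4
        files.append((name, blob[i:i + dlen])); i += dlen
    return files
```

The point the diagram would make is that the structure is purely sequential: a reader who can decode one record knows how to find the next.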
After all this, there's finally the data. To make it extra obvious, the blocks can be given visible separation, so that the grouping is apparent.
The idea is that you could start by looking at the visible details, get drawn to the hidden ones, and have plenty of clues along the way to figure out what it all means. And all of this could fit on every device with plenty of room to spare for the real data.