Can you explain this? I would have thought the signal would remain digital and at its original sampling rate.
It comes down to which encodings the HDMI device supports. The HDMI spec does support uncompressed stereo PCM audio, but many devices will re-encode the stream as DTS or Dolby Digital before transmission so your receiver gets a multichannel bitstream. A lot of devices do this by default, without user input, if they're connected to a receiver that can handle more audio channels. While this can usually be disabled or reconfigured, a lot of users won't actually think to check.
Case in point -- I'm watching an MKV on my WDTV Live! as I type this. I ripped the DVD myself and know for a fact that the audio track I'm listening to right now is AC3 stereo (though the file also has an English-language 5.1 track). The receiver's surround sound/5.1 light is active, which means the device is upmixing the audio to 5.1 surround before sending it to the receiver.
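If you want to confirm what audio tracks a rip actually contains (rather than trusting what the receiver's lights imply), ffprobe from the FFmpeg suite can list them. A minimal sketch of parsing its CSV output follows; the `sample` string is illustrative, standing in for what `ffprobe -v error -select_streams a -show_entries stream=codec_name,channels:stream_tags=language -of csv=p=0 movie.mkv` might print for a file like the one described above.

```python
# Illustrative sample of ffprobe CSV output for a hypothetical MKV with
# an AC3 stereo track and an AC3 5.1 track (not real program output).
sample = "ac3,2,eng\nac3,6,eng"

tracks = []
for line in sample.splitlines():
    codec, channels, language = line.split(",")
    # Map channel counts to the usual shorthand names.
    layout = {"2": "stereo", "6": "5.1"}.get(channels, channels + "ch")
    tracks.append(f"{codec} {layout} ({language})")
    print(tracks[-1])
```

Comparing the codec and channel count reported here against what the receiver displays is a quick way to tell whether a device in the chain is re-encoding or upmixing.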