If I've understood correctly, HDMI displays support DVI signaling, which they can fall back to when an adapter is used to connect an old, non-HDMI-aware DVI output to an HDMI input.
In HDMI mode, however, the display receives HDMI packets, which can also encapsulate audio. What if the closed-source driver detects that an adapter is in use and switches the DVI port's output from DVI signals to HDMI packets, thus enabling audio support? If it doesn't detect the adapter, it assumes it's connected to an old DVI display and just signals with the plain DVI standard, which the HDMI display still understands fine, but then there's no audio.
How audio works with the open-source driver and a cheap adapter, though, I don't know. Perhaps the open-source driver uses a different method to detect that an HDMI display is connected and that HDMI mode should be used.
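One plausible detection method (and, as far as I know, what HDMI sinks actually do) is advertising HDMI capability through a Vendor-Specific Data Block with the HDMI OUI (00-0C-03) in the CEA-861 extension of the display's EDID; a plain DVI monitor has no such block. The sketch below illustrates that idea only; the function names are my own and this is a simplification, nowhere near real driver code:

```python
# Sketch: choosing between DVI signalling and HDMI packets based on the
# sink's EDID. An HDMI sink advertises HDMI support via a Vendor-Specific
# Data Block (IEEE OUI 00-0C-03) in the CEA-861 extension block of its
# EDID; a DVI-only monitor lacks that block. Illustration only.

def sink_supports_hdmi(edid: bytes) -> bool:
    """Return True if the EDID contains an HDMI Vendor-Specific Data Block."""
    # Byte 126 of the 128-byte base block gives the extension block count.
    n_ext = edid[126] if len(edid) >= 128 else 0
    for i in range(1, n_ext + 1):
        block = edid[i * 128:(i + 1) * 128]
        if len(block) < 4 or block[0] != 0x02:   # 0x02 = CEA-861 extension tag
            continue
        dtd_offset = block[2]                    # data block collection ends here
        pos = 4
        while pos < dtd_offset:
            tag = block[pos] >> 5                # top 3 bits: block type
            length = block[pos] & 0x1F           # low 5 bits: payload length
            payload = block[pos + 1:pos + 1 + length]
            # Tag 3 = Vendor-Specific Data Block; the HDMI OUI 00-0C-03
            # is stored little-endian as bytes 03 0C 00.
            if tag == 3 and payload[:3] == bytes([0x03, 0x0C, 0x00]):
                return True
            pos += 1 + length
    return False

def pick_output_mode(edid: bytes) -> str:
    # Send HDMI packets (and thus audio) only if the sink says it takes them.
    return "HDMI" if sink_supports_hdmi(edid) else "DVI"
```

With a fabricated EDID containing such a block, `pick_output_mode` returns `"HDMI"`; with a bare base block (as from an old DVI monitor), it falls back to `"DVI"`. A cheap passive adapter passes the EDID through untouched, so this kind of check would see the HDMI display on the far side regardless of the adapter.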
As is often the case with these kinds of internet storms, the culprit is not necessarily malice, just perhaps overly cautious engineering.