they sure knew how to engineer a damn solid network.
That's what regulated, cost-oriented prices in a monopoly do: gold-plate everything, spare no expense in the pursuit of perfection, and earn a fixed percentage on it. Nowadays we spend money on advertising instead, because in a competitive market it's much more efficient at recruiting customers than quality is.
In the end, other manufacturers manage to do what Apple doesn't: a small, standard connector with good A/V support. It was Apple's choice, when upgrading its connector, to degrade a working feature, add latency and artifacts, and produce a proprietary connector. You see that as a good thing because cables are useless; I don't.
There are 9 pins in a full-size USB3 connector, and 8 pins in a Lightning connector. But where the Lightning connector has two data pairs, USB3 has one bidirectional pair for legacy traffic and two single-direction pairs for high-speed traffic. HDMI and DisplayPort have 3 pairs (+ 1 differential clock) and 4 pairs respectively.
The real question is the nature of the signal on those pairs. USB2 is 480 Mb/s with a lot of protocol overhead, HDMI is 3.40 Gb/s per pair with only error correction, and USB3 is 5 Gb/s, but still carries (parts of) its inefficient protocol. Depending on what Apple is doing, it could route only the high-speed signaling of USB3 onto the Lightning connector's two pairs, and provide the same performance as a standard USB3 cable.
However, since Apple keeps all information about Lightning under wraps, only insiders can tell. And so far, everything we've seen is quite underwhelming, with USB2-speed data cables, and now this adapter.
And from what we see here, it's markedly worse than the standards-based alternatives (MHL, USB3) that Apple shunned, presumably because those would have prevented Apple from imposing drastic licensing conditions on accessory manufacturers.
With its limited pin count, it's no surprise that the Lightning connector doesn't have the bandwidth to transfer uncompressed video. But it's disappointing that it's so bad at compression, with the MPEG artifacts shown in the article, plus latency from encoding/decoding. On that point, the old connector was better, and micro-USB3 would have had enough bandwidth to avoid the issue completely.
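A quick back-of-envelope check makes the bandwidth point concrete. These are my own standard figures (1080p60 at 24 bits/pixel, published line rates, 8b/10b coding for USB3 and HDMI), not anything known about Lightning's internals:

```python
# Can each link carry uncompressed 1080p60 video?
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24
raw = width * height * fps * bits_per_pixel  # pixel data only, no blanking
print(f"uncompressed 1080p60: {raw / 1e9:.2f} Gb/s")  # ~2.99 Gb/s

links = {
    "USB 2.0 (480 Mb/s raw, before protocol overhead)": 480e6,
    "USB 3.0 (5 Gb/s, after 8b/10b coding)": 5e9 * 8 / 10,     # 4 Gb/s payload
    "HDMI (3 pairs x 3.4 Gb/s, after 8b/10b)": 3 * 3.4e9 * 8 / 10,
}
for name, rate in links.items():
    verdict = "enough" if rate > raw else "too slow"
    print(f"{name}: {verdict}")
```

So even before counting USB2's heavy protocol overhead, a USB2-class link is roughly six times too slow for uncompressed 1080p60, which is why the adapter must compress; a USB3-class link would clear it with room to spare.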
If lots of people who are not you bought them, it wouldn't be an "indie" studio, would it?
Minecraft is a good example of an indie game. It has no publisher, and the game isn't sold at physical retail. It only sold 9,531,112 copies.
The rule was originally designed for movies, by the way, but the French movie rating system is much more relaxed than the games one. For example, the latest James Bond movie got no restriction at all, while it would be PEGI 18 if it were a game. But movie rating boards across Europe use different standards.
In the end, it looks like Nintendo took the most restrictive of those rules and applied them to everyone, as if the Wii U were a TV channel. This will hurt them in more liberal markets. It doesn't help that Nintendo of Europe is headquartered in Germany, which has the most extreme restrictions on video games, and still requires a separate, different, ugly, enormous, unremovable logo on game packaging and game discs. And this is after the PEGI rating board largely standardized on rules very close to the German ones...
They have a pending patent on it, and they call it Forced Statistical FDMA.