In terms of colour theory, nothing stops this from potentially being real. If you expect to hook it up to some random source and get an improvement, though ... good luck. It's not going to happen. Fed an appropriate 10-bit or 12-bit wide-gamut source, however, it's certainly capable of better results.
The input may be 3-colour (RGB), but if it's defined with a wide-gamut space like Adobe RGB, possibly with up to 16 bits of precision per colour channel, then it can represent a huge range of colours. It can do this by defining near-"perfect" primary colours and assuming perfect control over blending of those primaries.
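To make that concrete, here's a minimal Python/numpy sketch of how a colour space is pinned down by nothing more than its primaries' chromaticities and a white point. The xy values are the published Adobe RGB (1998) and sRGB ones; the function derives the standard RGB-to-XYZ conversion matrix from them.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    """Build the 3x3 RGB->XYZ matrix from chromaticity coordinates.

    A colour space is fully defined by where its three primaries and
    its white point sit on the CIE xy chromaticity diagram.
    """
    # Convert xy chromaticities to xyz (z = 1 - x - y), one column per primary.
    xyz = np.array([[x, y, 1.0 - x - y] for x, y in primaries_xy]).T
    # White point in XYZ, normalised so Y (luminance) = 1.
    wx, wy = white_xy
    white_XYZ = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])
    # Scale each primary's column so that R=G=B=1 reproduces the white point.
    scale = np.linalg.solve(xyz, white_XYZ)
    return xyz * scale

# Adobe RGB (1998): same red/blue as sRGB, but a much "greener" green primary.
ADOBE_RGB = rgb_to_xyz_matrix([(0.6400, 0.3300), (0.2100, 0.7100), (0.1500, 0.0600)],
                              (0.3127, 0.3290))   # D65 white
SRGB      = rgb_to_xyz_matrix([(0.6400, 0.3300), (0.3000, 0.6000), (0.1500, 0.0600)],
                              (0.3127, 0.3290))
```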
A regular TV, though also an RGB device, has a very different gamut. That's largely because the primary colours the TV uses aren't as bright/saturated or as "perfect" as those in the Adobe RGB space, but also because it can't blend its colours as well. Most likely it only uses 8 bits per colour channel, so it has a much more limited range of gradations, further forcing the colour space to be narrowed to avoid banding due to imprecision.
The regular TV must "scale" a wide-gamut input signal in a colour space like Adobe RGB to display it on its own more limited panel. It can do this by "chopping off" extreme colours, by scaling the whole lot evenly, or by several other methods that are out of scope here. Point is, they're both RGB devices, but they don't share the same colour space and must convert colours.
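Here's a rough sketch of those first two strategies. The matrices are the standard published D65 ones; the even scale-down is just one naive rule among the many a real TV might use.

```python
import numpy as np

# Published D65 matrices: linear Adobe RGB (1998) -> XYZ -> linear sRGB.
ADOBE_TO_XYZ = np.array([[0.5767309, 0.1855540, 0.1881852],
                         [0.2973769, 0.6273491, 0.0752741],
                         [0.0270343, 0.0706872, 0.9911085]])
XYZ_TO_SRGB  = np.array([[ 3.2404542, -1.5371385, -0.4985314],
                         [-0.9692660,  1.8760108,  0.0415560],
                         [ 0.0556434, -0.2040259,  1.0572252]])

def adobe_to_srgb(rgb, method="clip"):
    """Map a linear Adobe RGB colour into the smaller sRGB gamut."""
    srgb = XYZ_TO_SRGB @ (ADOBE_TO_XYZ @ np.asarray(rgb, float))
    if method == "scale":
        # Scale the whole vector down evenly so the hottest component
        # fits; hues shift less than with a plain clip, but everything
        # gets dimmer/duller.
        srgb = srgb / max(1.0, srgb.max())
    # "Chop off" whatever still falls outside: cheap, but distinct
    # out-of-gamut colours collapse onto the same boundary colour.
    return np.clip(srgb, 0.0, 1.0)

# The most saturated Adobe RGB green has no sRGB equivalent -- the raw
# conversion comes out at roughly (-0.40, 1.00, -0.04), outside the cube.
print(adobe_to_srgb([0.0, 1.0, 0.0]))  # collapses to plain sRGB green
```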
So, if the yellow pixel (another primary) expands the gamut of this new TV, then yes, even though it too only takes an RGB signal, it's in theory better, because it can convert a wide-gamut RGB input to its own RGBY space for display with better fidelity than a TV with the same RGB primaries but no Y channel could achieve.
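Nobody outside the panel vendors knows the exact RGB-to-RGBY mapping they use (it's proprietary and gamut-aware), but a naive illustrative split might look like this; `yellow_take` is a made-up knob, not a real panel parameter:

```python
def rgb_to_rgby(rgb, yellow_take=0.5):
    """Naive RGB -> RGBY split: move some of the energy that red and
    green share onto a dedicated yellow subpixel.

    Purely illustrative -- real panels use proprietary mappings.
    """
    r, g, b = rgb
    y = yellow_take * min(r, g)      # yellow content = what R and G share
    return (r - y, g - y, b, y)      # freed R/G headroom stays usable

print(rgb_to_rgby((1.0, 0.9, 0.1)))  # approx (0.55, 0.45, 0.1, 0.45)
```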
Another device might still be plain RGB, but for each of the red, green, and blue primaries it might have much better (closer to "perfectly red" etc) colour. This device might have an overall wider gamut (i.e. a wider range of colours) than the RGBY device, though it's likely that the RGBY device's gamut would still be capable of better yellows. (If you're struggling to figure out what I mean, google for "CIE diagram RGB CMYK" to get a feel for it, or see the sketch below.)
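If you'd rather see numbers than the diagram, here's a crude sketch using the area of the polygon the primaries span on the CIE xy plane as a stand-in for "range of colours" (it ignores luminance entirely). The sRGB and Adobe RGB chromaticities are the published ones; the yellow primary at (0.45, 0.54) is hypothetical.

```python
def gamut_area(vertices_xy):
    """Shoelace area of the polygon the primaries span on the CIE xy
    diagram. A 3-primary gamut is a triangle; an RGBY panel's gamut
    is a quadrilateral, so pass the four vertices in hull order."""
    n = len(vertices_xy)
    s = 0.0
    for i in range(n):
        x1, y1 = vertices_xy[i]
        x2, y2 = vertices_xy[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

SRGB  = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]                # R, G, B
ADOBE = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]                # R, G, B
RGBY  = [(0.64, 0.33), (0.45, 0.54), (0.30, 0.60), (0.15, 0.06)]  # R, Y, G, B

print(gamut_area(SRGB))   # ~0.112
print(gamut_area(RGBY))   # ~0.122: the extra area is all in the yellows
print(gamut_area(ADOBE))  # ~0.151: larger overall, yet its red-green edge
                          # passes below (0.45, 0.54), so it still can't
                          # reach that particular yellow
```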
Attaining better results through adding a channel and/or having better R, G, B primaries presumes properly colour-managed inputs, though. In reality, video colour management is in a pathetic and dire state: inputs can be in any number of different colour spaces, there's no real device-to-device negotiation of colour spaces, and it's generally a mess. This matters because if you feed a "regular" narrow-gamut source to a TV that's expecting a wide-gamut signal, you'll get a vile array of over-saturated, over-bright, disgusting colour. Since this device would rely on wide-gamut RGB input to have any advantage, it'll need a 10-bit or 12-bit HDMI or DisplayPort input with a source that's capable of providing a wider-gamut signal (say, Blu-ray) and is set up to actually do so, rather than "scaling" the output video gamut down to the expectations of most devices.
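To see why the mismatch looks so bad, here's a sketch of a muted linear-sRGB green being sent, unconverted, to a panel that decodes its numbers as Adobe RGB (standard published D65 matrices again):

```python
import numpy as np

SRGB_TO_XYZ  = np.array([[0.4124564, 0.3575761, 0.1804375],
                         [0.2126729, 0.7151522, 0.0721750],
                         [0.0193339, 0.1191920, 0.9503041]])
ADOBE_TO_XYZ = np.array([[0.5767309, 0.1855540, 0.1881852],
                         [0.2973769, 0.6273491, 0.0752741],
                         [0.0270343, 0.0706872, 0.9911085]])

# The panel renders the raw code values with its own (Adobe RGB)
# primaries; express what actually gets shown back in sRGB terms
# to see how far it drifts from what was intended.
pixel = np.array([0.2, 0.6, 0.2])
shown = np.linalg.solve(SRGB_TO_XYZ, ADOBE_TO_XYZ @ pixel)
print(shown)  # approx [0.041 0.600 0.183]: the red component collapses,
              # so a gentle green turns lurid and over-saturated
```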
The fact that most inputs only support 8 bits per channel (and thus aren't much use for wide-gamut signals, because smooth tones will show banding/striping) really doesn't help.
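A quick sketch of why: quantise a shallow gradient (the kind you get in skies and skin tones) at different bit depths and count how many distinct steps survive. Stretch those same few codes across a wider gamut and each step becomes an even bigger visible jump.

```python
import numpy as np

def quantise(signal, bits):
    """Round a [0,1] signal to the nearest of 2**bits code values."""
    levels = (1 << bits) - 1
    return np.round(signal * levels) / levels

# A smooth ramp covering 15% of the signal range.
ramp = np.linspace(0.30, 0.45, 1000)
for bits in (8, 10, 12):
    steps = len(np.unique(quantise(ramp, bits)))
    print(f"{bits}-bit: {steps} distinct levels across the ramp")
# 8-bit: ~39 visible steps -> banding; 10-bit: ~154; 12-bit: ~615.
```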