The latency problem is of course the most apparent, and thus the most discussed, but there are others.
One I wonder about is what kind of servers they are supposedly using. The problem is that modern games demand a modern GPU to look good; the kind of processing needed can't be done in real time on any sort of reasonable CPU. GPUs also aren't really set up to be shared these days. What I mean is that if you build a system with multiple GPUs and try to run multiple 3D games across them, you are going to find it doesn't really work. That sort of thing is coming (DX11-generation hardware is much better at multi-tasking and the like), but it requires apps to be rewritten for it, and it still isn't there.
So what it comes down to is that to run a modern 3D game, you more or less have to have a desktop system: a box running Windows with a powerful GPU at its disposal, dedicated to running that one game.
Well, that isn't a situation I see working real well for a hosted business model. You'd have a whole bunch of individual desktop machines set up that each load up the game and whatever handles the video encoding.
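To make that concrete, here's a rough sketch (in Python, with made-up stand-in names; the real thing would be native code with a hardware encoder) of the per-frame loop every one of those hosted seats would be stuck running:

```python
import time
import zlib

TARGET_FPS = 30  # a real service would presumably target 30 or 60 fps

def render_frame(game_state):
    # Stand-in for the actual game rendering a frame on a dedicated GPU.
    return bytes(game_state % 256 for _ in range(640 * 360 * 3))

def encode_frame(raw):
    # Stand-in for a real-time video encoder (think H.264 in hardware);
    # zlib is only here so the sketch actually runs.
    return zlib.compress(raw, 1)

def stream_one_seat(frames=3):
    game_state = 0
    for _ in range(frames):
        start = time.monotonic()
        raw = render_frame(game_state)
        packet = encode_frame(raw)  # would go over the wire to the thin client
        game_state += 1
        # Sleep off whatever remains of the frame budget (~33 ms at 30 fps).
        time.sleep(max(0.0, 1 / TARGET_FPS - (time.monotonic() - start)))

stream_one_seat()
```

Note that render_frame is the part you can't fake: one machine, one GPU, one player's game, which is exactly the economics problem.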
If they are claiming they are doing it with "virtualization" then I'm saying they are "lying." As it happens, doing virtualization-related things is a big part of my job, so I'm fairly up on the tech. When it comes to 3D with VMs, there are two things that are true of every technology that supports it:
1) It doesn't work real well. It is on the slow side, and there are bugs of various sorts. It is for sure usable, but nobody is going to mistake it for being 100% good to go, and newer games are what it has the most trouble with.
2) It requires a 3D card on the host. All of the virtualization solutions do 3D by intercepting the guest's 3D calls and translating them into 3D calls on the host (as sketched below). 3D hardware is then needed to do the actual rendering.
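In rough pseudocode form (Python again, and the names are mine, not any vendor's actual API), that translation layer amounts to something like this:

```python
# Sketch of how every current VM 3D stack works, as described above:
# the guest's 3D calls are intercepted, marshalled to the host, and
# replayed against the host's real GPU driver.

HOST_GPU_API = {
    # Host-side handlers backed by actual 3D hardware.
    "set_texture":    lambda args: f"host GPU binds texture {args['id']}",
    "draw_triangles": lambda args: f"host GPU draws {args['count']} triangles",
}

def guest_3d_call(name, args):
    # The guest driver doesn't render anything itself; it just forwards.
    # If the host has no GPU, there is nothing to forward to.
    handler = HOST_GPU_API.get(name)
    if handler is None:
        raise NotImplementedError(f"unsupported guest call: {name}")
    return handler(args)

print(guest_3d_call("set_texture", {"id": 7}))
print(guest_3d_call("draw_triangles", {"count": 1024}))
```

The host-side handlers are where the real GPU sits; what nobody ships is a version of that table backed by pure software that runs a modern game at speed.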
I'm afraid I don't buy that these random guys have more advanced technology than VMware, Sun, Microsoft, and so on. If you could easily virtualize a system and emulate full modern 3D in software, they'd be doing it. Hell, MS would be interested in doing it non-virtualized: it'd be a cool selling point for a new Windows if you didn't need a GPU anymore.
So the only way I see this working is lots and lots of systems with big graphics cards in them. This I do not see as a profitable proposition, even assuming all the rest of it works flawlessly.