As to consistent hardware, you're talking like this is hard or special or something. PC game makers do just fine with variable hardware. Yes, consoles have consistent hardware, but that doesn't mean much. It just means you have one version of the operating system with one set of drivers that are slightly better debugged than what the PC people deal with. So what.
No, really, the memory hierarchy is a crippling difference and the game has to be reworked anyway - though DirectX 12 makes that easier.
As an example, the PlayStation 2 has been a bitch to emulate, because there's some extremely fast eDRAM in there for the GPU to use, and stupidly high fillrate to brute-force graphical effects in wasteful and interesting ways (later hardware such as the Radeon 9700 Pro and PS3 used pixel shaders instead). Only with recent and high-end GPUs can we do the same shit (we're talking hundreds of gigabytes per second of bandwidth needed).
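To put rough numbers on that "hundreds of gigabytes per second" claim, here's a back-of-the-envelope sketch. The ~48 GB/s figure is the commonly cited peak for the PS2's GS eDRAM; the 4x upscale factor is just an illustrative assumption, since emulators typically render at higher internal resolutions and bandwidth demand scales with pixel count:

```python
# Back-of-the-envelope: why PS2 emulation wants huge GPU bandwidth.
# Figures are approximate public specs, not measurements.
edram_bandwidth_gb_s = 48          # PS2 GS eDRAM peak bandwidth, ~48 GB/s
upscale_factor = 4                 # hypothetical 4x internal resolution
pixel_scale = upscale_factor ** 2  # pixel count (and bandwidth) grows quadratically
required = edram_bandwidth_gb_s * pixel_scale
print(f"~{required} GB/s to brute-force the GS at 4x resolution")  # ~768 GB/s
```

Even a cruder estimate at native resolution lands near 50 GB/s sustained to one tiny buffer, which is an unusual access pattern for PC GPUs that expect shader math, not raw fillrate.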
So, to run any console game (Xbox One or PS4) that exploits the stupidly low latency and high bandwidth between the CPU and GPU - because they're on the same die - we'll probably need future PC hardware where both the CPU and GPU are on the same chip. Alright, Intel chips with integrated graphics are like that, but the bandwidth, GPU power and features don't match.
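A quick comparison shows the gap. These are approximate peak figures (PCIe 3.0 x16 for a discrete PC GPU, and the PS4's unified GDDR5 pool, which both CPU and GPU address directly), so treat the ratio as an order-of-magnitude sketch:

```python
# Rough CPU<->GPU data path comparison, using approximate peak specs.
pcie3_x16_gb_s = 16    # discrete PC GPU talking to the CPU over PCIe 3.0 x16
ps4_gddr5_gb_s = 176   # PS4 unified GDDR5 bandwidth, shared by CPU and GPU
ratio = ps4_gddr5_gb_s / pcie3_x16_gb_s
print(f"The PS4's shared pool is roughly {ratio:.0f}x the PCIe link")  # ~11x
```

And that's just bandwidth; the latency gap for small CPU-to-GPU round trips over PCIe is even worse, which is exactly what same-die designs sidestep.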