Actually, this is nothing new. Most consoles, at launch, would lose on paper to a counterpart mid-range PC. So what?
1.) There's more to hardware than the listed specs. For instance, the Xbox 360 had several architectural features that made it a poor choice for general computing but excellent for a typical gaming load. One example was a cache architecture tuned for streaming data (e.g., reading from disc or streaming in geometry and textures); a rough sketch of why that matters follows after point 2. Consoles also carry far less overhead than PCs in terms of OS footprint, background services, and the like. Thus, despite the numbers, consoles are typically able to keep up with PCs that outmatch them on specs alone.
2.) Developers can focus their efforts on a single, fixed architecture when developing for consoles. This enables huge performance gains from optimizations that aren't feasible across the heterogeneous PC hardware landscape. (There are exceptions, e.g., Game A ships optimized code paths for GPU B, but that is never a guarantee when you buy a PC game; the second sketch below illustrates the difference.) Furthermore, developers have far longer to discover hardware-specific optimizations, given that a console life cycle typically runs 5 to 10 years. Notice how much better the last titles of a console generation look compared to the launch titles.
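To make point 1 concrete, here's a minimal C sketch (the buffer size is arbitrary, and nothing here is 360-specific) of why a cache tuned for streaming pays off: the same data is summed once sequentially and once with a cache-hostile stride. On hardware whose memory system is built around streaming, the gap between these two patterns is even wider than on a desktop CPU.

```c
/* Sketch only: illustrates streaming-friendly vs. cache-hostile access
 * patterns in general, not the Xbox 360's actual cache design. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24) /* ~16M floats (64 MB), well beyond any cache */

static double sum_sequential(const float *buf) {
    double s = 0.0;
    for (size_t i = 0; i < N; i++)  /* linear stream: prefetch-friendly */
        s += buf[i];
    return s;
}

static double sum_strided(const float *buf) {
    double s = 0.0;
    const size_t stride = 4096 / sizeof(float); /* big jumps between reads */
    for (size_t j = 0; j < stride; j++)
        for (size_t i = j; i < N; i += stride)  /* defeats the prefetcher */
            s += buf[i];
    return s;
}

int main(void) {
    float *buf = malloc(N * sizeof *buf);
    if (!buf) return 1;
    for (size_t i = 0; i < N; i++) buf[i] = 1.0f;

    clock_t t0 = clock();
    double a = sum_sequential(buf);
    clock_t t1 = clock();
    double b = sum_strided(buf);
    clock_t t2 = clock();

    /* Same total work, same data; only the access pattern differs. */
    printf("sequential: %.3fs (sum=%g)\n", (double)(t1 - t0) / CLOCKS_PER_SEC, a);
    printf("strided:    %.3fs (sum=%g)\n", (double)(t2 - t1) / CLOCKS_PER_SEC, b);
    free(buf);
    return 0;
}
```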
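And for point 2, a sketch of what "one fixed architecture" buys a developer. The function and the choice of SSE are illustrative, not taken from any real engine: on PC, a vectorized path has to hide behind a compile-time or runtime capability check because you don't know the buyer's CPU; a console developer knows the exact chip and can ship the tuned path unconditionally.

```c
/* Hypothetical example: scaling a vertex buffer. On a console you would
 * write directly against the known CPU's vector unit; on PC you hedge. */
#include <stddef.h>

#if defined(__SSE__)
#include <xmmintrin.h>
static void scale_sse(float *v, size_t n, float k) {
    __m128 kk = _mm_set1_ps(k);
    size_t i = 0;
    for (; i + 4 <= n; i += 4)  /* 4 floats per instruction */
        _mm_storeu_ps(v + i, _mm_mul_ps(_mm_loadu_ps(v + i), kk));
    for (; i < n; i++)          /* scalar tail */
        v[i] *= k;
}
#endif

static void scale_scalar(float *v, size_t n, float k) {
    for (size_t i = 0; i < n; i++) v[i] *= k;
}

void scale(float *v, size_t n, float k) {
#if defined(__SSE__)
    scale_sse(v, n, k);    /* PC: only if the build guarantees SSE */
#else
    scale_scalar(v, n, k); /* fallback for unknown hardware */
#endif
}
```

On a console there is no `#else` branch to write, test, or maintain, and the tuned path can go much further (exact cache-line sizes, known core counts) because every unit in the field is identical.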
Finally, $300 to $400 doesn't buy you much of a gaming rig. I know; I just tried to build one on that budget.