Larrabee isn't exclusively for ray-tracing, though; rather, Intel's goal is to bring back the flexibility of software rendering -- on hardware that is actually up to the task. The initially planned 16-core Larrabee has better-than-GPU bitwise logic and branch handling, a 16-wide FP32 vector unit in every core (FP64 at half rate), and a separate high-performance texture sampler block. While it's a good fit for ray-tracing, you are also quite free to implement deferred, region-based rasterising of shaded and textured polygons, sparse voxel octrees (St. Carmack), Renderman-style micropolys with unlimited shader programs, teapots as the geometry primitive with fractals as the only texture format, solid color vector graphics with 512x supersampling -- whatever you want for your game engine. It's become somewhat of a consensus lately that ray-tracing has scaling and performance challenges of its own; it's not the unquestionable Holy Grail it's been held up as. (Not that you implied that.)
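To make the "it's all just software" point concrete, here's a toy C++ sketch -- all names and structure are my own invention, nothing Intel-specific. The shading stage is an ordinary function you can swap out at will, and pixels are shaded in groups of 16 to mirror the VPU width; plain arrays stand in for vector registers.

#include <array>
#include <cstdio>
#include <functional>

constexpr int kLanes = 16;                 // mirrors Larrabee's 16-lane VPU
using Vec = std::array<float, kLanes>;     // one value per SIMD lane
// Any function with this signature can act as a "pixel shader".
using Shader = std::function<Vec(const Vec& x, const Vec& y)>;

// Shade one 16-pixel span of a scanline with whatever shader you like.
void ShadeSpan(float* dst, int x0, int y, const Shader& shader) {
    Vec xs, ys;
    for (int i = 0; i < kLanes; ++i) {     // fill the lanes with pixel coords
        xs[i] = float(x0 + i);
        ys[i] = float(y);
    }
    Vec out = shader(xs, ys);              // conceptually a few vector ops
    for (int i = 0; i < kLanes; ++i) dst[i] = out[i];
}

int main() {
    float row[kLanes];
    // Swap in any shading logic: a checkerboard here, but it could just
    // as well be a ray-march, a micropolygon shader, or a fractal.
    Shader checker = [](const Vec& x, const Vec& y) {
        Vec c;
        for (int i = 0; i < kLanes; ++i)
            c[i] = ((int(x[i]) / 8 + int(y[i]) / 8) % 2) ? 1.0f : 0.0f;
        return c;
    };
    ShadeSpan(row, 0, 3, checker);
    for (float v : row) std::printf("%.0f ", v);
    std::printf("\n");
}

On real Larrabee each of those lane-loops would compile down to a single vector instruction and texture fetches would go to the sampler block, but the point stands: everything above the sampler is replaceable code, not fixed-function silicon.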
Intel hasn't communicated a narrow-minded agenda, and arguably their all-star Larrabee research/software team is too good for that anyway.
Agreed about Intel's dominant fabbing edge. However, while Nvidia is sailing troubled seas right now, ATI is on a roll (despite AMD).
Off on a tangent: How I wish Intel had used their PowerVR license to implement Series 5 as their integrated graphics instead of saying "NIH!" and burdening the world with the hopeless GMA series.
BTW, Larrabee might be good for the PS4 CPU as well, but Sony has too much invested in Cell (now with the enhanced-FP64 versions and all) and the dev tools for it. So Cell + Larry is going to be an interesting hybrid if it happens. :-)