Intel Reveals More Larrabee Architecture Details

Ninjakicks writes "Intel is presenting a paper at the SIGGRAPH 2008 industry conference in Los Angeles on Aug. 12 that describes features and capabilities of its first-ever forthcoming many-core architecture, codenamed Larrabee. Details unveiled in the SIGGRAPH paper include a new approach to the software rendering 3-D pipeline, a many-core programming model and performance analysis for several applications. Initial product implementations of the Larrabee architecture will target discrete graphics applications, support DirectX and OpenGL, and run existing games and programs. Additionally, a broad potential range of highly parallel applications including scientific and engineering software will benefit from the Larrabee native C/C++ programming model."
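The most concrete technical claim in the summary is the native C/C++ programming model. As a purely hypothetical sketch of what that could mean in practice (nothing below comes from Intel's paper; the kernel, names, and threading scheme are invented for illustration), here is an ordinary multi-threaded C++ rendering loop of the kind such a model would allow:

    // Hypothetical illustration only, not Intel's API. This just shows a
    // software rendering loop written as plain C++ and spread across cores,
    // the style of programming a "native C/C++ model" would permit.
    #include <cstdint>
    #include <thread>
    #include <vector>

    // Trivial per-pixel kernel: ordinary C++, no shader language required.
    static uint32_t shade(uint32_t x, uint32_t y) {
        return (x ^ y) | 0xFF000000u;  // placeholder pattern
    }

    int main() {
        const uint32_t width = 1024, height = 1024;
        std::vector<uint32_t> framebuffer(width * height);

        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 1;  // fallback if the count is unknown

        std::vector<std::thread> workers;
        for (unsigned c = 0; c < cores; ++c) {
            workers.emplace_back([&, c] {
                // Each core takes an interleaved slice of scanlines.
                for (uint32_t y = c; y < height; y += cores)
                    for (uint32_t x = 0; x < width; ++x)
                        framebuffer[y * width + x] = shade(x, y);
            });
        }
        for (auto& w : workers) w.join();
        return 0;
    }

The appeal of such a model, if it holds up, is that the entire pipeline stays ordinary, debuggable CPU code rather than a fixed-function black box.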
This discussion has been archived. No new comments can be posted.

  • Good old SIGGRAPH (Score:5, Insightful)

    by Gothmolly ( 148874 ) on Monday August 04, 2008 @08:58AM (#24465189)

    With the supposed death of Usenet, the closing of PARC, and the general Facebookification of the Internet, it's nice to see a bunch of nerds get together and geek out simply for the sake of it.

  • by yoinkityboinkity ( 957937 ) on Monday August 04, 2008 @09:03AM (#24465249)
    With more and more emphasis shifting toward GPUs and other specialized processors, I wonder if this is an attempt to fight that trend and make Intel processors able to handle the whole computer again.
  • Re:Good news (Score:5, Insightful)

    by morgan_greywolf ( 835522 ) * on Monday August 04, 2008 @09:04AM (#24465271) Homepage Journal

    This is good news for Mac mini and MacBook users.

    How so? Has Apple announced that it will adopt Larrabee for the Mac Mini or the MacBook? No. All you have are rumors and speculation by MacRumors and Ars Technica. When Apple says they will adopt the Larrabee GPU, then you can say that it is good news for Mac users of any stripe. Until then, it's just Intel news, not Apple news.

  • by shplorb ( 24647 ) on Monday August 04, 2008 @09:17AM (#24465433) Homepage Journal

    Bearing in mind all the other promises Intel has made about its previous graphics offerings, I'm rather inclined to think that once again this will underwhelm, especially considering all the crap that's been coming out of Intel about real-time raytracing. (It's perpetually just around the corner, because rasterisation always gets faster first.)

    That's not to say it isn't an interesting bit of tech, but from what I've seen so far it looks like the x86 version of Cell. Of course, it's a PC part and won't be showing up in any consoles anytime soon, so as a console developer it doesn't really do anything for me. I'm mostly interested in how they'll handle memory bandwidth (a rough back-of-the-envelope sketch follows the comments below).

    I also expect that nVidia will put out something within 12 months that will stomp its guts out.

  • Re:Good news (Score:3, Insightful)

    by morgan_greywolf ( 835522 ) * on Monday August 04, 2008 @09:20AM (#24465459) Homepage Journal
    Is it not also good news for Windows, Linux, and *BSD users? I mean, it's likely that these OSes will also be made to use Larrabee when the technology is released, right? Yet it's not news for any of those platforms, or for Apple users, unless and until those platforms are able to make use of the new GPU technology. Everything else is just speculation, especially for Apple, which might easily decide not to use Larrabee. Since Apple is the only legitimate supplier of Mac OS X hardware, it's definitely not news for Apple users until Apple says it is. OTOH, Windows, Linux, and *BSD users can get their hardware from any supplier.
  • Re:Good news (Score:1, Insightful)

    by Anonymous Coward on Monday August 04, 2008 @09:21AM (#24465473)
    Damn! You could power a 10-20-way ARM or PPC multiprocessor unit with that. And the architecture wouldn't suck CowboyNeal's sweaty balls.
  • by TheRaven64 ( 641858 ) on Monday August 04, 2008 @09:24AM (#24465515) Journal
    I think Larrabee is quite believable. They are quoting performance numbers that make sense and a power consumption of 300W (a rough peak-throughput calculation follows the comments below). The only unbelievable idea is that a component drawing 300W could be a mass-market part in an era when computers that draw over 100W in total are increasingly uncommon, and when handhelds (including mobile phones) are the majority of all computer sales, with laptops coming in second and desktops third.
  • by Futurepower(R) ( 558542 ) on Monday August 04, 2008 @09:50AM (#24465923) Homepage
    Your comment, "... as they're already big on integrated graphics," is true for some values of "big". Intel has been big in integrated graphics the way a dead whale is big on the beach.

    Basically, once you discover what Intel graphics has not been able to do, you buy an ATI or Nvidia graphics card.
  • Re:Good news (Score:1, Insightful)

    by Anonymous Coward on Monday August 04, 2008 @10:07AM (#24466163)
    With the vector floating point (VFP) coprocessor it's not too shabby.
  • by Anonymous Coward on Monday August 04, 2008 @12:04PM (#24467993)

    Once a part actually exists somewhere other than Intel's laboratories, Intel is usually pretty forthcoming with details (to the point that we even have specs on how to program their graphics hardware, which is more than we can say for, e.g., nVidia).

    OTOH, Larrabee is still Labware, and should be thought of as such. Unless you're willing to sign away your life in NDAs, don't expect to know too much yet.

  • by TheRaven64 ( 641858 ) on Monday August 04, 2008 @12:10PM (#24468091) Journal
    This is SIGGRAPH. They've been having the 'ray tracing versus rasterisation' debate for about three decades there. If you put anything definitive into your paper then you are likely to get a reviewer who is in the other camp, and get your paper rejected. If you say 'speeds up all graphics techniques and even some non-graphics ones' then all of your reviewers will be happy.
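On the memory-bandwidth question shplorb raises above, a back-of-the-envelope calculation shows why it is the crux. Every figure below is an assumption chosen for illustration, not a Larrabee specification:

    // Rough bandwidth estimate for a software-rendered frame.
    // Every constant here is an assumption, not an Intel figure.
    #include <cstdio>

    int main() {
        const double width = 1920, height = 1200;  // assumed display mode
        const double fps = 60.0;
        const double bytes_per_pixel = 4.0;        // 32-bit colour
        const double overdraw = 4.0;               // assumed writes per pixel

        const double gb_per_sec =
            width * height * bytes_per_pixel * overdraw * fps / 1e9;
        std::printf("Colour traffic alone: %.1f GB/s\n", gb_per_sec);
        // Roughly 2.2 GB/s for colour writes alone; texture reads, depth
        // tests, and geometry multiply that several times over, which is
        // why a software pipeline lives or dies by its caches and tiling.
        return 0;
    }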
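And on TheRaven64's point about performance numbers that make sense: peak-throughput figures for a many-core part are usually just cores times SIMD width times ops per cycle times clock. The inputs below are illustrative guesses, not figures from Intel's paper:

    // Peak throughput arithmetic with invented inputs. None of these
    // values come from Intel; they only show how such numbers are built.
    #include <cstdio>

    int main() {
        const double cores = 32;        // assumed core count
        const double simd_width = 16;   // assumed 16-wide vector unit
        const double ops_per_lane = 2;  // fused multiply-add = 2 FLOPs
        const double clock_ghz = 2.0;   // assumed clock speed

        const double peak_tflops =
            cores * simd_width * ops_per_lane * clock_ghz / 1000.0;
        std::printf("Peak: %.2f TFLOPS\n", peak_tflops);  // 2.05 here
        // At an assumed 300 W that is about 6.8 GFLOPS/W, which is the
        // kind of figure that would have "made sense" in 2008.
        return 0;
    }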
