Intel Reveals More Larrabee Architecture Details

Ninjakicks writes "Intel is presenting a paper at the SIGGRAPH 2008 industry conference in Los Angeles on Aug. 12 that describes features and capabilities of its forthcoming many-core architecture, codenamed Larrabee, the company's first. Details unveiled in the SIGGRAPH paper include a new software-rendering approach to the 3-D graphics pipeline, a many-core programming model, and performance analysis for several applications. Initial product implementations of the Larrabee architecture will target discrete graphics applications, support DirectX and OpenGL, and run existing games and programs. Additionally, a potentially broad range of highly parallel applications, including scientific and engineering software, will benefit from Larrabee's native C/C++ programming model."
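
The summary doesn't spell out what the native C/C++ programming model looks like in practice. Purely as an illustration of the "many small x86 cores" idea, a data-parallel loop split across hardware threads in standard C++ might look like the sketch below; the function and buffer here are hypothetical, and this is not Larrabee's actual toolchain or API.

```cpp
// Purely illustrative: a data-parallel loop split across hardware threads in
// standard C++.  This is NOT Larrabee's actual toolchain or API, which the
// story does not describe; it only sketches the "many small x86 cores" idea.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Scale a pixel buffer in parallel, one contiguous slice per hardware thread.
void scale_buffer(std::vector<float>& pixels, float gain) {
    const unsigned n_threads = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (pixels.size() + n_threads - 1) / n_threads;

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n_threads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = std::min(pixels.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&pixels, gain, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                pixels[i] *= gain;   // a real many-core kernel would also use wide vector units
        });
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<float> frame(1920 * 1080, 0.5f);
    scale_buffer(frame, 2.0f);   // every core gets a slice of the frame
    return 0;
}
```
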
  • by Churla ( 936633 ) on Monday August 04, 2008 @09:23AM (#24465503)

    I don't think so. With the right architecture (which Intel is trying to put into place), exactly which core on which processor handles a specific task should become less and less relevant.

    What this technology will hopefully provide is a more flexible machine that can task cores for graphics, then re-task them for other needs as they come up. Serious gamers and rendering heads will still have high-end graphics cards, but this would allow more flexibility for "generic" business-build PCs.
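
    A minimal sketch of that re-tasking idea, using nothing more than standard C++ threads and a shared job queue; the names and structure here are hypothetical and illustrative, not any Intel API:

    ```cpp
    // Hypothetical sketch of "re-tasking" a fixed pool of cores: the same
    // worker threads drain whichever batch of jobs they are handed, whether
    // those jobs are rasterization tiles this frame or physics next frame.
    #include <functional>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <utility>
    #include <vector>

    using Job = std::function<void()>;

    class CorePool {
    public:
        explicit CorePool(unsigned n_cores) : n_cores_(n_cores) {}

        // Point the pool at a batch of work and run it to completion.
        void run_batch(std::queue<Job> jobs) {
            std::mutex m;
            std::vector<std::thread> workers;
            for (unsigned i = 0; i < n_cores_; ++i) {
                workers.emplace_back([&] {
                    for (;;) {
                        Job job;
                        {
                            std::lock_guard<std::mutex> lock(m);
                            if (jobs.empty()) return;
                            job = std::move(jobs.front());
                            jobs.pop();
                        }
                        job();
                    }
                });
            }
            for (auto& w : workers) w.join();
        }

    private:
        unsigned n_cores_;
    };

    int main() {
        CorePool pool(4);
        std::queue<Job> frame;
        for (int i = 0; i < 16; ++i)
            frame.push([] { /* render a tile, or step some physics */ });
        pool.run_batch(std::move(frame));   // same cores, whatever the frame needs
        return 0;
    }
    ```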

  • by Kjella ( 173770 ) on Monday August 04, 2008 @09:28AM (#24465587) Homepage

    It almost certainly won't work. In the past, there has been a swing between general and special purpose hardware.

    Except that with unified shaders and earlier variations, the GPU isn't that "special purpose" anymore. It's basically an array of very small processors that individually are fairly general. Sure, they won't be CPUs, but I wouldn't be surprised if Intel could specialize its CPUs and turn them into a competitive GPU. At the very least, good enough to eat a serious chunk of the graphics market from the low end upward, since Intel is already big in integrated graphics.

  • Re:OpenGL (Score:5, Interesting)

    by Ed Avis ( 5917 ) <ed@membled.com> on Monday August 04, 2008 @09:30AM (#24465621) Homepage

    The Quake engine uses OpenGL (or its own software renderer, but I doubt anyone uses that anymore),

    Isn't the point of Larrabee to change that? With umpteen Pentium-compatible cores, each one beefed up with vector-processing instructions, software rendering might become fashionable again.
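
    Purely as an illustration of the kind of vectorized inner loop a software rasterizer leans on, here is a sketch using ordinary 4-wide SSE intrinsics; Larrabee's vector units are reported to be wider, and none of this is Intel's code:

    ```cpp
    // Illustrative only: a SIMD edge-function test, the core of many software
    // rasterizers, written with ordinary SSE intrinsics (4-wide floats).
    #include <xmmintrin.h>   // SSE intrinsics
    #include <cstdio>

    // Evaluate a triangle edge function E(x, y) = a*x + b*y + c for four
    // consecutive pixels (x, x+1, x+2, x+3) at once.  Lanes with E >= 0 are
    // on the inside of this edge.
    __m128 edge_coverage(float a, float b, float c, float x, float y) {
        const __m128 xs = _mm_add_ps(_mm_set1_ps(x),
                                     _mm_set_ps(3.0f, 2.0f, 1.0f, 0.0f));
        const __m128 e  = _mm_add_ps(
            _mm_add_ps(_mm_mul_ps(_mm_set1_ps(a), xs),
                       _mm_mul_ps(_mm_set1_ps(b), _mm_set1_ps(y))),
            _mm_set1_ps(c));
        // A rasterizer calls this for each of the triangle's three edges and
        // ANDs the masks to get coverage for the whole 4-pixel span.
        return _mm_cmpge_ps(e, _mm_setzero_ps());
    }

    int main() {
        // Coverage of pixels (10..13, y = 7) against the edge x - y >= 0.
        const int mask = _mm_movemask_ps(edge_coverage(1.0f, -1.0f, 0.0f, 10.0f, 7.0f));
        std::printf("coverage bits: 0x%x\n", mask);   // all four pixels pass: 0xf
        return 0;
    }
    ```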

  • by Futurepower(R) ( 558542 ) on Monday August 04, 2008 @09:33AM (#24465667) Homepage

    Today at a coder's party we had a discussion about Intel's miserable corporate communications.

    Intel's introduction of "Larrabee" is an example. Where will it be used? Only in high-end gaming computers and graphics workstations? Will Larrabee provide video adapters for mid-range business desktop computers?

    I'm not the only one who thinks Intel has done a terrible job communicating about Larrabee. See the Ars Technica article, Clearing up the confusion over Intel's Larrabee [arstechnica.com]. Quote: "When Intel's Pat Gelsinger finally acknowledged the existence of Larrabee at last week's IDF, he didn't exactly clear up very much about the project. In fact, some of his comments left close Larrabee-watchers more confused about the scope and nature of the project than ever before."

    The Wikipedia entry on Larrabee [wikipedia.org] is somewhat helpful, but I don't see anything that would help me understand the cost of the low-end Larrabee products.

  • Re:Good news (Score:3, Interesting)

    by Yvan256 ( 722131 ) on Monday August 04, 2008 @09:59AM (#24466063) Homepage Journal

    The power brick for my Core 2 Duo Mac mini is rated at around 80 watts, I think, and I'd assume the actual usage is lower than that. Let's say 50-60 watts for the whole computer (CPU, GPU, hard drive, optical drive, RAM, FireWire, USB, etc.).

    If Larrabee takes 150-300 watts, then it's just insane, no matter how many cores it has.

  • by Austerity Empowers ( 669817 ) on Monday August 04, 2008 @10:11AM (#24466219)

    Except for the part where GPUs have 256- to 512-bit-wide, 2 GHz+ dedicated memory interfaces, and Intel processors have... way, way less. Add to that the ability to write tight code on a GPU that uses caching efficiently and doesn't waste a cycle, compared to the near impossibility of writing such code on a host processor you share with an OS and other apps... meh.

    There might be some good stuff that can be done with this architecture, but I am not convinced it's a competitor to GPUs pound for pound. You have to really believe that ray tracing is the future, and that the multi-texturing shenanigans that drive memory bandwidth in GPUs are a thing of the past. That's a big leap of faith. I'd prefer to believe that once they build it, we'll find a great use for it.

    Still, it's nice to see something new happening.
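
    As a rough back-of-the-envelope check on the bandwidth gap described above, using illustrative round numbers rather than any particular product's specifications:

    ```cpp
    // Back-of-the-envelope look at the bandwidth gap described above.  The
    // figures are illustrative round numbers, not the specs of any real
    // GPU or CPU.
    #include <cstdio>

    // Peak bandwidth in GB/s for a bus `width_bits` wide moving
    // `gtransfers_per_s` billion transfers per second.
    double peak_gb_per_s(int width_bits, double gtransfers_per_s) {
        return (width_bits / 8.0) * gtransfers_per_s;
    }

    int main() {
        // A 512-bit GDDR-style interface at an effective 2 GT/s...
        std::printf("GPU-style: %6.1f GB/s\n", peak_gb_per_s(512, 2.0));   // 128.0 GB/s
        // ...versus a 128-bit desktop DDR2-800-style interface.
        std::printf("CPU-style: %6.1f GB/s\n", peak_gb_per_s(128, 0.8));   //  12.8 GB/s
        return 0;
    }
    ```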

  • by ponos ( 122721 ) on Monday August 04, 2008 @10:11AM (#24466221)

    What most people don't seem to realize is that Larrabee is not about winning the 3-D performance crown. Rather, it is an attempt to change the playground: you aren't buying a 3-D card for games, you are buying a "PC accelerator" that can do physics, video, 3-D sound, Dolby decoding/encoding, etc. Instead of just having SSE/MMX on the chip, you now get a completely separate chip. AMD and NVIDIA are already trying to do this with their respective efforts (CUDA, etc.), but Larrabee will be much more programmable and will really pwn for massively parallel tasks. Furthermore, you can plug in as many Larrabees as you want, with no need for SLI/CrossFire. You just add cores/chips the way we now add memory.

    P.

  • Re:Good old SIGGRAPH (Score:3, Interesting)

    by Duncan3 ( 10537 ) on Monday August 04, 2008 @02:34PM (#24470479) Homepage

    Other areas of CS have multiple conferences throughout the year. Graphics has only one, and that's SIGGRAPH. If your paper is not accepted at SIGGRAPH, you are considered to have done nothing worthwhile that year. You could win every special effects award in Hollywood, but no SIGGRAPH paper = no cred.

    That's just how it works.
