
Comment Re:Because when I think graphics, I think intel (Score 1) 288

Larrabee isn't exclusively for ray-tracing though; rather, Intel's goal is to bring back the flexibility of software rendering -- on hardware that is actually up to the task. The initially planned 16-core Larrabee has better-than-GPU bitwise logic and branch handling, a 16-wide FP64 vector unit in every core, and a separate high-performance texture sampler block. While it's a good fit for ray-tracing, you are also quite free to implement deferred region-based rasterising of shaded and textured polygons, sparse voxel octrees (St. Carmack), Renderman-style micropolys with unlimited shader programs, teapots as the geometry primitive with fractals as the only texture format, solid-color vector graphics with 512X supersampling -- whatever you want for your game engine. It's been somewhat of a consensus lately that ray-tracing has scaling and performance challenges of its own; it's not the unquestionable Holy Grail it has been held up to be. (Not that you implied that.)
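To make the "whatever you want" point concrete, here's a minimal sketch of my own (an illustration, not Intel code) of the kind of inner loop a Larrabee-style software pipeline could run: a plain edge-function triangle rasteriser whose pixel loop maps naturally onto a 16-wide vector unit -- evaluate 16 pixels per iteration and turn the inside test into a per-lane mask.

    // Minimal sketch (my own illustration, not Intel code): a scalar edge-function
    // triangle rasteriser of the kind a Larrabee-style software pipeline could run.
    // The inner loop is written so it maps onto a 16-wide vector unit: evaluate
    // 16 pixels' edge functions per iteration and turn the inside test into a mask.
    #include <algorithm>
    #include <cstdio>

    struct Vec2 { float x, y; };

    // Signed area test: positive when p lies to the left of edge a->b.
    static float edge(const Vec2& a, const Vec2& b, const Vec2& p) {
        return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
    }

    static void rasterise(const Vec2& v0, const Vec2& v1, const Vec2& v2,
                          int width, int height, unsigned char* coverage) {
        // Clamp the bounding box to the framebuffer (or, on real hardware, to a tile).
        int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
        int maxX = std::min(width - 1, (int)std::max({v0.x, v1.x, v2.x}));
        int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
        int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

        for (int y = minY; y <= maxY; ++y) {
            for (int x = minX; x <= maxX; ++x) {     // vectorisable: 16 pixels per step
                Vec2 p = { x + 0.5f, y + 0.5f };
                float w0 = edge(v1, v2, p);
                float w1 = edge(v2, v0, p);
                float w2 = edge(v0, v1, p);
                if (w0 >= 0 && w1 >= 0 && w2 >= 0)   // inside test -> per-lane mask
                    coverage[y * width + x] = 255;   // run any shader program here
            }
        }
    }

    int main() {
        const int W = 32, H = 16;
        unsigned char buf[W * H] = {};
        rasterise({2, 2}, {30, 4}, {12, 14}, W, H, buf);
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) putchar(buf[y * W + x] ? '#' : '.');
            putchar('\n');
        }
    }

Swap the inside test and the "shader" line for whatever you like -- ray-box tests, voxel traversal, micropoly shading -- and the same cores run it; that's the flexibility argument in a nutshell.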

Intel hasn't communicated a narrow-minded agenda, and arguably their all-star Larrabee research/software team is too good for that anyway.

Agree about Intel's dominating fabbing edge. However, while Nvidia is sailing troubled seas right now, ATI is on a roll (despite AMD).

Off on a tangent: How I wish Intel had used their PowerVR license to implement Series 5 as their integrated graphics instead of saying "NIH!" and burdening the world with the hopeless GMA series.

BTW, Larrabee might be good for the PS4 CPU as well, but Sony has too much invested in Cell (now with the FP64 versions and all) and the dev tools for it. So Cell + Larry is going to be an interesting hybrid if it happens. :-)

Comment Re:Time to tighten our belts (Score 1) 410

Just an anecdote in support of your post: back in the first oil crisis days, when Chrysler got into deep financial trouble, Lee Iacocca had to fight tooth and nail for months to get Chrysler Congressional backing for a large consolidated loan that effectively saved them.

They paid it back markedly ahead of schedule.

It's odd that good history like that doesn't count for the bankers. (But I don't know if something negative has happened since. I just read Iacocca's autobiography and actually ended up disliking him personally for his "making lots of money is everything" world view, but I can't help admiring many of his accomplishments at Ford and then Chrysler.)

Comment Re:Changes (Score 1) 207

Actually CoreBoot both is a Linux and can run Linux. ;-)

However, I prefer the new name too. Well, maybe "core" is a bit all over the place nowadays, but "CoreBoot" still rolls off the tongue better than "LinuxBIOS" and the name doesn't need to be a description. (Enthusiasts know already, casual users couldn't care less anyway.)

Comment Re:Did Intel graphics improve when I wasn't lookin (Score 1) 158

The sad part is that they have had a PowerVR license for years but have insisted on their abominable "Extreme Graphics" and GMA concoctions for PCs. PowerVR's approach (deferred tile-based rendering) is the most bandwidth-economical out there (and fillrate- and pixel-shading-economical too, though comparatively less so now that ATI and Nvidia have added advanced Z-buffering optimizations over the past few years), so it would have been a perfect fit for integrated graphics. Their Series 5 design was DX9 compliant, no less. Sad that the story stopped at the Series 3 Kyro cards, not counting their triumph in the embedded/mobile scene. Who knows what we might have now in PC northbridges. All in all, it looks like a case of NIH syndrome; the PowerVR tech was certainly good enough.
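For what it's worth, here's a toy sketch (my own illustration of the general idea, not PowerVR's actual pipeline) of why tile-based deferred rendering is so bandwidth-economical: fragments get binned per tile, visibility is resolved entirely in a small on-chip tile buffer, and each visible pixel is shaded and written to external memory exactly once.

    // Toy sketch of tile-based deferred rendering (my own illustration, not
    // PowerVR's pipeline): resolve depth in a tiny "on-chip" tile buffer first,
    // then shade and write only the surviving pixels once.
    #include <cstdio>
    #include <limits>
    #include <vector>

    struct Frag { int x, y; float z; int material; };  // a fragment a triangle covers

    static const int TILE = 4;  // tiny tile for the example; real hardware uses e.g. 32x32

    int main() {
        // Pretend the binning pass put these fragments into one 4x4 tile.
        std::vector<Frag> binned = {
            {1, 1, 0.8f, 1}, {1, 1, 0.3f, 2},          // two surfaces overlap at (1,1)
            {2, 2, 0.5f, 1}, {3, 0, 0.9f, 3},
        };

        float depth[TILE][TILE];   // "on-chip" tile buffer: never leaves the chip
        int   winner[TILE][TILE];
        for (int y = 0; y < TILE; ++y)
            for (int x = 0; x < TILE; ++x) {
                depth[y][x]  = std::numeric_limits<float>::max();
                winner[y][x] = -1;
            }

        // Deferred step: resolve visibility first, shade nothing yet.
        for (const Frag& f : binned)
            if (f.z < depth[f.y][f.x]) {
                depth[f.y][f.x]  = f.z;
                winner[f.y][f.x] = f.material;
            }

        // Only now "shade" and write each visible pixel once to external memory.
        int shaded = 0;
        for (int y = 0; y < TILE; ++y)
            for (int x = 0; x < TILE; ++x)
                if (winner[y][x] != -1) ++shaded;

        printf("fragments binned: %zu, pixels shaded/written: %d\n", binned.size(), shaded);
    }

No external-memory Z traffic and no overdraw on the shaded writes -- that's the property that would have made it such a good fit for bandwidth-starved integrated graphics.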

Comment Re:Never ending chase... (Score 1) 158

TFA (the Intel PDF) points out that ray bundling is problematic whenever a part of the bundle scatters differently from the rest, to the extent that it may be cheaper in the end to avoid bundling, at least with some workloads/situations. It's a good read; it feels quite honest and free of the usual marketing brouhaha.
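Roughly the scattering problem as I understand it (my own toy model, not Intel's code or numbers): a bundle of rays traverses with a shared active mask, so one SIMD step serves the whole packet while the rays agree; once they scatter, the mask thins out, most lanes do wasted work, and finishing the stragglers ray-by-ray can end up cheaper.

    // Toy model of ray-packet divergence (my own sketch, not from the PDF).
    #include <bitset>
    #include <cstdio>

    static const int PACKET = 16;  // e.g. one ray per lane of a 16-wide vector unit

    // Simulate one traversal step: returns the mask of rays that still follow
    // the packet's common path after hitting a scattering surface (toy model).
    static std::bitset<PACKET> scatter(std::bitset<PACKET> active, int step) {
        std::bitset<PACKET> out;
        for (int i = 0; i < PACKET; ++i)
            out[i] = active[i] && (i % (step + 2) == 0);
        return out;
    }

    int main() {
        std::bitset<PACKET> active;
        active.set();                      // all 16 rays start out coherent
        for (int step = 0; step < 4; ++step) {
            // A packet step costs the full SIMD width, however few lanes are live.
            printf("step %d: %2zu/%d lanes active (utilisation %3.0f%%)\n",
                   step, active.count(), PACKET, 100.0 * active.count() / PACKET);
            active = scatter(active, step);
            if (active.count() < PACKET / 4) {   // heuristic: give up on bundling
                printf("packet too sparse -- cheaper to trace the remaining %zu rays individually\n",
                       active.count());
                break;
            }
        }
    }

The crossover point obviously depends on the scene and the shading, which I take to be the PDF's point about it sometimes being cheaper to skip bundling altogether.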

No comment on SSE4 except that I would expect that Intel wanted to accelerate a wide variety of SIMD-suitable workloads; the reason you allude to wasn't immediately obvious. Care to expand on that? :-)

Comment Re:Coming to a disaster near you. (Score 1) 452

I bought a pair of 30GB 75GXPs, and the one from IBM's Ireland factory performed flawlessly. The one from IBM's Hungary factory failed and got replaced with another Hungarian one, which also failed, so the retailer swapped both for a pair of bigger WDs.

It seems the DeathStar problem had something to do with the new assembly line in Hungary. They appear to have fixed the problems; Hitachi drives have been rock solid ever since.

Comment Re:But isn't that the idea? (Score 1) 676

Now try and bring it up with an old keyboard and no Windows key. (I know... No system with a keyboard that old can run Vista, but it is hypothetical, OK?)

No need to be hypothetical there. My desktop system that I built three years ago can run Vista with all bells and whistles very well, but I prefer my old clickety-clicky '94 IBM keyboard to anything else available today. (The tactile feedback is just better than any sub-$100 keyboard I've tried.) So there. :P

Comment Re:It's 2009 (Score 1) 676

But in his blog post, Kohei says he signed the JCA when he started writing the solver.

I don't know if he retracted it later when he moved to Novell, but you seem to be working under the wrong assumption that he refused it outright. :-) The exact sequence of events doesn't come across quite clearly from the conversation, though; I'm certain I'm missing significant details of the big picture.

Another thing: the Summer of Code stunt Sun pulled looks really weird, whatever happened elsewhere. They are rightfully accused of arrogance and disrespect, given that kind of total lack of communication toward a volunteer developer.

Comment Re:128 cores (Score 1) 194

Indeed! The "128 cores" (in reality the shader ALUs that Nvidia calls "Stream Processors") caught my attention too. And this from Computer World. *sigh*

More weirdness in there: "Why two screens? Most people are using two monitors at their desktop. [...]," said Wes Williams, worldwide product marketing manager for ThinkPads.

Uh, I'd think this Lenovo manager has very small values for "most people".
