
Comment: Re:Extrapolate? (Score 1) 78

by hairyfeet (#49635085) Attached to: AMD Outlines Plans For Zen-Based Processors, First Due In 2016

Uhhhh...just FYI, but Intel has come right out and admitted it rigged the benchmarks, so you can trust them about as much as the infamous FX5900 3DMark cheats or ATI's "quack.exe" back in the day.

For the Intel fanboys, I have a simple question: if Intel is REALLY so far ahead, why would they risk the fines and antitrust action by rigging benchmarks? If they are really THAT far ahead, what would be the point of spending all those millions and taking all that risk? Bragging rights? If you have any common sense the answer should be obvious: a company with that large a war chest would NOT do such a thing if it were not necessary. The only reason Intel would take all that risk and go to all that trouble is that their numbers do not justify their prices. When the real numbers come out, like the ones the guys at Phoronix publish, they show the chips are a LOT closer than Intel would lead you to believe, which means their 200%-300% price premiums are not in any way justified and would not exist in a free, non-rigged market!

I don't give a shit if you like AMD, Intel, or fucking SPARC; it really should not matter, because this should PISS YOU OFF. Time and time again we have seen that market rigging benefits nobody BUT the one doing the rigging. It certainly doesn't benefit those who like Intel CPUs, unless you consider it a fricking tithe to your church, because if it weren't a rigged market: 1. the benches would show the chips a lot closer, 2. more consumers would refuse to pay 300%+ for the Intel chips, and 3. the PRICE WOULD GO DOWN. So you should be royally pissed right about now, cuz if you own an Intel chip you paid too much!

Comment: Re:Don't break user space! (Score 1) 287

by Runaway1956 (#49635061) Attached to: Why Was Linux the Kernel That Succeeded?

Monolithic kernel. You have a somewhat valid argument there. There are modules that I just don't use, yet they are routinely compiled into the kernel. Simple solution: compile it yourself, without those modules. Strip the kernel down to exactly what you need, and compile it native to get rid of all the 32-bit support. You're left with a monolithic kernel, of course, but the monolith is much smaller.
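A minimal sketch of that workflow on a mainline source tree (the targets are the standard kbuild ones; the config symbol named below is for x86_64, so adjust for your architecture and distro):

```shell
cd linux/                 # your kernel source tree

# Seed the config from the modules actually loaded right now;
# everything not in use gets dropped from the config.
make localmodconfig

# Prune further by hand: pick your exact processor family under
# "Processor type and features", and disable 32-bit compatibility
# (CONFIG_IA32_EMULATION on x86_64) plus any drivers you never use.
make menuconfig

# Build and install the slimmed-down kernel.
make -j"$(nproc)"
sudo make modules_install install
```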

Comment: Re:To think I once subscribed to this site (Score 1) 227

Fair enough - browse the document here:

I just looked at it again, and I find environmentalists listed. Animal rights activists. Hacktivists. Note that this is merely a "reference aid" - there is other material that accompanied this little handout. You may choose the red pill, or the blue pill. How deep does the rabbit hole go?

Comment: Re:Just in time for the End of the Line (Score 1) 78

by Kjella (#49634317) Attached to: AMD Outlines Plans For Zen-Based Processors, First Due In 2016

None of those other node pitches involved dimensions at which quantum mechanical tunneling was the dominant effect, nor gate thicknesses of one atom. But that's what 10nm is.

Not even close. At the research stage they have made functional 3nm FinFET transistors; whether those can be produced in the billions is doubtful, as it requires every atom to be in the right place, but 10nm still has some margin of error. The end of the road is in sight, though...

Comment: Re:Extrapolate? (Score 3, Interesting) 78

by Kjella (#49634193) Attached to: AMD Outlines Plans For Zen-Based Processors, First Due In 2016

Anyone care to extrapolate from current benchmarks as to how this new processor will compare to Intel's desktop offerings? I would like to see Intel have some competition there.

FX-8350: 2012
"Zen": 2016

The 40% jump is more like 0%, 0%, 0%, 40%.

If you compare a 3770K (the best of 2012) to a 4790K (the best of today) you get a ~15% frequency boost and another ~10% IPC improvement. If the leaked roadmaps are to be believed, Skylake for the desktop is imminent, which will bring a new 14nm process and a refined micro-architecture at the same time, since Broadwell missed its tick for the desktop. So in the same timeframe Intel will have improved 30-40% too.
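The frequency and IPC gains compound rather than add, so a quick back-of-the-envelope check (the percentages are the estimates above, not measured figures):

```python
# Rough estimates for 3770K -> 4790K:
freq_gain = 1.15  # ~15% higher clocks
ipc_gain = 1.10   # ~10% more instructions per clock

# Independent gains multiply:
combined = freq_gain * ipc_gain
print(f"combined speedup: ~{(combined - 1) * 100:.1f}%")  # ~26.5%
```

Skylake's process and micro-architecture changes would stack on top of that, which is how the 30-40% figure comes about.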

Anyway, you asked about AMD and I answered with Intel, but it's a lot easier to get a meaningful answer without getting into the AMD vs Intel flame war. In short, even if AMD comes through on that roadmap they're only back to 2012 levels of competitiveness, and honestly speaking that wasn't exactly great, and AMD wasn't exactly profitable. They're so far behind that you couldn't honestly expect less unless they were giving up on that market completely, which frankly I thought they had. And I wonder how credible this roadmap is; I remember an equally impressive upward curve for Bulldozer...

Comment: Re:Finally a replacement (Score 1) 78

by hairyfeet (#49633997) Attached to: AMD Outlines Plans For Zen-Based Processors, First Due In 2016

Same here, only with the 1035T, and with the 3.2GHz Turbo Core this baby has NO problem playing the latest games on medium settings, and Handbrake hits over 160fps.

So while I might look into one of these if they hit 12 cores or better, right now I'd say grab one of the AMD hexa-cores or octo-cores if you haven't, because once you throw out the benchmarks Intel admits it rigged, you'll find the FX-8s trade blows with chips costing more than double the price, numbers which the GCC-compiled Linux benchmarks attest to.

Comment: Re:Just in time for the End of the Line (Score 1) 78

by Mal-2 (#49633933) Attached to: AMD Outlines Plans For Zen-Based Processors, First Due In 2016

Pundits have been saying that at least since 90nm, you know. Then 65, and again at 45, 32, 22, and now at 14.

And those limits were dodged by some new process: SOI, copper interconnects, what have you. He's not saying we won't see 10 nm, he's just saying there's a good chance it will have to be something other than CMOS.

Comment: Re:Finally a replacement (Score 1) 78

by Mal-2 (#49633925) Attached to: AMD Outlines Plans For Zen-Based Processors, First Due In 2016

Moving up to a 1090T or 1100T Black Edition could be nice. 3.2 or 3.3 out of the box, but 3.8 or 3.9 is almost a given without voltage bumps or exotic cooling. My 1090T was throttling at 3.8 when running SuperPi on the stock cooler, but I have no such problems with a Hyper 212. I would need more cooling to run all day, every day at 4.0.

Granted, 2.8 to 3.8 may not make a lot of real world difference when memory bandwidth comes into play, and as you pointed out, the GPU has a lot to do with it (when it's relevant at all, which it isn't when, say, running six sessions of WinRAR at once).

AMD can hold their own in the mid-high end, though they aren't cranking out the high-margin "extreme" processors. Those are more of a status symbol anyhow, for both manufacturer and consumer, and I doubt that even Intel makes enough of them to be a big chunk of its gross income. That's why they're so damn expensive. Why make them any cheaper if you can't make them rapidly enough as it is?

Comment: Re:Snowball effect (Score 1) 287

by Kjella (#49633767) Attached to: Why Was Linux the Kernel That Succeeded?

It's not a big mystery. Linus released a primitive kernel that worked, at the right time, with the right license, and then diligently kept rolling up contributions and releasing the result. These days he writes very little code himself; almost all he does is manage patches. I'm not sure how much code he wrote in the early days, but I think his diligent application of patches sent to him helped Linux become stable and useful.

He wrote huge parts of it himself, and in 2006 about 2% of it was still code he had written. I can't find how many LOC it had then, but it was 5 million in 2003 and 11 million in 2009, so roughly 8 million. That means in the ballpark of 160,000 lines of code over 15 years, along with managing the whole project. And when that's not enough, he bootstrapped Git, possibly the most widely used source control management system today.
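The arithmetic behind that estimate, using the 2% share and the interpolated ~8 million LOC figures from above:

```python
# Rough figures for the mid-2000s kernel:
total_loc = 8_000_000   # interpolated between 5M (2003) and 11M (2009)
linus_share = 0.02      # ~2% attributed to Linus in 2006

print(f"~{int(total_loc * linus_share):,} lines")  # ~160,000 lines
```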

Now, I've met people who are absolutely brilliant; they're rare. I've met people who truly excel at making everybody pull in the same direction; they're rare too. But I've never met one who is both. He could have been overly possessive and not let anyone else work on his pet project: it's one thing to say you want contributions, it's another thing to mean it in practice. Or he could have been the one pointing out a direction with nobody to do the heavy lifting.

Most of us don't even want to do both. The more I have to rely on others to get something done, the more I realize how much I'd hate it if everything I did was manage other people. Those who want to run the business/organization/project get out of the doer role quickly; those who don't avoid management and settle into some kind of technical guru role; to use a military analogy, more like the special forces than a general. If you find one who both can and wants to do both, you've hit the jackpot.

Comment: Re:What's that ahead? (Score 2) 206

by drinkypoo (#49633113) Attached to: Self-Driving Big Rigs Become a Reality

I was driving in Nevada one dark, moonless night, when out of nowhere came a cow in the middle of the road... I'd like to see how an autonomous vehicle would deal with that.

That's out of nowhere to you, but the computer is going to be able to see in the dark far outside the range of your headlights. Its headlights are going to be a convenience to other drivers, and an IR source for its night vision — which will have automatic gain control far outside the range of your pupils. It'll also likely have radar and lidar so even if it can't see the cow, it'll know it's there.

Comment: Re:One Criterion Missing (Score 1) 404

by Roger W Moore (#49632115) Attached to: No, NASA Did Not Accidentally Invent Warp Drive

New science is not always required if something odd is noticed.

True, but this is a little different from your example. There is no fundamental law of physics saying that you cannot build an instrument large enough to observe distant planets; in the absence of such a restriction, building that instrument comes down to human ingenuity. However, there is a fundamental law of physics which says that momentum is conserved.

As a result this force is either due to some interaction with the surroundings that the experiment has failed to account for, or due to new physics in the form of new particles/interactions or an outright violation of conservation of momentum, which is an extremely fundamental law of physics. There really are no "loopholes" to squeeze through.
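In symbols, the constraint at stake: for a closed system with no external force, Newton's second law gives

```latex
\frac{d\vec{p}}{dt} = \vec{F}_{\mathrm{ext}} = 0
\quad\Rightarrow\quad
\vec{p} = \sum_i m_i \vec{v}_i = \mathrm{const}
```

so any drive that produces net thrust must be pushing on something (expelled propellant, radiated photons, or the surroundings), or else it really is new physics.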

My personal feeling is that it will turn out to be some effect they forgot to account for, although I cannot help but hope that it turns out to be something far more interesting... and that sort of hope is exactly why it is so easy to fool ourselves when doing experiments.
