Comment Re:We're almost at the end with current tech (Score 2) 115

10 years ago, Intel was hinting at a massively parallel future (80 core processor rumored in development at the time)

I think the 80 core processor Intel was developing at the time eventually turned into the Knights Corner, aka Xeon Phi, chip. Intel originally developed this tech for the Larrabee project, which was intended to be a discrete GPU built out of a huge number of x86 cores. The thought was that if you threw enough x86 cores at the problem, even software rendering across all those cores would be fast. As projects like llvmpipe and OpenSWR have shown, given a huge number of x86 cores this isn't as crazy an idea as it initially sounds... but still a little crazy :) Ultimately Intel cancelled that project and decided to use the tech for supercomputing instead of graphics. One result is that Intel retained the "Gen" design, a more traditional GPU architecture, for their graphics core.

Comment Re:If it's not GPL (Score 4, Informative) 158

If it's not GPL'ed, it's not open source. And we all know what abhorrence MS harbors for GPL...

The Open Source Initiative has certified the MIT license as a valid open source license. Look, I'm not a huge MS fan either, but they are using a real OSS license here. Just because MIT isn't copyleft doesn't mean it's not OSS.

Comment Re:Portability (Score 0) 437

Speaking as someone who writes firmware for a living, Rust in its current state will never be a replacement for C. C has one property that sets it apart from every other language higher level than assembly: it is possible to write a C program that does not need *ANY* C run-time library support. Our firmware runs on the bare metal without any OS whatsoever, so running Rust in that firmware would require building all the services the Rust run-time needs (heap, threads, etc.) and porting the Rust run-time to them.
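To make that property concrete, here is a minimal sketch of what fully freestanding C looks like. Everything in it is invented for illustration (the UART0_DR address, the firmware_entry name); a real board would get those from its memory map and linker script. The point is that nothing here calls into libc, so it can be built with no run-time support at all:

```c
#include <stddef.h>  /* freestanding header: types only, no run-time code */
#include <stdint.h>  /* freestanding header: fixed-width integer types */

/* Hypothetical memory-mapped UART data register; the address is made up. */
#define UART0_DR ((volatile uint32_t *)0x09000000u)

/* With no libc available, the firmware supplies its own helpers. */
static size_t str_len(const char *s)
{
    size_t n = 0;
    while (s[n] != '\0')
        n++;
    return n;
}

static void uart_write(const char *s, size_t len)
{
    for (size_t i = 0; i < len; i++)
        *UART0_DR = (uint32_t)s[i];  /* one MMIO write per byte */
}

/* Entry point the reset vector jumps to: no crt0, no main(), no heap. */
void firmware_entry(void)
{
    const char *msg = "boot\n";
    uart_write(msg, str_len(msg));
    for (;;)
        ;  /* nothing to return to */
}
```

Something like `gcc -ffreestanding -nostdlib` builds this with no startup files and no standard library, which is exactly the mode of operation a Rust run-time (as shipped at the time) could not match.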

So unless we want to write the heap, threads, etc. in assembly... C is our only option for bootstrapping Rust. And since we are using C for that part of our code base... why not use it for all of it? The firmware I work on doesn't do much string manipulation or other things that make a higher level language like Rust attractive. Why add all that additional complexity just to support Rust? By the way, anyone writing an operating system kernel is going to run into the same thing with Rust.

It would be possible to bootstrap Rust on top of a basic kernel written in C and maybe even use it for some of the kernel code (drivers for example, like how OS X uses C++ for I/O Kit drivers), but Rust will never replace C. It would be nice if someone designed a new **Systems Programming Language** that can run without a run-time and has some of the features that newer languages do... but it seems the only thing people think about when designing new languages these days is the web and/or cloud computing.

Despite what the Rust developers claim, Rust in its current state is not a real Systems Programming Language, since it requires a run-time. It's OK to have an optional run-time and for some features to stop working when it's not there... but it must be optional, not required.

Submission + - Broadwell Desktop CPUs Not Actually Discontinued

nateman1352 writes: Contrary to the report published by IT World and linked in a previous Slashdot story, AnandTech reports that Intel will continue selling desktop Broadwell CPUs:

IT World published an article earlier this afternoon stating that Intel was discontinuing their two desktop Broadwell socketed SKUs, the Core i7-5775C and the Core i5-5675C. The two SKUs are notable because they are to date the only socketed Broadwell processors on the desktop, and they are also the only socketed desktop Core processors available with a GT3e Iris Pro GPU configuration – that is, Intel’s more powerful GPU combined with 128MB of eDRAM.

The idea that these processors were discontinued came as quite a shock to us, and after asking Intel for more details, the company quickly responded. Intel has made it very clear to us that these processors have not been discontinued, and that the company continues to manufacture and sell the processors as part of their current Broadwell lineup.

Comment Re:It's no ARMv8 (Score 1) 54

Both the A8X and the Broadwell Core M have a TDP of ~4.5W, so they give us a good comparison between the latest and greatest ARM vs. x86 CPUs:

Let's compare against the nVidia Tegra K1 as well, which has a TDP of 5W vs. the Core M's 4.5W:

As you can see, Intel is actually competing well against the best ARM can offer, in ARM's own backyard. The A8X does ~5% better in multi-threaded workloads, but it has 3 cores vs. the Core M's 2. Single threaded, the A8X is ~26% slower than the Core M. Despite having 4 cores, the Tegra K1 has ~16% worse multi-threaded performance than the 2-core Core M, and ~53% worse single-threaded performance. When you consider that most GUI/web applications are single threaded (which is mostly what people use tablets for), the Broadwell Core M is the best tablet chip on the market right now. It's only going to get better with Skylake.

At the same time, ARM hasn't been able to really touch Intel's home turf in the high performance market.

On the topic of instruction sets, honestly the most important difference between x86 and ARM is that an x86 design gives you a distinct advantage in the market for computers that run Windows. Given that there is no disadvantage to x86 in any other market segment (Android, Chrome, Mac, etc.), why would Intel switch to ARM when x86 is all upside and no downside?

Comment Re:From the 2nd article (Score 1) 242

>Law of supply and demand affects salaries. Companies that have not learned this, can't find qualified candidates, because they're not paying enough.

Companies are completely aware of supply and demand. Not being able to find good candidates is just the excuse they give. The real reason companies want H-1Bs is that they increase the supply of high tech labor. Increasing the supply of any good or service while holding demand constant reduces its price; this is basic ECON-101. In this case, the service being sold is high tech labor, so increasing the number of people seeking high tech employment reduces the average wage for high tech work.
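The ECON-101 mechanism can be made concrete with a toy linear supply/demand model. Every number below is made up purely for illustration; the only point is that shifting the supply curve out, with demand held fixed, lowers the equilibrium price (here, the wage):

```c
/* Toy linear market: demand P = a - b*Q, supply P = c + d*Q.
   All coefficients are made-up illustration numbers.          */
static double equilibrium_price(double a, double b, double c, double d)
{
    double q_star = (a - c) / (b + d);  /* quantity where the curves cross */
    return a - b * q_star;              /* price at that quantity          */
}

/* Baseline "wage": demand 200 - Q, supply 50 + 2Q.
   equilibrium_price(200.0, 1.0, 50.0, 2.0) -> 150.0
   Expand the labor supply (intercept 50 -> 20, demand unchanged):
   equilibrium_price(200.0, 1.0, 20.0, 2.0) -> 140.0               */
```

Same demand curve, bigger labor pool, lower wage: that is the whole play.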

Note that this is why the same people pushing for more H-1Bs are also donating to public schools to improve their high tech curricula. They are actively seeking to increase the labor supply by whatever means possible. What's surprising is how willing they are to make such a long-term investment in education "philanthropy": it will probably take ~10 years for that investment to bear fruit. Hard to call it philanthropy when it's done for an entirely self-serving reason :).

Comment Driver Differences (Score 2, Interesting) 96

I think what this benchmark really tells us is two things:

1. nVidia has not optimized their driver stack for DX12 as much as AMD has optimized for DX12
2. The performance difference between AMD and nVidia is likely a software issue, not a hardware issue (nVidia's driver has a more optimized DX11 implementation than AMD's). However, it is possible that nVidia's silicon architecture is designed to run DX11 workloads better than AMD's.

Bullet #1 makes sense: AMD has been developing Mantle for years now, so they likely have a more mature stack for these low-level APIs. Bullet #2 also makes sense: AMD/ATI's driver has been a known weak point for a long time now.

Comment Consumer Hostile (Score 1) 82

Freemium is pretty disgusting, really. Instead of just buying the game, you have to keep paying for it constantly. You pay every time they add a new sword/gun/zombie-killing plant, a la "micro-transactions." Honestly it's almost as bad as the slot machines in a Vegas casino. There is a funny tongue-in-cheek game about this called DLC Quest... which you only have to pay for once :)

Comment Re:Intel is behind (Score 2) 84

28nm is still the cheapest node in per transistor terms.

That's not really true anymore. 14nm is cheaper for Intel to manufacture than 22nm (though Intel is the only company so far with a mature, cost-effective 14nm node). Remember that every other silicon fab will also hit all the problems Intel had ramping 14nm to high volume.

Really what this tells us is that Intel's past two nodes (22nm and 14nm) have both had about a 2.5 year development cycle instead of the 2 year cycle we are used to. I think this is mostly Intel being open with their customers and dealing with the 2.5 year cadence by back-porting some of the new features that were originally going to debut in Cannon Lake to Skylake. We can probably consider Kaby Lake to be Skylake 2.0, and hopefully more of an upgrade than Devil's Canyon was (actual new micro-architectural and/or chipset changes, not just some new thermal compound and higher MHz).

Honestly, I think we will all be happier to have Kaby Lake next year than to face another 2 year wait like the one between Haswell and Skylake.

Comment The Kitchen Sink (Score 1, Informative) 208

Remember when everyone made fun of Mozilla because it had everything including about:kitchensink in it? Remember how Firefox was supposed to get rid of all that bloat and modernize the web browser? Guess Mozilla is back to bundling a ton of junk together into one package.

Only this time it's far worse: at least with Mozilla it was useful stuff like a web browser and an HTML editor. This time we get junk of dubious value like Firefox Hello and Pocket, which would be much better kept as downloadable extensions. Of course, it is painfully obvious that they are not separate extensions because of the financial upside Mozilla gets from bundling them. Same thing with the "sponsored tabs."

I guess they just view Firefox as a cash cow that they need to milk to keep funding non-browser projects like that POS called Firefox OS, the Mozilla Science Lab, and all those grants they have given out over the years.

Oh, and most of their paid programmers/QA staff make little more than minimum wage. Just because Mozilla is "cool" doesn't make it okay to pay vastly under market value for their employees' services. It is unfortunate to see Mozilla become so poisoned by mission creep and a lack of clear direction that they have to lower themselves to the level of Java and bundle sponsored junk.

Submission + - AMD's Project Quantum Gaming PC Contains Intel CPU 1

nateman1352 writes: In a case of legitimate irony, AMD has confirmed that they will offer Intel CPUs in their Project Quantum Gaming PC.

Recently, AMD showed off its plans for its Fiji based graphics products, among which was Project Quantum – a small form factor PC that packs not one, but two Fiji graphics processors. Since the announcement, KitGuru picked up on something, noticing that the system packs an Intel Core i7-4790K "Devil's Canyon" CPU. We hardly need to point out that it is rather intriguing to see AMD use its largest competitor's CPU in its own product, when AMD is a CPU maker itself.

Comment Re:Universal App APIs are too limited (Score 4, Interesting) 186

I don't use the Universal App API, so I have to ask: how is it worse than the model used by the Android and iOS APIs? Why wouldn't it be adequate for an app like Skype?

For basic calling functionality, yes, you could definitely get by with a Universal app. But remember that they sell a bunch of USB Skype phones that plug into your desktop and have a keypad for dialing numbers and sometimes an LCD screen for contacts and/or video calls. There is pretty much no way you are getting hardware like that working with a Universal app.

Comment Universal App APIs are too limited (Score 5, Interesting) 186

The limited APIs and strict sand-boxing of Universal apps limit the amount of actually useful software you can write for the platform. "Universal" really means the lowest common denominator between Microsoft's phone and desktop OSes. If all you care about running on your computer is Cut the Rope and Angry Birds, then it's fine. If you want an actual full-featured computer... not so much.
