Comment Re:Hooray for Space-X (Score 4, Insightful) 32
The right way to do capitalism, as opposed to the way it's generally practiced in the US, with the cart in front of the horse.
Get your hands off of MY computer!
Please, by all means run a Cathedral OS on your own machine - it can even use the Linux kernel - I don't mind.
But quit insisting that I do so, too.
I run Gentoo, one of the less-used distributions. I chose it exactly because it was a geeky, nuts-and-bolts distribution. After all, at the time Linux was a hobby, and if you're in it for that kind of fun, go for it.
At the same time, I generally advise against using Gentoo: unless you know why you want to use it, don't. New users should use something like Ubuntu, which I've installed for several people, or more recently Mint, which I've installed as well. We use RedHat at work because it's "Enterprise" and has a support contract, which the bean-counters like.
But if Linux were a monoculture which kept me isolated from the nuts-and-bolts, I'd be running something else.
We will not solve the problem of illegal immigration until we figure out how to do something sane instead of the War on Drugs. Right now, the unintended consequence of the War on Drugs is that south of the border, drug lords are funded about as well as (if not better than) the governments, destroying the local economies. Some of the people seeking jobs in those economies end up coming to the US in search of work.
Seconded. When the power and everything else is out, I call to report it on my land line.
I dunno. It just seems like the rate of change has slowed down a lot over the last decade. Or maybe I'm just getting old.
10+ years ago, performance was more than doubling every two years through a combination of higher clocks, die shrinks, extra transistors, fundamental breakthroughs in logic circuit design, etc. Right now, mainstream CPUs are only ~60% faster than mainstream CPUs from four years ago: clocks are stuck near the 4GHz mark, die shrinks are much slower in coming, and nearly all of the fundamental breakthroughs have already been discovered. To make things worse, modern hardware is already more powerful than most people can be bothered with, so there is a general lack of demand for significantly faster low- to mid-range CPUs. (See the quick arithmetic below.)
Progress is slowing down and I can only imagine it getting worse in the future.
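To put rough numbers on that (my own back-of-envelope arithmetic, not from any benchmark): doubling every two years works out to about 41% per year compounded, while 60% over four years is only about 12% per year.

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* "Doubling every two years" as a compound annual rate. */
        double then = pow(2.0, 1.0 / 2.0) - 1.0; /* ~41.4% per year */
        /* "~60% faster after four years" as a compound annual rate. */
        double now  = pow(1.6, 1.0 / 4.0) - 1.0; /* ~12.5% per year */
        printf("then: %.1f%%/yr  now: %.1f%%/yr\n", then * 100.0, now * 100.0);
        return 0;
    }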
Maybe there's another reason that our infrastructure is crumbling...
(Line up the conspiracy theories.)
Now we need the quasi-obligatory response that this is really a government problem, and that if government weren't in there mucking about with needless regulation, the free market would address the problem and we'd all be in broadband utopia at reasonable prices.
Maybe he was talking about FADD.
In a float addition, you need to denormalize the inputs (shift the smaller operand's mantissa so the exponents match), do the actual addition, and then normalize the output. Three well-defined pipeline stages, each embodying one distinct step of the process.
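For the curious, here is a minimal sketch of those three stages in C. It assumes positive single-precision inputs, truncates instead of rounding, and ignores subnormals entirely - the function names and every simplification here are mine, not how any real FPU is wired.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Sketch of the three FADD pipeline stages for positive IEEE-754
       single-precision values. Signs, rounding and subnormals are all
       ignored so that each stage stays visible. */

    typedef struct { uint32_t exp; uint32_t mant; } unpacked;

    static unpacked unpack(float f) {
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);
        unpacked u;
        u.exp  = (bits >> 23) & 0xFF;
        u.mant = (bits & 0x7FFFFF) | 0x800000; /* restore implicit leading 1 */
        return u;
    }

    static float fadd_sketch(float a, float b) {
        unpacked x = unpack(a), y = unpack(b);

        /* Stage 1: denormalize - shift the smaller operand's mantissa
           right until both operands share the larger exponent. */
        if (x.exp < y.exp) { unpacked t = x; x = y; y = t; }
        uint32_t d = x.exp - y.exp;
        y.mant = (d < 32) ? (y.mant >> d) : 0;

        /* Stage 2: the actual addition, on the aligned mantissas. */
        uint32_t sum = x.mant + y.mant;
        uint32_t exp = x.exp;

        /* Stage 3: normalize - shift back into 1.xxx form, bumping the
           exponent once per shift (truncating instead of rounding). */
        while (sum >= (1u << 24)) { sum >>= 1; exp++; }

        uint32_t bits = (exp << 23) | (sum & 0x7FFFFF);
        float r;
        memcpy(&r, &bits, sizeof r);
        return r;
    }

    int main(void) {
        printf("%f\n", fadd_sketch(1.5f, 2.25f)); /* prints 3.750000 */
        return 0;
    }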
But as you said yourself, CPUs (and GPUs) generate a lot more heat. They are already challenging enough to cool on their own; imagine how hot a CPU or GPU in the middle of the stack would get with all that extra thermal resistance and heat added above and below it. As it is, CPU manufacturers already have to inflate their die area just to fit all the micro-BGAs under the die and get the heat out.
Unless you find a way to teleport heat out from the middle and possibly bottom of the stack, stacking high-power chips will not work.
At best, you could stack memory and CPU/GPU for faster, wider and lower-power interconnects.
For most of those issues, the solution is simple: if you forget cables and adapters so often that it's a major hassle, buy some spare cables and adapters to cover the most common scenarios. Type-A plugs are not going to disappear overnight (USB 3.0 Type-A maps directly to Type-C, so Type-A will stay on PCs, power adapters, and anywhere else where shaving cubic millimeters doesn't matter), and an A-to-C cable should have you covered in most cases where you cannot do C-to-C... assuming Type-C devices even give up Type-A power adapters.
My guess is the transition will be mostly from A-to-microB to A-to-C. Most people are not going to bother with microB-to-C adapters; they will just get a straight A-to-C cable.
The whole point of Type-C is to address the ugly kludge that is the current micro-USB 3.0 connector, which almost no phone or tablet adopted because it is huge - over twice as wide as micro-USB.
As for the EU and others with mandated micro-USB charging, I bet they will include Type-C as an acceptable or even preferred alternative in short enough order.
Broadwell-H might be Intel's shipping name, but the roadmap name has been Broadwell-K for about a year. That's why you see Broadwell-K used everywhere.
The fact that K-series chips (the enthusiast unlocked chips) will be from the Broadwell-K lineup likely contributed to most computer enthusiast sites choosing to stick with the old roadmap name instead of adopting Intel's new production codenames.
The CPU side might be different, but the GPU side remains the same, and in GFXBench the results will likely end up similar, give or take whatever they gain or lose on the CPU.
If Nvidia wanted to go all-out with this Transmetaism, the logical thing to do would be to put together a custom ART runtime that merges with their online recompiler/optimizer.
"The only way I can lose this election is if I'm caught in bed with a dead girl or a live boy." -- Louisiana governor Edwin Edwards