Comment Re:So Android, then (Score 1) 28

I have literally no reason to run FreeBSD, the also-ran of Desktop Unix.

I had history with SunOS and Xenix when I tried to install FreeBSD the first time around. At the time, there was no meaningful install documentation, and my attempts to get assistance from the community were met with noob mockery.

I installed Slackware with zero assistance and zero problems and have never felt a need to look back, and I doubt I ever will.

Comment Re:Learned x86 first ... explains so much of world (Score 1) 150

Yeah, those are all the things people say to excuse the complexity, but the reality is there's no real benefit of the complexity.

I wasn't saying there's a benefit so much as that it was inevitable at the time. Today RAM is cheap and fast enough that instruction compression is basically worthless. On the other hand, the complexity isn't really a problem any more either, thanks to advances in gcc, register renaming, and the fact that all modern CPUs have umpty-million gates, so the decoder is no longer a big part of the processor.

Comment Re:Year Of Linux On The Desktop (Score 2) 150

Many systems running Windows are used in the same situations. Windows is perfectly fine when put in a system with controlled updates, controlled choice of drivers

It isn't, though. For example, all kinds of things you can do on Unix-likes without disturbing users require a reboot on Windows, which has a real and measurable impact on users.
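The difference comes down to filesystem semantics: on Unix-likes you can replace a file that a running process holds open, because the old inode survives until the last descriptor closes, while Windows locks in-use files. A minimal Python sketch of this (POSIX-only; the file names are made up for the demo):

```python
import os
import tempfile

# Simulate a running process that holds a "library" open while the file
# is upgraded underneath it.  (Names here are hypothetical.)
os.makedirs("demo", exist_ok=True)
lib = "demo/libfoo.so"
with open(lib, "w") as f:
    f.write("version 1")

handle = open(lib)                    # the "running process" keeps this open

# Write the new version to a temp file, then atomically rename it over
# the old one -- roughly how Unix package managers upgrade in-use files.
fd, tmp = tempfile.mkstemp(dir="demo")
with os.fdopen(fd, "w") as f:
    f.write("version 2")
os.replace(tmp, lib)

old_view = handle.read()              # still "version 1": old inode survives
new_view = open(lib).read()           # "version 2": new opens see the update
handle.close()
print(old_view, new_view)
```

On Windows the `os.replace` would typically fail with a sharing violation while the handle is open, which is exactly why so many updates there end in "please restart your computer."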

The Windows kernel is actually quite a tight piece of code and that is reflected in the fact that most people haven't seen a bluescreen in a decade, it's just the userland above it sucks balls.

Yes, but that stuff matters! Also, there was all that time when pretty much the whole graphics driver ran in the kernel (NT 4.0 moved GDI and the display drivers out of userland) because Microsoft couldn't get any performance any other way, which is where most of the blue screens came from. They were a LOT scarcer in NT 3.51 than in any later version of Windows until, ironically, Vista, which moved much of the display driver back out of the kernel.

Comment Re:Learned x86 first ... explains so much of world (Score 1) 150

Of course, that explains why they could get away with complexity, but that doesn't explain why it's trash.

It's not so much that they got away with complexity as that it would have cost more to have less of it in the instruction set. But to speak to that specifically: part of the reason the instruction set is trash is that the architecture is trash, at least by modern standards. From a certain point of view, x86 has zero general-purpose registers, because many of its instructions require that operands and offsets go into specific registers. But that also made the processor simpler, because it didn't have to be able to use other registers, and it made the code smaller, because the implicit-register forms of some instructions have shorter encodings.
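The "some instructions are shorter" point is concrete: the accumulator (EAX) gets dedicated one-byte opcodes, so `add eax, imm32` encodes as opcode 0x05 plus the immediate (5 bytes), while the same add to any other register needs the generic 0x81 /0 form with a ModRM byte (6 bytes). A toy Python encoder for just this one case (not a real assembler) shows the difference:

```python
import struct

# Toy encoder for 32-bit "ADD reg, imm32" only -- illustrates that the
# accumulator (EAX) has a dedicated short opcode while every other
# register pays for an extra ModRM byte.
REGS = {"eax": 0, "ecx": 1, "edx": 2, "ebx": 3,
        "esp": 4, "ebp": 5, "esi": 6, "edi": 7}

def encode_add_imm32(reg: str, imm: int) -> bytes:
    imm_bytes = struct.pack("<i", imm)          # little-endian 32-bit imm
    if reg == "eax":
        return bytes([0x05]) + imm_bytes        # short accumulator form
    modrm = 0xC0 | REGS[reg]                    # mod=11 (reg direct), /0
    return bytes([0x81, modrm]) + imm_bytes     # generic imm form

print(len(encode_add_imm32("eax", 1000)))       # 5 bytes
print(len(encode_add_imm32("ecx", 1000)))       # 6 bytes
```

Multiply that byte saved across hot loops written with the accumulator in mind and the "compression" was worth real money when RAM was expensive.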

This was a problem for performance until register renaming was implemented, which IIRC was also a Cyrix development (when it comes to x86, anyway). With renaming, though, the penalty of having to shuffle values between registers to execute successive operations was reduced, and on superscalar processors more or less eliminated, since those moves can issue alongside other operations and take only a cycle to complete.
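The core idea of renaming is easy to sketch: every architectural write gets a fresh physical register, so later reuse of the "same" register no longer serializes against earlier work. A toy Python renamer (illustrative only, nothing like how a real scheduler is organized):

```python
from itertools import count

# Toy register renamer: each write to an architectural register is
# assigned a fresh physical register, removing WAR/WAW hazards so
# independent operations can issue in the same cycle.
class Renamer:
    def __init__(self):
        self._fresh = count()
        self.map = {}                  # architectural -> physical

    def read(self, arch):
        return self.map.setdefault(arch, next(self._fresh))

    def write(self, arch):
        phys = next(self._fresh)       # fresh register for every write
        self.map[arch] = phys
        return phys

r = Renamer()
a1 = r.write("eax")    # eax = ...            -> physical p0, say
use = r.read("eax")    # op consumes eax      -> same p0
a2 = r.write("eax")    # eax reused later     -> brand-new physical reg
print(a1 == use)       # the read sees the first write
print(a1 != a2)        # the second write doesn't clobber it
```

Because the second write lands in its own physical register, an instruction still reading the old value and an instruction producing the new one can be in flight at once.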

Anyway, it was ultimately because Intel kept the processor as simple as possible through the 486. That was back when an ISA really was an ISA: the instruction set was defined by the architecture, unlike now, when basically every design except genuine RISC has a decoder that translates one multi-cycle instruction into multiple single-cycle micro-ops. Intel made up most of the performance drawback with its compiler, which for many years was the most efficient for x86 because it had optimizations to work around the deficiencies of the design. These days, gcc produces more efficient code for all processors than icc.
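That decoder step can be caricatured as a fan-out: one memory-operand CISC instruction cracks into separate load, ALU, and store micro-ops. A hypothetical Python sketch of the translation (the mnemonics and micro-op names are made up, not any real core's uop format):

```python
# Toy x86-style decoder: a read-modify-write memory instruction is
# cracked into simple single-step micro-ops, the way modern decoders
# feed an out-of-order core.
def crack(instr: str) -> list[str]:
    op, dst, src = instr.replace(",", "").split()
    if dst.startswith("["):                  # memory destination: RMW
        addr = dst.strip("[]")
        return [
            f"load  t0, {addr}",             # 1: bring the operand in
            f"{op:5} t0, {src}",             # 2: do the actual ALU work
            f"store t0, {addr}",             # 3: write the result back
        ]
    return [instr]                           # reg-reg ops pass through

for uop in crack("add [ebx], eax"):
    print(uop)
print(crack("add ecx, eax"))
```

After cracking, the three micro-ops are scheduled independently, which is why the "CISC penalty" of memory-operand instructions mostly evaporated.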

Comment Re:Learned x86 first ... explains so much of world (Score 1) 150

x86 was wonky because a big complicated decoder would have taken up a lot of silicon at the time, relative to the rest of the CPU. Instructions weren't decomposed into RISCy micro-ops like they all are today. The first x86 processor to do that was NexGen's Nx586, I think? AMD's first design to do it was the K5, and for Intel it was the Pentium Pro. Today an x86 decoder (a relatively complex beast compared to decoders for other instruction sets) is a very small piece of the CPU.

Comment Re:That's why Linux wins. Quality. (Score 1) 150

Where waste of any type matters (Lean, Toyota).

Pity about Toyota's code, which we know to be trash after the code reviews (not NASA's worthless one, but the good one from the Barr Group) revealed that they not only don't follow industry best practices, they don't even follow their own documented guidelines.
