Comment the solder contract (Score 1) 366

I won't argue he's wrong, but I think as fast as CPUs change you'd have to have across the board reductions in workload capacity by a significant number (ie, the 30% touted initially) to be able to claim harm and justify a recall.

Man, have you ever drunk the software caveat-emptor Kool-Aid whole cloth.

The specification for hardware and software is not that it will work as specified.

The specification is that, with sufficient user cleverness (and sweat streaming off a bulging forehead, and unbroken vigilance) it is possible to almost get the hardware or software to attain its putative performance ratings without catching one or more incurable black hat diseases along the way.

This ethos dates from the 1980s and 1990s when the upgrade cycle (both hardware and software) was three years at most, and the whole drama could barely play out in that time frame.

I bought my Sandy Bridge Xeon in January 2013 and I'm not sure I'd take a free swap to Intel's newest equivalent, because the hassle of swapping over probably exceeds the expected gain (my average uptime, when the power company isn't replacing faulty-from-birth concrete power poles, has averaged well over a year, and there hasn't been a single hardware or OS glitch that I know about since I eliminated one suspect hardware card, back in the first few months).

I make fairly heavy use of BSD jails, and I've always regarded process isolation as more important than raw performance.

Punting isolation from hardware to software is a Sisyphean burden. One important binary on your system that wasn't compiled with the right compiler (on the right day) and the game is lost. Worse, if there's a 'retguard' gadget variant of the attack, you also have to use the right linker (on the right day), because security is (potentially) no longer link-order independent. Lovely, lovely, isn't that lovely.

This giant, amorphous, hard-to-discern attack surface is now my problem because "durf durf, complicated, too many transistors" and "we can't read widely circulated papers pointing out that one of our optimizations is full of shit" and "caveat-emptor grandfather clause—so suck it!".

If there really is a paper from long ago pointing this out (as I've seen stated in this thread, but haven't checked out myself), then there's going to be an MFT of legal discovery pointed at Intel to find out what they knew and when they knew it.

The GHz wars were a great thing, because it rapidly erased your worst corporate blunders.

The beleaguered end-user responded to constant threat by refusing to buy computer systems with CPUs and memory soldered to the mainboard (only then it was a bug in an ASUS northbridge—newly introduced when they bumped the much-loved board from an A to B revision—that caused the disk controller to scribble randomly over my hard drive, taking out three different OS boot partitions in a single bound).

Once vendors start soldering CPUs and memory to main boards, it's like, so long puberty, you're apparently an adult now.

Adults in every other industry recall or replace products that don't work at least marginally to original specifications (without forcing the end-user to jump through a thousand flaming hoops).

It's high time to set aside childish ideas that this technology (alone) can demand unlimited user backflips before any restitution is warranted or justified.

Comment Re:"I want repaired processors for free" (Score 1) 366

I don't have the slightest problem with Theo standing up for his principles, but to do so without expecting there to be some rather obvious blowback should there be a similar situation in the future is rather naive, to say the least.

What evidence do you have that Theo didn't expect this to play out exactly as it has?

Is he obligated to say "I saw this coming" if he saw this coming?

You seem to think that Job #1 in these situations is maintaining an unbroken signalling posture that you're truly inside the loop.

There's a name for that.

Insecure.

Comment off-topic quickie (Score 1) 271

I got to thinking about Google's clever Retpoline from the other day.
Google Says CPU Patches Cause 'Negligible Impact On Performance' With New 'Retpoline' Technique

The problem is, this is not invariant under peephole optimization. These instruction sequences need to be handled by the compiler through a very literal-minded end-game code-generation pass.

Which got me to thinking about RETGUARD gadgets.

RETGUARD, the OpenBSD next level in exploit mitigation, is about to debut
Retguard: OpenBSD/Clang

I know, both of those sites are horrible, but Google fails me here.

Are speculative gadgets a problem here? If so, Google's clever patch is going to need a sump pump bolted on the side.

And then you get into the whole problem of deterministic compilation in order to be certain that the executable you build contains the necessary mitigations (or some tricky post-compile analysis I sure don't wish to develop myself).

What a giant mess.

Comment the many forks of speculation (Score 4, Informative) 271

So you decide to speculate a future instruction.

It happens to be a load.

The address is [ebp+eax]. A recent instruction had the same address field, so you speculate that it remained the same.

Now you need to translate the address. The translation might be in the TLB, but you check, and for some reason it isn't.

So you decide to speculatively trigger a TLB load.

Finally, you get a physical address back. A previous write instruction is not yet translated, but it seems unlikely it will translate to the same address, so you decide to speculate the load and you make a cache line request from L1.

It might be in L1, but it isn't. So you decide to speculate again, and request it from L2. Not in L2, either, nor in L3, so finally you speculate the load all the way out to external memory. When the cache line returns, you speculatively cache it at all levels. Then you speculatively store the value into the target register.

The final step was the least dangerous, because you can dump this later, no harm to the abstract state. But the concrete side effects on the TLB and the three layers of cache are not so easily reversed. In theory, the concrete state doesn't leak into the abstract state. Because we simply don't like to think about time (time, above all things, being never simple; hint: functional programming has no time, only progress).

Not all speculative architectures are created equal. There are many opportunities for an architecture to Just Say No.

With cache coherence, you have the MESI protocol (and its bewildering shoe full of second cousins).

One could apply the same concept of "exclusive" to the page tables, an exclusively mapped page being one mapped only into the current process and security context. If TLB speculation hits a different kind of beast, abandon speculation. Same thing with cache fill. Concrete side effects thereby only accrue from speculation to exclusive resources. Share-nothing usually solves most problems in computer science (except performance, which is mainly defined in the time domain).

I'm going to abandon the back of my envelope here. One has to think really damn hard to take this to the next logical level, and frankly, I don't have a damn to spare right this very minute.

But please, advance the conversation beyond:

[_] has speculation
[_] does not have speculation

Because that is Intel's diabolical trap, for as long as their PR department can continue to get away with tugging their wool in broad daylight.

Comment Re: Idiotic Moderation (Score 1) 120

Intel seems to be the one cutting corners - for decades. You do remember the FDIV and F00F bugs in early Pentiums?

I recall the FDIV bug quite well, and it had nothing to do with cutting corners. The design of the circuit was correct. In the transfer to manufacturing, some relatively insignificant bits in a hardware lookup table were truncated erroneously. The rarity of the failures allowed the mishap to escape detection in the validation phase.

Intel's test probably should have been stronger in this area, but that's an awfully easy thing to say in hindsight concerning the validation of extraordinarily complex designs.

Nostradamus: "There's a horrible bug in this design, and if you double your test coverage from stem to stern, you'll probably find it."

Intel: "Gee, thanks, Nostradamus. Invest another $10 million and wind up a year late. I think we'll pass on the engineering, and expand our PR team by one full-time professional bullshitter."

Nostradamus: "So be it. For what it's worth, I also wrote this nice quatrain on the horrors of speculation."

Intel: "We'll pass."

Nostradamus: "No, you won't."

Intel has been many things over the years (with a weird, clockwork heel-turn), but skimping on validation is pretty much the last thing on my list of Intel malfeasance.

i860
RDRAM
Caminogate
Itanium
general crisis-management ethos

Oral History of John H. Crawford — 2014 Computer History Museum Fellow

I recall that as a great read. From my own notes:

Big numeric coprocessor redesign as part of the Pentium. This led to the world-famous Pentium FDIV bug. He claims that transcendentals were easy to test on existing software, but most software took extraordinary efforts to avoid division, so that coverage was extremely thin at this testing layer by comparison.

I think that discussion also covers the i860, a litany of terror.

Intel i860

The Intel i860 (also known as 80860) was a RISC microprocessor design introduced by Intel in 1989.

It was one of Intel's first attempts at an entirely new, high-end instruction set architecture since the failed Intel iAPX 432 from the 1980s. It was released with considerable fanfare, slightly obscuring the earlier Intel i960, which was successful in some niches of embedded systems, and which many considered to be a better design. The i860 never achieved commercial success and the project was terminated in the mid-1990s....

On paper, performance was impressive for a single-chip solution; however, real-world performance was anything but.

One problem, perhaps unrecognized at the time, was that runtime code paths are difficult to predict, meaning that it becomes exceedingly difficult to order instructions properly at compile time. For instance, an instruction to add two numbers will take considerably longer if the data are not in the cache, yet there is no way for the programmer to know if they are or not. If an incorrect guess is made, the entire pipeline will stall, waiting for the data.

The entire i860 design was based on the compiler efficiently handling this task, which proved almost impossible in practice. While theoretically capable of peaking at about 60-80 MFLOPS for both single and double precision on the XP versions, hand-coded assembly managed only about 40 MFLOPS, and most compilers had difficulty getting even 10 MFLOPS.

The later Itanium architecture, also a VLIW design, suffered again from the problem of compilers incapable of delivering optimized (enough) code.

Another serious problem was the lack of any solution to handle context switching quickly. The i860 had several pipelines (for the ALU and FPU parts) and an interrupt could spill them and require them all to be re-loaded. This took 62 cycles in the best case, and almost 2000 cycles in the worst. The latter is 1/20000th of a second at 40 MHz (50 microseconds), an eternity for a CPU. This largely eliminated the i860 as a general purpose CPU.

Just think about that for a moment. The same company designed the i860 and the Itanium.

Intel ISA designs are traditionally a seething cauldron of shallow expediency, hubris, paranoia, and greed. And they've never been able to decide whether the software layer is a curse or a blessing. Why else do you think they in-sourced MINIX at the chip level, and told no-one?

But they've always been competent enough at process and validation to survive their worst impulses everywhere else in the value chain.

Recently, I got modded off-topic for failure to spell out the obvious about the ongoing rat race between sugar and alcohol as the worst metabolic offender. Probably because I confused a moderator by including long quotations supporting my position, as I've done here.

On-topic: When someone else ties a short bow around your thinking for you.

Off-topic: TMI / TMW (too many words).

Comment Re:Nice try (Score 1) 375

Those workloads with significant performance losses are more or less completely artificial, e.g. average users don't create hundreds of thousands of files day in and day out and even in this case only SSD disks are affected. Considering that SSD disk operations are sometimes several orders of magnitude faster than those for spinning disks this performance loss is still nothing to worry about.

So it won't affect my Poudriere build server at all. Or my Jenkins build server. Or my Zabbix network monitor.

Nice to know.

"Average users" are generally mentioned right before the poster goes ape-shit on pancake empathy. I mean, the average user doesn't hit a data center more than two or three times per day?

Probably the denominator of concern is CPU server-cycles on the 400 ms deadline program (which completely excludes JavaScript running on moribund tabs, and thus counts your average user not at all).

Does anyone know if Google's public cloud is CPU-partitioned from their private cloud?

Google won't be impressed if they lose a full datacenter's worth of peak compute to reduced work generated while operating in proximity to the thermal wall.

Comment tongs and hammer (Score 1, Offtopic) 145

Alcohol below the 'hangover' level is about as bad for you as sugar.

Sugar: The Bitter Truth — 2009, 7.5 million views

The Hacking of the American Mind with Dr. Robert Lustig — 2017

John Yudkin: the man who tried to warn us about sugar — 2014

If you look up Robert Lustig on Wikipedia, nearly two-thirds of the studies cited there to repudiate Lustig's views were funded by Coca-Cola.

Many serious people now believe that excess fructose (which is metabolized in the liver through much the same pathway as ethanol) is the largest single causal component to the metabolic syndrome epidemic, which is itself one of the largest single causes of runaway healthcare costs in the United States.

Salt, Sugar, Fat: How the Food Giants Hooked Us by Michael Moss — 2013

He interviewed hundreds of current and former food industry insiders — chemists, nutrition scientists, behavioural biologists, food technologists, marketing executives, package designers, chief executives and lobbyists.

What he uncovered is chilling: a hard-working industry composed of well-paid, smart, personable professionals, all keenly focused on keeping us hooked on ever more ingenious junk foods; an industry that thinks of us not as customers, or even consumers, but as potential "heavy users".

How the Food Makers Captured Our Brains — 2009

As head of the Food and Drug Administration, Dr. David A. Kessler served two presidents and battled Congress and Big Tobacco. But the Harvard-educated pediatrician discovered he was helpless against the forces of a chocolate chip cookie.
...
Foods rich in sugar and fat are relatively recent arrivals on the food landscape, Dr. Kessler noted. But today, foods are more than just a combination of ingredients. They are highly complex creations, loaded up with layer upon layer of stimulating tastes that result in a multisensory experience for the brain. Food companies "design food for irresistibility," Dr. Kessler noted. "It's been part of their business plans."

Sugar is the tongs and the hammer.

As Lustig once said (from memory): given the choice between sugar and alcohol, I'll take alcohol, because you can only drink yourself under the table once a day.

Comment ambition with a higher thread count (Score 1) 318

At most, it is a recognition that since one doesn't have a degree in some science that deals with climate, that punting to experts who do is a rational choice.

And all the massed & collective & thoroughly reiterated human experience of boy scouts & girl scouts, experts & executives, naked gentry & landed gentry claiming slightly more than they can reasonably chew to make a name for themselves and get ahead in life ... dead fucking worthless.

Apparently.

Expertise is just ambition with a higher thread count.

Supposing the thread is visible, rather than risible (for which purpose one should always keep a child on hand whose brain & eyesight are not yet damaged by that mercury stuff).

Comment motion blur (Score 1) 289

There are three cups, upside down on the table, each one labelled on the inside.

The first cup is labelled "avoidance", the second cup is labelled "evasion", and the third cup is labelled "shell game".

And they all swing around the table so rapidly that only a meth-addicted Rain Man could tell them apart from any intermediate camera angle.

Comment Re:five to 30 per cent slow down (Score 1, Offtopic) 416

I find it hard to believe that a virtual memory change will result in a 5-30% slowdown for Intel processors. Maybe for a few extremely specific (likely edge-case) tasks, but if there was a legitimate 5-30% performance decrease, you can bet there would be a far different solution in the works that would suitably fix the problem.

Of course, if the microcode update fails, there's always the hail-Mary unicorn ass-pull.

I assure you, every Intel employee is kneeling on the carpet this very instant, facing the most auspicious astrological direction like tens of thousands of well-aligned human magnetic domains, while praying in unison like telepathic Tourettic cuckoo clocks to any qualified god who will take their call.

If you've been harbouring a time machine, looking for optimal market conditions, I suggest you pull it out of your ... hiding place ... any minute now, because the market is ripe, ripe, ripe and bidding will begin with a B.

What a fucking nightmare.

Comment the anti-paradox of the funding cycle ... (Score 1) 187

Gerontologists call this the paradox of old age

The anti-paradox of the funding cycle is to refer to anything that puzzles a Mayfly as a paradox.

Young people are stressed because Darwin cares.

After Darwin ceases to care, there's little remaining reason to filter the world through the mindset of an eww, gross Valley Girl (old people must hate life because creepy).

Unless you want to fund a silly grant application.

Then you haul out the word "paradox" to show that 20 years of formal education can't fix stupid.

Imagine that.

Comment Re:But is it right to do this? (Score 1) 244

If we continue to grow endlessly on this planet, heedless of our impact, we are no better than amoeba.

Worse, actually, because amoeba don't moralize deplorably into their beer.

    if (unchecked_species_privilege)
        cout << "Humans are the "
             << (alignment ? "worst" : "best")
             << " of all possible species.\n";
    else
        throw tantrum;

Our moral character is all over the map, but our salience is never in doubt (or there's hell to pay).

Comment GTA III (Score 1) 247

But I bet the kind of bugs that put Python over average are the first kind, and that Python is below average on the second kind. Which is a good tradeoff.

It would be no different in C++ with end users programming in the modern idiom on top of mature application libraries that support and encourage the modern idiom.

C++ is the GTA III of programming languages.

cout << "Open, world!" << '\n';

Comment Wobegon woes (Score 1) 404

There are a dozen things that reduce or eliminate congestion and make driving more efficient and safe that almost no one does.

I agreed with everything else you said, except for that one observation.

How can you tell that no one else is doing these things? Smart driving tends to look unexceptional, in the same way that elite goaltenders often make the game look easy.

Does your claim perhaps boil down to this: there are more idiots on the road than obvious, distinctive geniuses like Dominik Hasek?

Dominik Hasek

"He tends goal the way Kramer enters Seinfeld's apartment, a package of flailing arms and wild gesticulations that somehow has a perfect logic." That's how Sports Illustrated's Michael Farber described him, name-checking two popular 1990s TV sitcom characters. In an era when goalies gravitated toward the "butterfly" style — a structured and systematic approach to their position — Hasek got superior results by appearing to be largely improvisational, to the point of having no style at all.

I often regulate my braking according to the sum of my following distance plus the following distance of the car behind me.

I doubt that one other driver in 100 has even the vaguest perception that I'm attuned to the space around me in all pertinent directions.

I even find this hard to detect in others, and I often look for the most subtle clues. Sometimes I comment to my wife "the driver of that car in the left lane used to drive a motorcycle". Motorcycle drivers—surviving sample—tend to view the entire width of their lane as a resource to be exploited at all times. In a car, that gets compressed down to about one foot of lateral freedom, according to traffic conditions nearby (more than a foot starts to alarm people about your lane management).

The number one rule of truly proficient driving: never give the other drivers around you an excuse to think about one thing more than they're already failing to fully internalize, unless that one thing is the only thing you want that driver to attend to (in packed conditions, one almost never wants to have this effect on another driver, but it can be quite useful—wielded judiciously—in lesser congestion).

There's a huge correlation between skill and stealth, hence the Wobegon woes.

Illusory superiority

Illusory superiority itself is full of shit, because different people have different standards of competence. Some people value being polite, others value defensive driving, still others value weaving and wending to leave the slings and arrows of outrageous fortune in their rear-view mirror (which is generally covered in black tape).

Surprisingly (not!) we all think we're above average in the attributes we most value.
