You know, some of us remember driving cars that didn't have airbags, antilock brakes, traction control, rear view cameras, auto felch, auto transmission, etc.
I was following you until I got to auto felch...
You know, this started off with a one liner, and then you brought Gene Ray into this as almost a pseudo-Godwin. It's clear you feel strongly about begging the question, so beg away.
Got me there on the punctuation inside/outside the quotes. My fingers trip me up on that one, as I'm a programmer. In most programs, you want the punctuation outside, and in most typeset environments, you want it inside. Mea culpa and big whup.
I'm personally not all that much of a prescriptivist. For example, I think the rules around "whom" and "comprise" were invented to give certain anal retentive sorts their own perverse set of jollies. I'm a big fan of the singular they as well. And sure, many style guides likely have a problem with singular they.
My point isn't that I live and die by style guides because I consider them the be-all and end-all of language. My point is that nobody reputable stands up for Gene Ray, so comparing my observation of misused English to Gene's bizarro rants is not a valid comparison. Rather, it smacks of a straw man argument bordering on an ad hominem attack.
In contrast, you can find many, many reputable sources that stand up for the fact that "beg the question" does not mean "raise the question," just as "irregardless" has never been a valid English word. (Some begrudgingly do recognize the common misuse of the phrase in modern English.) Show me someone, anyone reputable backing up Gene Ray. Even if you disagree with all of those resources, you can do so from a different standpoint than "You're entitled to your position just as Gene Ray is." (I realize that wasn't your argument, but it is the main thing I was objecting to.)
If all you have to do to shut down someone you disagree with is compare their opinions to Gene Ray's absurd rants, well then, arguments can be very short indeed. Do we need a new variant of Godwin's Law here?
Comparing me to Gene Ray is a "nice knock-down argument"? You have some mighty low standards. I don't think anyone reputable has defended Gene in print, but just about any style guide you can point to that mentions "beg the question" tells you not to use it to mean "raise the question."
But, thanks for playing.
English is not context-free. Begging the question, for example, is an expression with multiple meanings, the correct one of which must be deduced from said context.
Sure, but that's irrelevant. That's how most people make sense of other people who are otherwise not making sense. "Begging the question" has never (correctly) meant "raising the question." Next you'll tell me "I could care less" means "I couldn't care less." Oh, "but context!" is a cheap, meaningless argument.
But hey, feel free to take it up with any of these other folks:
* Dictionary of Modern Legal Usage. If you misuse it in court, I'd love to see you say "But context, your Honor! And for my next argument, I'm going to argue about what the meaning of the word 'is' is."
* Zoe Triska: "In the long run, misusing phrases like 'begging the question' doesn't make you sound smarter. It makes you sound dumber."
* The New York Times, which felt the need to come clean on their occasional abuses of the phrase.
Where were we? Oh yeah, context. Sure, from the context surrounding the phrase, everyone will be able to figure out what you meant. And a good fraction of them will know you're using it incorrectly and think less of you for it. As Zoe said above, "In the long run, misusing phrases like 'begging the question' doesn't make you sound smarter. It makes you sound dumber."
Knock yourself out. I could of gone on irregardless, but I could care less. I won't wait with baited breath for your reply, because for all intensive purposes I'm done. (Context: See how dumb misused English sounds?)
Back to the topic at hand:
But even if you overcame that problem, the light would still be bouncing between the walls of the core, and thus traveling a longer distance than the mere length of the fiber.
That's true, especially for multi-mode fiber. For single-mode fiber, the fiber plus cladding act more like a waveguide, because the diameter of the core is comparable to the wavelength of the light.
I don't claim to be an expert though. I've just been reading up online.
In any case, the mere fact you have to bend the fiber optics at all implies the light contained therein isn't going in a straight line between repeaters.
Raises the question, maybe, but it certainly does not beg the question.
In any case, the speed of light in fiber optics is dominated by the glass or plastic, not any air that might somehow still be in the fiber. So far as I know, that quantity is zero, or close enough. For fiber optics to work, you need total internal reflection. To get total internal reflection over a decent range of angles (so that you can actually bend your fiber optic cable), you need a sufficiently high index of refraction. It turns out that the higher the index of refraction, the slower the speed of light in the medium.
It wasn't a sensor bit error that it failed to guard against. The control values I referred to are those in RAM, used by the software. The RAM apparently wasn't parity protected, and a bit-flip in the right word could cause uncontrolled acceleration. It wasn't the only thing that could cause havoc; there were race conditions and stack overflows in the code, apparently, and those were more likely the sources of actual, observed UA.
This lengthy article at EE Times digs into some of the details. The main quote, though, is on page 3:
Memory corruption as little as one bit flip can cause a task to die. This can happen by hardware single-event upsets -- i.e., bit flip -- or via one of the many software bugs, such as buffer overflows and race conditions, we identified in the code.
There are tens of millions of combinations of untested task death, any of which could happen in any possible vehicle/software state. Too many to test them all. But vehicle tests we have done in 2005 and 2008 Camrys show that even just the death of Task X by itself can cause loss of throttle control by the driver -- even as combustion continues to power the engine. In a nutshell, the fail safes Toyota did install have gaps in them and are inadequate to detect all of the ways UA can occur via software.
I don't think that article pointed out this other detail:
Although the investigation focused almost entirely on software, there is at least one HW factor: Toyota claimed the 2005 Camry's main CPU had error detecting and correcting (EDAC) RAM. It didn't. EDAC, or at least parity RAM, is relatively easy and low-cost insurance for safety-critical systems.
This particular set of problems at Toyota was very interesting to us at work. I'm just now starting to work with our safety critical team that sells hardened controllers into the automotive market. They include all sorts of hardware failsafes, including ECC, lockstep execution between parallel cores, etc.
That may be true, but you do see cell towers and cellular basestations, which are similar in a lot of ways to data-centers. It's just that their data is phone calls and whatever data you're streaming over high-speed links.
I recall seeing Sun Microsystems had a facility in Denver when I drove through there around a decade ago. You'd think they'd have noticed.
And yet the Slashdot summary makes it sound like something new. I know at work we always quote our error rates with a location and elevation (eg. New York, sea level), and I understand that's the standard way to do it.
This stuff comes up in deep embedded systems too. Think "ABS brake controller," etc. BTW, this is part of why Toyota got in so much trouble with its drive-by-wire system—it had no parity checking on critical control values. Granted, in an automobile, you have plenty of other sources of potential bit errors, such as extreme temperatures, power issues, exposure to strong fields, etc. But, you gotta protect against them all.
I remember a quote, attributed (likely incorrectly) to Seymour Cray: "Do you want it fast, or do you want it accurate?"
If you want absolutely exact arithmetic, code it entirely with arbitrary precision exact integer arithmetic. All rational real numbers can be expressed in terms of integers, and you can directly control the precision of approximation for irrational real numbers. Indeed, if your rational numbers get unwieldy, you can even control how they are approximated. And complex numbers, of course, are just pairs of real numbers in practice. (Especially if you stick to rectangular representations.) If you stick to exact, arbitrary precision integer arithmetic and representations derived from that arithmetic that you control, then you can build a bit-exact, reproducible mathematics environment. This is because integer arithmetic is exact, and you have full control of the representation built on top of that. Such an environment is very expensive, and not necessarily helpful. You can even relax the order of operations, if you can defer losses of precision. (For example, you can add a series of values in any order in integer arithmetic as long as you defer any truncation of the representation until after the summation.)
If you venture into floating point, IEEE-754 gives you a lot of guarantees. But, you need to specify the precision of each operation, the exact order of operations, and the rounding modes applied to each operation. And you need to check the compliance of the implementation, such as whether subnormals flush to zero (a subtle and easy to overlook non-conformance). Floating point arithmetic rounds at every step, due to its exponent + mantissa representation. So, order of operations matters. Vectorization and algebraic simplification both change the results of floating point computations. (Vectorization is less likely to if you can prove that all the computations are independent. Algebraic simplification, however, can really change the results of a series of adds and subtracts. It's less likely to largely affect a series of multiplies, although it can affect that too.)
And behind curtain number three is interval arithmetic. That one is especially interesting, because it keeps track at every step what the range of outcomes might be, based on the intervals associated with the inputs. For most calculations, this will just result in relatively accurate error bars. For calculations with sensitive dependence on initial conditions (ie. so-called "chaotic" computations), you stand a chance of discovering fairly early in the computation that the results are unstable.
Your argument only makes sense if you fix the target platform, compiler and compiler options for the comparison. In fact, it's trivially provably correct: If the compiler beats my assembly code, I can simply replace my own assembly code with the assembly code the compiler generated and force a tie.
However, that misses the point: I can write the fastest possible assembly for a given platform, but it might not be the fastest way to do something on a different (but compatible) platform. But the C code, without modification, could potentially beat my unmodified assembly code when compiled for that other platform. The compiler has the flexibility to tune its output for the target, while my assembly code is fixed for one target. And if you include platforms with a different underlying assembly language, the C code wins by default because my assembly code doesn't even run.
For example, consider 4x4 matrix multiplication. Do you use nested loops or just unroll it manually? Compilers tend not to fully unroll all the nested loops. The compiler may do better scheduling on fully unrolled non-looping code.
Ironically, a vectorizing compiler would prefer you give it the loops instead. If you give it any hints at all, they should concern pointer aliasing (eg. the C99 restrict keyword), pointer alignment, and minimum trip counts if any of the loops have variable trip counts. Manually unrolling usually makes its job much, much harder.
Do you create temporary variables to preload a row or column, or do you just access each variable in memory directly? The former may generate better code on a RISC architecture and the latter on a CISC architecture.
If you provide good pointer aliasing qualifiers, I'd hope both produce about the same regardless of CISC or RISC with modern compilers, instruction schedulers and register allocators.
These are the sorts of things I think of when referring to helping the compiler, giving it hints. When that mythical smart compiler arrives that can figure out the preceding on its own, it will simply ignore your hints; the hints will do no harm. Until that mythical compiler arrives, the hints may help.
When was the last time you used the register keyword in C and it had a meaningful effect? Depending on which aspect of the code you consider, the "mythical" compiler you refer to may be less mythical than you think.
Look up the history of the Stepanov Benchmark as it applies to C++ programs, for example. It was once a hot topic among C++ compiler writers, because it exposed how awful C++ abstractions were to run time. Now most compilers ace it, sometimes producing faster code with the C++ abstractions than the C baseline they're measured against.
Ok, maybe not 20 years old, but 17 years old. Software I wrote in 1996 is still used today to verify chips built by the team I'm in at work. And that code compiles just fine. I haven't actively developed it in about 14 years. No substantial tweaks to keep it current, either. I don't think it will compile as a 64-bit executable, but the fact that even Firefox ships as a 32-bit executable by default tells me that's not a "historical" mode.
I was speaking with a team at work. They're talking about finally replacing some 30+ year old code in their code base with more modular, modern code. Sure, the whole package around it has continued to evolve, but some pieces date back to the first Reagan Administration. High level languages made that sort of continuity possible.
Now granted, the team whose code I'm referring to is a compiler team. Maybe, just maybe, they put more faith in compiled high level languages than your average programmer.