
Comment Re:Sad (Score 2, Insightful) 107

If you use C/C++ right, you do not end up writing a JIT compiler for a language that was never intended for one. This is a bug in V8. We don't know exactly where yet, but that is the kind of code that does things no sane person should ever do: it is supposed to take shortcuts and patch things on the fly. It is of course fully possible that this exploit is not in a performance-critical path, in which case your comment is rather well placed. But I do think that anyone writing C/C++ in this context is fooling himself. It is for all practical purposes impossible to use C without doing bare pointer addressing. It is quite possible to use C++ without it, even though such use is not terribly widespread.

Comment Re:replicate earth air purification (Score 3, Informative) 112

It's not like putting a sliced tomato next to the kitchen sink in a humid climate will keep the mold spores floating around away from other parts of your kitchen. Bacteria and other prokaryotes are mostly incapable of macroscopic movement (especially through air), but they are able to expand their populations rapidly. Therefore, a "colonist" doesn't choose to move to the best spot, foregoing a worse one. They will try everywhere. If they gain a foothold, that foothold is likely to just unleash further colonists into the less hospitable, but still slightly viable, habitats.

Comment Re: Cut to the chase (Score 1) 134

Well, for this "frame rate" theory to be relevant, the question is not only whether anything happens at or close to the frame rate, but what the frame-stepping function is. And, throwing relativity into the mix: in what reference frame?

A discretized spacetime would mean that the continuous solutions to the Schrödinger/Dirac equations are actually approximations, better expressed by some discrete time-stepping scheme. That could have macroscopic consequences, especially if, for some weird reason, Nature has a rather simple first-order scheme at its frame-rate core. But it also means that we would get slightly different results from different objects in free fall, depending on their overall speed relative to the reference frame, since that would control the factor between the "local passage of time" and the actual number of "Planck time frames" consumed by the process. In addition, a discretization of time almost necessitates a discretization of space. That not only means that space has some small grid (not likely either, based on current theory); it also means that there are absolute directions in space, and that some physical processes would behave slightly differently (even when aggregated over macroscopic distances) depending on whether they are aligned to those directions or not.

Comment Re:And then we know ... what exactly? (Score 5, Interesting) 134

Well, electron states being quantized has helped us to (truly) understand chemistry and create transistors as well as LEDs. By realizing that things are only allowed to make certain transitions under certain conditions, you can "cheat" and build up high-energy states that are far more stable than they really should be. I am not saying we would get macroscopic anti-gravity or a "Faraday cage for gravity", but this is kind of the space where we would get more specific explanations for how you might be able to accomplish those things in theory. For very delicate experiments (similar to the one described!) and possibly sub-nanoscale manufacturing procedures, an understanding of the quantized nature of gravitational influences might be useful, if only for better understanding the noise in measurements and tolerances.

Comment Re:It would have to be. (Score 2) 134

Even if mass were quantized, the Newtonian force is F = G*m1*m2/r^2. So even with discrete mass quanta (which is also false, see other replies), you would get a continuous spectrum of resulting forces, since r varies continuously. Inserting relativity here changes the expressions, but it would really just muddle things. So, no, there is no specific reason to believe gravity to be quantized - outside of an actual theory of quantum gravity.

Comment Re: Stupid (Score 1) 153

If it is just a bug, then we should expect a quick fix and firmware release from VW. If, however, it was a conspiracy, and there is no way that VW's EGR technology can ever be made to pass the NOx requirements (without additional hardware - AdBlue tanks), then VW is screwed.

My point is that a "too good to be true" bug could easily have quite devastating consequences once it is fixed. If they remove the 'false' in the putative "if (isInTest() || (isNOxReductionNeeded() && false)) enableEGR();" line, and the fix increases fuel consumption or greatly reduces maximum torque, they cannot simply release it.

Embedded automotive control systems and scientific research are quite different domains, but in science I have repeatedly been close to thinking I had solved a problem, only to realize that my benchmark was off and the code was not really working at all. I have not published any such results (AFAIK), but I have reviewed and seen publications with blatant errors. When you have reached the kind of result you hoped for and believed likely, you are not on guard anymore. Fixing the blatant error might very well mean that the whole work is pointless. The error is trivial; the consequences are not.

Comment Re:Correct Conclusion, Wrong Rationale (Score 1) 153

The sensors required to detect "test mode" and software-driven EGR control hardware are already part of any modern car, so there was no decision to "add" them to accomplish this cheat. But there had to be a strategic decision not to add SNCR, and that is a decision that could only be made at a very high level.

Yes, but that was not a conspiracy. It was very clear in the specs and even highlighted as an advantage. The question is: who knew and approved, and at what point, that this design would not work out in practice? Even the idea that smart design of the control regime would make it possible to achieve low emissions without SNCR is not, in itself, equivalent to fraud. It's only when the design becomes all about "detecting test conditions" that things get really, really bad.

Comment Stupid (Score 3, Informative) 153

The linked article makes the point that the sensors and hardware would not otherwise be necessary. I think the writer seriously underestimates to what extent a modern car with protection systems has to juggle different constraints. Things like non-driven wheel rotation (defeated by being on a lab stand) are needed for the braking systems, and possibly to some extent to moderate throttle control for stability. Wheel movement patterns are also needed and useful, even if you don't actually have electric power steering.

Regulating the exhaust gas recirculation somehow also makes sense. You could switch it fully on and off, but you would certainly want to keep it at a sensible level: you want good acceleration and full combustion of fuel while still not emitting too much nitrogen oxides. It makes total sense to me that you might want to design your control system to judge not only the current emission levels, but also the overall driving pattern (steady straight-ahead, repeated stop-and-go, etc.) with some kind of state machine, to try to find the best EGR regulation regime. This requires sensors and ways to regulate the feature.

My most innocent guess about how something like this might have happened is an intent to find a good regime that would give nice bursty performance while keeping nitrogen oxides low overall. Progressively, the control regime was pushed until it ended up in the corner where the case of EGR being properly activated under real-world conditions basically never happens. Some parts of it might even, in the end, be a bug between the intended state transitions and the actual ones. Like all bugs that give performance that seems too good to be true on the metrics you really care about (fuel consumption and enjoyable driving), no one investigated.

Do I think it happened this way? It's hard to say. Probably not. But, in one way, it's even more frightening than an evil conspiracy. It's easy to say "I wouldn't take part in a conspiracy by my employer". It's harder to say "I would never be pressed to write code with goals that could not be fulfilled, eventually find a hack that seemed to work, and maybe ignore investigating why it worked so well"...

Comment Re:If a high IQ were better for the individual (Score 1) 385

Unless it, say, causes higher energy usage or makes you slightly more prone to perish from an infection. The selection pressure for most of our evolutionary history might just be a tad different than it is today. It works the other way, too, of course. Other threads note the increased risk of getting depressed from all the "bad news you can't fix". High intelligence might make it harder to just shrug that off, while you could more easily filter it out with lower intelligence. (Just like kids can hear some conversations and not register the full depth of what's being said.) This phenomenon might be worse today than it used to be.

Comment Re: Propheteering (Score 1) 131

What is this fusion "ore" you are talking about? Even if we restrict ourselves to deuterium, or even tritium, the ocean reserves are plentiful even in the "multiple orders of magnitude" energy-consumption case. Long-term, exponential growth will require space exploration, and I am all for it even in the short term, but let's keep to the facts.

Comment Re:A victim of applications and history (Score 1) 129

This seems to come out of the peculiar Microsoft feature of being an administrator user but without administrator privileges most of the time, except when needed, plus a lot of work to make this escalation happen in a non-intrusive fashion, or be faked, depending on context. It's a really complicated beast that no other platform tries to build.

MS up to and including XP (excluding the DOS-based family) basically had the same model as everyone else: you either were an administrator or you weren't, with facilities like 'runas' to elevate as needed. The problem was that they had tons of software from the DOS-based line failing to use the right sections of the registry and filesystem, requiring people to go through pains to run a lot of applications as administrator. This meant that most XP users just logged in as administrator.

To mitigate this, they embarked upon crafting a fairly complex thing to make running as an administrator user safer most of the time. It's funny, because at the same time they started doing more and more to allow even poorly designed DOS-era software to run without administrator rights. They created union-mount-style redirections to make an application think it can write to its application directory even when it cannot (and sillier things, like making 'system32' a different directory depending on whether a 32-bit or 64-bit application is looking). I do the atypical thing of running as a non-administrator user full time, with UAC prompting me for passwords when needed, and nowadays it doesn't nag any more than sudo does on a modern Linux desktop. If I understand this behavior correctly, that usage model might be immune to this risk factor.

While impersonation and related techniques are now used a lot more, covering larger portions of the API, impersonation itself has been around since NT 3.1. Are you a file server process serving a request from a client? Just create an impersonation context for the user who sent the request and pass that along to the file system. You only need to make sure that you create the right context and tell other services on whose behalf you are acting. This is not identical to setuid and the like, most importantly because impersonation is per thread, so a single process can hold many impersonation contexts at once.

That this shows up in the application compatibility cache service is almost coincidental; the real problem is that impersonation services are used, but used incorrectly. Impersonation was part of the original NT design, and for relatively good reason.

Comment Re:What about long-term data integrity? (Score 4, Informative) 438

you first need to copy that data into another block, erase the original one, write all data back and erase your "tmp" block. The churn on blocks happens a lot faster than what you'd think.

If that's the case, then why are they not copying the data to ram contained on the drive itself? Seems like an awful waste of cycles with a relatively simple fix. Is it just a cost issue?

Any wear levelling worth its salt will not do what the grandparent describes. You simply do not rewrite one page in place within a block. A single-page write is handled by mapping that page to a free page in another block and maintaining a mapping table recording which LBAs are currently stored where. However, if you are doing single-sector writes, or repeated I/O flushes of the same sector, you still see a lot of write amplification. To keep data integrity, the mapping tables also need to be kept correctly updated (or at least uniquely recoverable by scanning through all blocks after a hard power-off).

Comment Re:Book Analogy (Score 1) 260

But, well, the difference is that Oracle has actively asked everyone else to quote references to their book. Google has produced a product that only respects those references that Oracle has encouraged everyone to use. If Oracle started pursuing anyone *writing* Java code for copyright infringement ("hey, you called all the methods of ArrayList, in the order they are declared"), that would be a different thing.

Has Google copied the Javadocs? Those texts are not necessary for technical interoperability, so that would be a very different thing. Public symbols should be just that: public.
