
Comment: Re:Positive pressure? (Score 1) 337

by lgw (#48934523) Attached to: Why ATM Bombs May Be Coming Soon To the United States

The chip requires a PIN to be entered. If you don't enter it correctly within three attempts, the card is rendered useless. And these do not have to be three consecutive attempts.

So even if you have the card, you are unable to make any purchases with it.

Turns out: not so much. As was predicted by the security community, there are flaws, and after a couple years the flaws were exploited, and the PIN is retrievable. This cycle has repeated (is chip-and-PIN in its 3rd generation now? it's at least the second).

Chip-and-PIN means only that the bank makes you liable for your stolen money, claiming "the card couldn't possibly have been stolen because magic". It solves a problem for the banks, and makes it worse for the consumer - shocking, I know.

Comment: Re:Maybe if Adobe fixed their broken updater... (Score 1) 194

by lgw (#48929217) Attached to: Adobe's Latest Zero-Day Exploit Repurposed, Targeting Adult Websites

Just because the shady back-alley freeware does it does not in any way make a good excuse for an AAA software vendor to do so.

And AAA vendors don't. Adobe products are simply shady back-alley freeware as proven by their installer. Java too, of course.

Comment: Re:Up next, automatic intelligence rating... (Score 4, Insightful) 217

by lgw (#48927721) Attached to: Anonymous No More: Your Coding Style Can Give You Away

For lack of mod points let me just say: beautiful!

It's like this in any engineering discipline:
* The apprentice doesn't do things by the book, for he thinks himself clever
* The journeyman does everything by the book, for he has learned the world of pain the book prevents
* The master goes beyond the book, for he understands why every rule is there and no longer needs the rules

Or put another way - the apprentice thinks he knows everything, the journeyman knows how little he knows, the master knows everything in the field, and still knows how little he knows.

Comment: Re:Poor Alan Kay (Score 1) 200

by lgw (#48927583) Attached to: Bjarne Stroustrup Awarded 2015 Dahl-Nygaard Prize

It's a problem when the default ASSERT macro expands to code with such #ifdefs (no joke - that was the norm everywhere I worked with C/C++). At one place it got so bad that we made using the ASSERT macro a firing offense (not sure why we couldn't just fix the macro, some corporate thing no doubt).
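A minimal sketch of the pitfall being described, using a hypothetical project-local ASSERT of the kind that was "the norm" - the macro names and functions here are illustrative, not from any specific codebase:

```cpp
#include <cassert>

// A typical project-local ASSERT that compiles away in release builds
// (hypothetical macro, mirroring the common pattern described above):
#ifdef NDEBUG
#define ASSERT(x) ((void)0)      // release: the expression vanishes entirely
#else
#define ASSERT(x) assert(x)
#endif

int open_resource(int* handle) {
    *handle = 7;                 // pretend to acquire something
    return 0;                    // 0 = success
}

int use_pitfall() {
    int h = 0;
    // BUG: in a release build, open_resource() is never called at all,
    // because the whole expression is #ifdef-ed away.
    ASSERT(open_resource(&h) == 0);
    return h;                    // debug builds return 7, release builds return 0
}
```

The behavior silently diverges between build configurations, which is exactly why banning (or fixing) such a macro is tempting.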

And I've been there and done that with the "no resource leaks" in C++. When you provide library code that's easier to use than doing it the wrong way, it's easy to enforce the standard in code reviews (since then it's only the new guy who hasn't seen how easy the tools are yet).

For example, if you have a good FileHandle class, it's simple to educate people to write FileHandle foo = fopen(...); instead of FILE, and then that's it, the file closes when you exit scope. Works perfectly as a member variable as well - no need to remind people that the destructor isn't called if the constructor throws, as members are always cleaned up.
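A minimal sketch of what such a FileHandle class might look like - the class here is hypothetical, just illustrating the RAII pattern the paragraph describes:

```cpp
#include <cstdio>
#include <stdexcept>

// Hypothetical minimal FileHandle: the FILE* is closed when the object
// leaves scope, whether by normal return or by an exception.
class FileHandle {
public:
    explicit FileHandle(FILE* f) : f_(f) {
        if (!f_) throw std::runtime_error("fopen failed");
    }
    ~FileHandle() { if (f_) std::fclose(f_); }
    FileHandle(const FileHandle&) = delete;            // no accidental double-close
    FileHandle& operator=(const FileHandle&) = delete;
    FILE* get() const { return f_; }
private:
    FILE* f_;
};

void write_greeting(const char* path) {
    FileHandle out(std::fopen(path, "w"));
    std::fputs("hello\n", out.get());
}   // file closed here, on return or on throw
```

As the comment notes, the same object works as a member variable: members that were fully constructed are destroyed even if the enclosing constructor later throws.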

Comment: Re:Incredible! (Score 1) 192

by lgw (#48923829) Attached to: Computer Chess Created In 487 Bytes, Breaks 32-Year-Old Record

A really minimal system, like a virtual machine running a site, can be reduced to far less than that. For instance, Mirage OS builds unikernels that strip the system down to just what the application needs.

We've seen a web server running on a Commodore 64. Wasn't that a 12 kB OS? It's been a while, but IIRC the OS was in memory from D000 to FFFF.

I've worked on mainframes where the "recovery OS" fit on one tape block - a hard 32 kB constraint (used for disaster recovery - it would load a program that also had to fit in 32 kB which would restore a system from backups). The normal OS wasn't much bigger. Most device drivers weren't memory resident, for example, and shared 4 kB by swapping in and out, which could lead to some mighty odd behavior by today's standards.

Comment: Re:Poor Alan Kay (Score 1) 200

by lgw (#48918683) Attached to: Bjarne Stroustrup Awarded 2015 Dahl-Nygaard Prize

Without exceptions, you would put in an assertion

Oh? You check for errors in code that gets #ifdef-ed out in a production build? What could possibly go wrong with that plan? (Or do you mean first the check, then the assert, following every function call, further hiding the few lines of business logic in a huge function).

It's quite easy to write "all exception safe all the time" code in C++, in ways that even the junior guys can't screw up - but it's not obvious what that coding standard looks like, and that's the big problem with C++. Many have never even seen it done right - it's very understandable why business largely moved to managed code.

People see RAII and think "oh, instead of allocate at the top and free at the bottom, I'll allocate in the constructor and free in the destructor". No, you're still doing it wrong if you have any non-trivial destructors outside of a bit of well-reviewed library code.

If you're doing it right, the only avenues for screwing up resource management are adding stuff to a global object and forgetting it there, as with every language.

Comment: Re:DVD (Score 1) 249

by lgw (#48917749) Attached to: Ask Slashdot: Best Medium For Personal Archive?

The cloud makes a great backup. If what you're archiving is small, encrypt it and upload it to a variety of cloud file companies with free offerings - Cloud Drive, OneDrive, DropBox, etc.

For a moderate amount of data, use (encryption and) Amazon Glacier. If you don't know the trick: Amazon offers "mail us a hard drive" as an upload option for S3 and Glacier, and it's as good a way to do offsite backups as any.

I wouldn't use the cloud as my only archive, but as the offsite copy it's probably more disaster-survivable than most other choices most of us have available. (And affordable if we're talking a few hundred GB of personal stuff, not the entire multi-TB geek archive of "binaries").

Comment: Re:Poor Alan Kay (Score 1) 200

by lgw (#48914793) Attached to: Bjarne Stroustrup Awarded 2015 Dahl-Nygaard Prize

If you don't check for an error due to sloppy coding, you get a failure sometime later which can be quite hard to debug. If you don't handle an exception, your program exits, and if you can repro the problem under a debugger, any good debugger will break where the exception is thrown - immediately debuggable. Which approach better protects customer data from bugs?

If you check for errors after every call, your program becomes 80% error checking, 20% business logic - needless clutter that obfuscates the code.

For a large enough C program you re-invent exceptions anyway. The return code from every function becomes the error code. The first thing you do after every function call is check for errors, and either handle the error locally if you can, or return it up the stack if you can't. If you make some handy macros for doing that, you might as well call them "try" and "catch" and "throw", since you're just doing what the compiler does with exceptions, except in a manual, tedious, and error-prone way.
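The hand-rolled propagation described above might look like this - a C-style sketch with an illustrative CHECK macro (the names and error codes are made up for the example):

```cpp
// Sketch of hand-rolled error propagation: every function returns an
// error code, and a macro forwards failures up the stack - a manual
// version of what "throw" does automatically.
#define CHECK(call)                 \
    do {                            \
        int rc_ = (call);           \
        if (rc_ != 0) return rc_;   \
    } while (0)

int step_one() { return 0; }          // succeeds
int step_two() { return 5; }          // fails with code 5

int do_work() {
    CHECK(step_one());   // rc is 0, execution continues
    CHECK(step_two());   // rc is 5, returned to the caller immediately
    return 0;            // never reached
}
```

Every call site carries the same boilerplate the compiler would otherwise generate for stack unwinding - which is the point being made.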

Really, this was an intelligent argument 20 years ago. The experiment was tried, the data is in, almost everyone moved to languages with exceptions because they make it easier to get it right, not out of some group masochism.

Comment: Re:Poor Alan Kay (Score 2) 200

by lgw (#48896749) Attached to: Bjarne Stroustrup Awarded 2015 Dahl-Nygaard Prize

True, because it is basically terrible for everything, it is terrible also for using it in the same way as C.

We get it, you don't like C++. I don't like strawberry ice cream.

Yes, RAII is nice. But only *some* memory and resource leaks go away, basically the ones which are trivial, because allocation and deallocation simply follow lexical scope. Of course, this is only trivial in languages which do not have exceptions. Exceptions make this simple thing very complicated, and without RAII it is indeed almost impossible to avoid resource leaks in C++. But without exceptions, it is not so much of a deal. In other words, RAII had to be invented after the fact to make exceptions usable in C++ because - again - some features were introduced without much thought.

Exceptions are absolutely the right way to do error handling. This was controversial last century, maybe? But it's more than simple RAII - if you have non-trivial destructors, you're likely doing it wrong. Shared_ptr combined with scoped objects fixes the non-trivial ones, and basically everyone uses shared_ptr for everything now. Perhaps over-used, but it gets it right.
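One way to read "shared_ptr combined with scoped objects fixes the non-trivial ones": give shared_ptr a custom deleter, so the cleanup logic lives in one reviewed place and no user-written destructor is ever non-trivial. A sketch (the open_shared helper is hypothetical):

```cpp
#include <cstdio>
#include <memory>

// Hypothetical helper: a shared_ptr owns the FILE* and carries its own
// deleter, so user code never writes a destructor for this resource.
std::shared_ptr<FILE> open_shared(const char* path, const char* mode) {
    return std::shared_ptr<FILE>(std::fopen(path, mode),
                                 [](FILE* f) { if (f) std::fclose(f); });
}
```

Copies share ownership freely; the file closes exactly once, when the last copy goes out of scope - including via an exception.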

This is only a tragedy for people who have to use C++ or think they have to. There is nothing more liberating than to realize that all this complexity of C++ is completely unnecessary.

I haven't paid much attention to D, but C++ is in a space where none of the other mainstream languages are. C is quite overused for lack of expertise in C++ - and Java likely is as well.

Comment: Re:Poor Alan Kay (Score 3, Insightful) 200

by lgw (#48896099) Attached to: Bjarne Stroustrup Awarded 2015 Dahl-Nygaard Prize

You can write very fast and elegant code in C++ just as easily as in C - it's just a different tool set. C++ is not for writing code using the same approach one uses with C; it's terrible for that. But once you understand scoped objects, all memory and resource leaks go away (well, you can attach something to a global structure and forget about it, but you can mess that up in any language). That alone is a huge win.

C++ has one terrible, fundamental flaw: the learning curve is too high. There's just about nothing where the "right way" is obvious, or even common. And so few people get to real expertise that there's not a common library that collects all those right ways and makes them easy to learn! It's a tragedy, really.

Comment: Re:Interstellar missions... (Score 1) 211

by lgw (#48895995) Attached to: At Oxford, a Battery That's Lasted 175 Years -- So Far

You could set up a mirror array to focus all the light of the Sun into a point. You still couldn't heat up an object there hotter than the surface of the Sun - it would be radiating heat away fast enough to stay at that temperature.

Temperature is a potential: like torque, or voltage difference. It limits what you can do, no matter how much light you focus, just like torque limits the force you can apply no matter how much power you have, or similar with voltage and current. For mechanical and electrical power, getting more potential (with the same total power, less losses) is easy - just add a gear or a transformer.

With light it's also possible, but it's not optics, and it's pretty rare - fluorescent materials which absorb multiple photons of a lower frequency and emit one of a higher actually do exist, and could passively raise the temperature of part of a system (much to the horror of thermodynamicists). It doesn't violate any conservation rules, any more than a low-temperature heat engine driving a high-temperature electric heater does. But that's not at all what's happening with mirrors and optics, which are like putting your batteries in parallel, not in series.

Comment: Re:Interstellar missions... (Score 1) 211

by lgw (#48895947) Attached to: At Oxford, a Battery That's Lasted 175 Years -- So Far

When light pressure is the dominant force, balancing gravity, and the energy of the system is dominated by the energy of the photons and electrons, conduction isn't playing a big role, percentage wise. The difference between 5/2 power and 4th power means the latter dominates at millions of Kelvin, no?
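The scaling argument above can be sketched roughly, assuming Spitzer-type thermal conduction in a plasma (which carries the 5/2 power mentioned) and Stefan-Boltzmann radiation:

```latex
% Thermal conduction in a fully ionized plasma (Spitzer):
q_{\mathrm{cond}} \sim \kappa_0 \, T^{5/2} \, \nabla T
% Blackbody radiative flux (Stefan-Boltzmann):
q_{\mathrm{rad}} \sim \sigma T^{4}
% Their ratio grows with temperature (ignoring geometric factors):
\frac{q_{\mathrm{rad}}}{q_{\mathrm{cond}}} \propto T^{3/2}
```

So, up to gradients and geometry, the fourth-power term does indeed pull ahead as temperatures climb into the millions of kelvin.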

Comment: Re:Interstellar missions... (Score 1) 211

by lgw (#48891843) Attached to: At Oxford, a Battery That's Lasted 175 Years -- So Far

Photons also have a temperature in the sense that it's the maximum temp you can raise a blackbody to (or maintain it at) no matter how many photons at that frequency you use (the blackbody curve is for an ideal gas, so the idea doesn't extend well to ionizing radiation).

what exactly the physical mechanism is by which kinetic energy causes photons to be emitted

At the level I understand it: knock two molecules together, and sometimes you get an electron elevated to a higher energy state instead of an elastic collision. The difference between that energy state and the ground state is the energy of the photon emitted, and thus its color.

But that model is really for gasses - get hot enough and the electrons take their time returning to a ground state, or start flying off, and the rules change - the phase transition to plasma - and the kinetic energy of the nuclei starts to fade as the dominant heat energy. Hot enough and you have a sea of free electrons constantly exchanging energy via photons (which don't get very far at the density required for such temperatures). Those conditions are a far, far better insulator than empty space, but at the boundary these very high energy photons simply escape: a very different heat-to-light mechanism than in a gas (and simpler than "normal" plasma, which I don't understand at all).

Empty space is transparent, which is another way of saying it provides no insulation at all for radiative heat. At the center of the Sun it's so opaque that no English word can really do it justice: it takes millions of years for the heat at the center of the Sun to reach a layer where convection is meaningful.

Comment: Re:Bullshit (Score 2) 211

by lgw (#48891779) Attached to: At Oxford, a Battery That's Lasted 175 Years -- So Far

Well, put a AA in a box and come back in 175 years, and try it out. Then we'll see how impressive that is.

Oh-ho, smart guy, see how many of those you'd sell!

It is impressive though. Torpedoes need a high-power battery that can be stored for many years and still be at 100% when needed. They used to cheat, though, and use a wet cell with the chemicals stored separately - mix everything together when it's time to load, and you're ready to go. No leakage unless there's actual leakage. I wonder what they do today - a dry cell with no leakage would be safer and easier.
