Comment Re:Up next, automatic intelligence rating... (Score 4, Insightful) 220

For lack of mod points let me just say: beautiful!

It's like this in any engineering discipline:
* The apprentice doesn't do things by the book, for he thinks himself clever
* The journeyman does everything by the book, for he has learned the world of pain the book prevents
* The master goes beyond the book, for he understands why every rule is there and no longer needs the rules

Or put another way - the apprentice thinks he knows everything, the journeyman knows how little he knows, the master knows everything in the field, and still knows how little he knows.

Comment Re:Poor Alan Kay (Score 1) 200

It's a problem when the default ASSERT macro expands to code with such #ifdefs (no joke - that was the norm everywhere I worked with C/C++). At one place it got so bad that we made using the ASSERT macro a firing offense (not sure why we couldn't just fix the macro, some corporate thing no doubt).
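For what it's worth, here's a minimal sketch of the kind of ASSERT macro being complained about (the NDEBUG-style convention and the file names are my assumptions, not anything from a specific codebase). The danger is that the whole argument expression disappears in release builds, including any real work someone buried inside it:

    #include <cstdio>
    #include <cstdlib>

    // Classic debug-only assert: in a release (NDEBUG) build the entire
    // expression is compiled out, side effects and all.
    #ifdef NDEBUG
      #define ASSERT(expr) ((void)0)
    #else
      #define ASSERT(expr) \
        ((expr) ? (void)0 \
                : (std::fprintf(stderr, "assert failed: %s (%s:%d)\n", \
                                #expr, __FILE__, __LINE__), std::abort()))
    #endif

    int main() {
        std::FILE* f = nullptr;
        // Debug build: opens the file and aborts loudly if that fails.
        // Release build: the fopen call vanishes along with the check,
        // so f silently stays null.
        ASSERT((f = std::fopen("config.txt", "r")) != nullptr);
        if (f) std::fclose(f);
        return 0;
    }

Debug-only asserts are fine for checking invariants; the trouble starts when people put the actual error handling inside them.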

And I've been there and done that with the "no resource leaks" in C++. When you provide library code that's easier to use than doing it the wrong way, it's easy to enforce the standard in code reviews (since then it's only the new guy who hasn't seen how easy the tools are yet).

For example, if you have a good FileHandle class, it's simple to educate people to write FileHandle foo = fopen(...); instead of a raw FILE*, and then that's it - the file closes when you exit scope. It works perfectly as a member variable as well - no need to remind people that the enclosing object's destructor isn't called if its constructor throws, since fully constructed members are always cleaned up.
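Something like this is one possible sketch of that FileHandle class - the name comes from the comment above, the rest (the helper function, the file name) is illustrative - and it supports exactly the FileHandle foo = fopen(...) usage:

    #include <cstdio>

    // A sketch of a FileHandle class: wraps the raw FILE* so the
    // destructor closes it when the object leaves scope, whether by
    // normal return or by an exception.
    class FileHandle {
    public:
        FileHandle(std::FILE* f) : file_(f) {}   // allows: FileHandle foo = fopen(...);
        ~FileHandle() { if (file_) std::fclose(file_); }

        // Movable but not copyable: exactly one owner of the FILE*.
        FileHandle(FileHandle&& other) noexcept : file_(other.file_) { other.file_ = nullptr; }
        FileHandle(const FileHandle&) = delete;
        FileHandle& operator=(const FileHandle&) = delete;

        std::FILE* get() const { return file_; }
        explicit operator bool() const { return file_ != nullptr; }

    private:
        std::FILE* file_;
    };

    void appendLine(const char* msg) {           // illustrative helper
        FileHandle foo = std::fopen("app.log", "a");
        if (foo) std::fputs(msg, foo.get());
    }   // no fclose needed - the destructor runs when foo goes out of scope

The same idea carries over to member variables: the member's destructor runs during unwinding even if the enclosing constructor throws partway through.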

Comment Re:Incredible! (Score 1) 204

A really minimal system, like a virtual machine running a site, can be reduced to far less than that. For instance, Mirage OS (http://www.openmirage.org).

We've seen a web server running on a Commodore 64. Wasn't that a 12 kB OS? It's been a while, but IIRC the OS was in memory from D000 to FFFF.

I've worked on mainframes where the "recovery OS" fit on one tape block - a hard 32 kB constraint (used for disaster recovery - it would load a program that also had to fit in 32 kB which would restore a system from backups). The normal OS wasn't much bigger. Most device drivers weren't memory resident, for example, and shared 4 kB by swapping in and out, which could lead to some mighty odd behavior by today's standards.

Comment Re:Poor Alan Kay (Score 1) 200

Without exceptions, you would put in an assertion

Oh? You check for errors in code that gets #ifdef-ed out in a production build? What could possibly go wrong with that plan? (Or do you mean first the check, then the assert, after every function call, further burying the few lines of business logic in a huge function?)

It's quite easy to write "all exception safe all the time" code in C++, in ways that even the junior guys can't screw up - but it's not obvious what that coding standard looks like, and that's the big problem with C++. Many have never even seen it done right, so it's very understandable why business largely moved to managed code.

People see RAII and think "oh, instead of allocate at the top and free at the bottom, I'll allocate in the constructor and free in the destructor". No, you're still doing it wrong if you have any non-trivial destructors outside of a bit of well-reviewed library code.

If you're doing it right, the only avenues for screwing up resource management are adding stuff to a global object and forgetting it there, as with every language.
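To make that concrete, here's a rough sketch of what this kind of coding standard produces in practice - every resource owned by a scoped library type, nothing hand-rolled (the file name and buffer size are just placeholders):

    #include <cstdio>
    #include <memory>
    #include <mutex>
    #include <stdexcept>
    #include <vector>

    std::mutex g_mutex;

    // Every resource is owned by a scoped library type, so any throw
    // simply unwinds the stack and everything cleans itself up.
    void process(const char* path) {
        std::lock_guard<std::mutex> lock(g_mutex);      // unlocked on any exit

        auto closer = [](std::FILE* f) { if (f) std::fclose(f); };
        std::unique_ptr<std::FILE, decltype(closer)>
            file(std::fopen(path, "rb"), closer);       // closed on any exit
        if (!file) throw std::runtime_error("open failed");

        std::vector<char> buffer(64 * 1024);            // freed on any exit
        std::fread(buffer.data(), 1, buffer.size(), file.get());

        // ... business logic only - no cleanup code anywhere ...
    }

The point is that the only destructors involved are the library's; nobody on the team writes one.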

Comment Re:DVD (Score 1) 251

The cloud makes a great backup. If what you're archiving is small, encrypt it and upload it to a variety of cloud file companies with free offerings - Cloud Drive, OneDrive, Dropbox, etc.

For a moderate amount of data, use (encryption and) Amazon Glacier. If you don't know the trick: Amazon offers "mail us a hard drive" as an upload option for S3 and Glacier, and it's as good a way to do offsite backups as any.

I wouldn't use the cloud as my only archive, but as the offsite copy it's probably more disaster-survivable than most other choices most of us have available. (And affordable if we're talking a few hundred GB of personal stuff, not the entire multi-TB geek archive of "binaries").

Comment Re:Poor Alan Kay (Score 1) 200

If you don't check for an error due to sloppy coding, you get a failure sometime later which can be quite hard to debug. If you don't handle an exception, your program exits, and if you can repro the problem under a debugger, any good debugger will break where the exception is thrown - immediately debuggable. Which approach better protects customer data from bugs?

If you check for errors after every call, your program becomes 80% error checking, 20% business logic. Needless clutter that obfuscates the code.

For a large enough C program you re-invent exceptions anyway. The return code from every function becomes the error code. The first thing you do after every function call is check for errors, and either handle the error locally if you can, or return it up the stack if you can't. If you make some handy macros for doing that, you might as well call them "try" and "catch" and "throw", since you're just doing what the compiler does with exceptions, except in a manual, tedious, and error-prone way.
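A sketch of what those hand-rolled macros usually end up looking like (the names and files here are made up, not from any particular codebase):

    #include <stdio.h>

    /* Every function returns an error code, and CHECK propagates any
     * failure up the stack - a manual version of throw/rethrow. */
    typedef int err_t;
    #define OK 0

    #define CHECK(call)                  \
        do {                             \
            err_t err_ = (call);         \
            if (err_ != OK) return err_; \
        } while (0)

    err_t read_config(const char* path) {
        FILE* f = fopen(path, "r");
        if (!f) return 1;                       /* the "throw" */
        /* ... parse ... */
        fclose(f);
        return OK;
    }

    err_t start_service(void) {
        CHECK(read_config("service.conf"));     /* the "rethrow" */
        /* ... more CHECKed calls ... */
        return OK;
    }

    int main(void) {
        err_t err = start_service();
        if (err != OK) {                        /* the "catch" at the top */
            fprintf(stderr, "startup failed with code %d\n", err);
            return err;
        }
        return 0;
    }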

Really, this was an intelligent argument 20 years ago. The experiment was tried, the data is in, almost everyone moved to languages with exceptions because they make it easier to get it right, not out of some group masochism.

Comment Re:Poor Alan Kay (Score 2) 200

True, because it is basically terrible for everything, it is also terrible when used in the same way as C.

We get it, you don't like C++. I don't like strawberry ice cream.

Yes, RAII is nice. But only *some* memory and resource leaks go away, basically the ones which are trivial, because allocation and deallocation simply follow lexical scope. Of course, this is only trivial in languages which do not have exceptions. Exceptions make this simple thing very complicated, and without RAII it is indeed almost impossible to avoid resource leaks in C++. But without exceptions, it is not so much of a deal. In other words, RAII had to be invented after the fact to make exceptions usable in C++ because - again - some features were introduced without much thought.

Exceptions are absolutely the right way to do error handling. This was controversial last century, maybe? But it's more than simple RAII - if you have non-trivial destructors, you're likely doing it wrong. Shared_ptr combined with scoped objects fixes the non-trivial ones, and basically everyone uses shared_ptr for everything now. Perhaps over-used, but it gets it right.
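As a rough illustration of that point (the Log class is just something made up for the example): push the non-trivial cleanup into shared_ptr's deleter and the class itself needs no destructor at all.

    #include <cstdio>
    #include <memory>
    #include <string>

    // The non-trivial cleanup lives in shared_ptr's deleter, so the class
    // needs no hand-written destructor and copies share the handle safely.
    class Log {
    public:
        explicit Log(const std::string& path)
            : file_(std::fopen(path.c_str(), "a"),
                    [](std::FILE* f) { if (f) std::fclose(f); }) {}

        void write(const std::string& line) {
            if (file_) std::fputs((line + "\n").c_str(), file_.get());
        }

    private:
        std::shared_ptr<std::FILE> file_;   // last copy to go closes the file
    };

    int main() {
        Log a("app.log");
        Log b = a;              // both refer to the same FILE*
        a.write("hello");
        b.write("world");
    }   // compiler-generated destructors; the file is closed exactly once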

This is only a tragedy for people who have to use C++ or think they have to. There is nothing more liberating than to realize that all this complexity of C++ is completely unnecessary.

I haven't paid much attention to D, but C++ is in a space where none of the other mainstream languages are. C is quite overused for lack of expertise in C++ - and Java likely is as well.

Comment Re:Poor Alan Kay (Score 3, Insightful) 200

You can write very fast and elegant code in C++ just as easily as in C - it's just a different tool set. C++ is not for writing code using the same approach one uses with C; it's terrible for that. But once you understand scoped objects, all memory and resource leaks go away (well, you can attach something to a global structure and forget about it, but you can mess that up in any language). That alone is a huge win.

C++ has one terrible, fundamental flaw: the learning curve is too high. There's just about nothing where the "right way" is obvious, or even common. And so few people get to real expertise that there's not a common library that collects all those right ways and makes them easy to learn! It's a tragedy, really.

Comment Re:Interstellar missions... (Score 1) 211

You could set up a mirror array to focus all the light of the Sun into a point. You still couldn't heat up an object there hotter than the surface of the Sun - it would be radiating heat away fast enough to stay at that temperature.
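In rough symbols (a back-of-the-envelope version, assuming the target radiates approximately as a blackbody with equal absorbing and radiating area A, and that ideal passive optics can at best reproduce the Sun's surface flux at the target - the brightness/etendue limit):

    \sigma T_{\text{obj}}^{4} A \;=\; P_{\text{absorbed}} \;\le\; \sigma T_{\odot}^{4} A
    \quad\Longrightarrow\quad T_{\text{obj}} \;\le\; T_{\odot} \approx 5800\ \text{K}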

Temperature is a potential: like torque, or voltage difference. It limits what you can do, no matter how much light you focus, just like torque limits the force you can apply no matter how much power you have, or similar with voltage and current. For mechanical and electrical power, getting more potential (with the same total power, less losses) is easy - just add a gear or a transformer.

With light it's also possible, but it's not optics, and it's pretty rare - fluorescent materials which absorb multiple photons of a lower frequency and emit one of a higher frequency actually do exist, and could passively raise the temperature of part of a system (much to the horror of thermodynamicists). It doesn't violate any conservation rules, any more than a low-temperature heat engine driving a high-temperature electric heater does. But that's not at all what's happening with mirrors and optics, which are like putting your batteries in parallel, not in series.

Comment Re:Interstellar missions... (Score 1) 211

Photons also have a temperature in the sense that it's the maximum temp you can raise a blackbody to (or maintain it at) no matter how many photons at that frequency you use (the blackbody curve is for an ideal gas, so the idea doesn't extend well to ionizing radiation).

what exactly the physical mechanism is by which kinetic energy causes photons to be emitted

At the level I understand it: knock two molecules together, and sometimes you get an electron elevated to a higher energy state instead of an elastic collision. The difference between that energy state and the ground state is the energy of the photon emitted, and thus its color.
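In symbols (the standard relation, nothing specific to this thread) - the bigger the energy gap, the higher the frequency and the bluer the light:

    E_{\gamma} \;=\; E_{\text{excited}} - E_{\text{ground}} \;=\; h\nu \;=\; \frac{hc}{\lambda}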

But that model is really for gasses - get hot enough and the electrons take their time returning to a ground state, or start flying off, and the rules change - the phase transition to plasma - and the kinetic energy of the nuclei starts to fade as the dominant form of heat energy. Hot enough and you have a sea of free electrons constantly exchanging energy via photons (which don't get very far at the density required for such temperatures). Those conditions are a far, far better insulator than empty space, but at the boundary these very high energy photons simply escape: a very different heat-to-light mechanism than in a gas (and simpler than "normal" plasma, which I don't understand at all).

Empty space is transparent, which is another way of saying it provides no insulation at all against radiative heat loss. At the center of the Sun it's so opaque that no English word can really do it justice - it takes millions of years for the heat at the center of the Sun to reach a layer where convection is meaningful.

Comment Re:Bullshit (Score 2) 211

Well, put an AA in a box and come back in 175 years, and try it out. Then we'll see how impressive that is.

Oh-ho, smart guy, see how many of those you'd sell!

It is impressive though. Torpedoes need a high-power battery that can be stored for many years and still be at 100% when needed. They used to cheat, though, and use a wet cell with the chemicals stored separately - mix everything together when it's time to load, and you're ready to go. No leakage unless there's actual leakage. I wonder what they do today - a dry cell with no leakage would be safer and easier.

Comment Re:I thought they're making money... (Score 1) 201

You'd think so, but it's a long, slow buildout to get that return, so growth! would be slow. Companies don't much care about stable profits, since that just means a stock price that stays flat, no it has to be about growth! Without growth! how does a CEO prove he's the guy to make your stock price go up?

It's the most infuriating thing about modern America, really - everyone's chasing capital gains, and dividends are often seen as a bad thing. For a long time there was a good tax reason for that - that's largely fixed now - but the culture is stuck on growth! regardless.
