Do you want some? I think it might help.
Perl: Can't disagree there, really.
PL/SQL: Right, because Oracle enterprise usage is dying out and no one creates new databases based on it anymore...
COBOL: This will continue to exist when there's nothing left but rats and cockroaches. Wanna know why? Because it *works*, it works on mainframes, and it works *fast*. The business processes it runs rarely change and the code is all a very well known quantity by now, so there's absolutely no need to change it any more than utterly necessary. Sorry to inform you, but it's going to be around for another 20 years at least.
(Hint: It would either be 'IF YOU = 1 THEN' or 'IF READ-THIS THEN').
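For anyone who hasn't bumped into COBOL conditionals, that hint is about 88-level condition names: a level-88 entry attaches a name to a particular value of a data item, so you can test the name instead of writing the comparison out. A minimal sketch (the data names YOU and READ-THIS are just the ones from the hint, nothing more):

```cobol
       WORKING-STORAGE SECTION.
       01  YOU            PIC 9 VALUE ZERO.
           88  READ-THIS  VALUE 1.

       PROCEDURE DIVISION.
           MOVE 1 TO YOU
      *    Both of these tests are now true and are equivalent:
           IF YOU = 1
               DISPLAY 'PLAIN COMPARISON'
           END-IF
           IF READ-THIS
               DISPLAY 'CONDITION NAME'
           END-IF
           STOP RUN.
```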
Visual COBOL? Fujitsu did that - I have a demo disk from somewhere around 2001.
COBOL++? Well, OO COBOL has been in existence since 1996/7 that I'm aware of, and doubtless from before that. Micro Focus were the first to do it that I came across, but the above-mentioned Fujitsu compiler also did OO.
Did you also know that COBOL.NET exists? Oh yes. Be afraid...
Then I was moved a little more.
And I was moved a little more still...
And of course, despite the damage she caused both the UK and the world at large, she will be given a state funeral...
You might want to try checking your facts before posting. Here's a hint: No she won't.
And of course, despite the damage she caused both the UK and the world at large, the Labour Party hacks will be out in force with nary a bad word to be said. (That's 'cause "New" Labour is just another party of capitalism, no longer socialism, if it ever was.)
She did some damage, she did some good (generally speaking, the ones who claim nothing beyond the damage are those who didn't suffer the three-day working week and its ilk). Such is the way of politicians. She made some tough decisions that had to be made, and she made some bloody awful decisions that we are still feeling the repercussions of today.
In the end, she made a large impact on world politics exactly when a large impact was required. Hindsight is a wonderful thing, but I'm not sure what the world stage would be like now if we'd had one of the current spineless idiots in charge in the latter days of the cold war.
A month might make you able to write fairly complex stuff, but it won't give you time to learn the best ways to write efficient, fast code, and COBOL, despite its apparent simplicity, makes it remarkably easy to write nasty (self-modifying, if you wish) resource-hogging evilness. If you're on mainframes, it'll be longer than that before you've figured out the full intricacies of things like Xpediter, or, if you're really unlucky, core dumps, which can be your only way to debug.
I've worked with COBOL since the mid 90s, so I'm still considered a noob in the field, but I've seen some horrors written by people with twice the experience I have and I've rarely seen *good* code written by people with anything less than a year of it on their CV.
Bear in mind also that most COBOL is still on mainframes, so chances are that as well as the language itself, you're going to have to learn DB2, JCL, CICS and suchlike. Mainframe assembly will also likely crop up on your radar, and in certain financial institutions, PL/1 - all linked into one big horrible mess. You might think you'll learn COBOL in a week, but almost no company using it for mission-critical stuff will let you within a mile of their production systems until you've a couple of years under your belt.
My other half has a remote desktop system, using a Java plugin, so she can work from home if required. Last week, all of a sudden, it didn't work, with just a 'plugin inactive' message on screen. Clicking on that took you to Software Update, which showed no available updates, because this is a Snow Leopard machine for which there wasn't an update yet.
There was no explanation of what was going on (plugins showed as allowed in the preferences pane), whether it was an issue with the remote desktop provider or Apple, or anything else. It was somewhat frustrating and took far longer to sort out than it should have, including raising a ticket with the remote desktop software provider, which we now have to cancel.
I do like Apple hardware and software under most circumstances, but this wasn't one of their better moves.
int factorial( int n )
{
    return (n > 1)? n * factorial( n - 1 ) : 1 ;
}
Either high or sober, that's what I'd have gone for. That aside, I think it's fairly obvious that the mantra would be 'design high, code sober'...
The fixed hardware and low power of the Pi is just begging for a lightweight, low footprint OS
There is one already. It's called RISC OS. Sure, it needs some work (like pre-emptive multitasking and SMP - okay, a *lot* of work), but it's small (the OS uses 6 MB of RAM) and it's very fast. And there's already a reasonable amount of software available for it, plus a working GCC implementation, so more can be ported.
It just needs volunteers. Preferably ones who will happily write hand-optimised ARM assembler...
They still have a commanding market share in many areas...
And that's the exact reason you're unlikely to see them reinvent themselves the way Apple did. Apple did it because they had no choice - they were getting their asses handed to them in every sector they were in, they were haemorrhaging money and were on the verge of bankruptcy. It was a do-or-die move.
Microsoft have no need to copy them. They may not be raising the roof on the stock indexes, but they're still making money and because of that, inertia will mean that they'll never look at the kind of radical solutions that Apple did; it's easier to play the safe game and make smaller profits for less risk.
Does the job for most situations that require any form of ID.