I'm not an expert in functional programming, but I am an expert in other styles (object-oriented, etc.). While I appreciate the functional toolbox in languages such as Scala (which I use every day), I don't really see a way to do my day-to-day job in a purely functional way. Others have mentioned the I/O dilemma, but I think it goes deeper than that. Functional != efficient for many of the tasks I perform, which are rather iterative. For many of my tasks, the functional structures required are either much more memory intensive or impose a run-time overhead that isn't acceptable. In the end, when what I have to do is move 300 fields from one data structure to another with edits, COBOL would be sufficient...
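To make the overhead point concrete, here's a toy Python sketch (the record type, field names, and edits are invented for illustration, not taken from any real workload): a purely functional style rebuilds a fresh immutable record for every edit, allocating an intermediate object per step, while the imperative style builds one mutable target and edits it in place.

```python
from dataclasses import dataclass, replace

# Hypothetical record; a few fields standing in for "300 fields".
@dataclass(frozen=True)
class Record:
    name: str
    qty: int
    price: float

def copy_functional(src: Record) -> Record:
    # Purely functional: each edit returns a brand-new immutable record,
    # so a chain of edits allocates one intermediate object per step.
    step1 = replace(src, qty=src.qty + 1)
    step2 = replace(step1, price=round(step1.price * 1.1, 2))
    return step2

def copy_imperative(src: Record) -> dict:
    # Imperative: build one mutable target and apply edits in place,
    # with no intermediate records.
    out = {"name": src.name, "qty": src.qty, "price": src.price}
    out["qty"] += 1
    out["price"] = round(out["price"] * 1.1, 2)
    return out

r = Record("widget", 10, 2.50)
print(copy_functional(r))
print(copy_imperative(r))
```

Both produce the same edited values; the difference is in allocation, which is exactly where the memory and run-time cost shows up when the record has hundreds of fields and the edit chain is long.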
In related news, Mr. Greenspan has no clue about inequity in stratified markets. If you push on the top, you just compress the layers into smaller layers, with the bottom filling until it can absorb no more. Then you get slums, riots, and chaos. The only way the market works is with a strong middle class with buying potential. Without that there is no market, and hence no profits or growth. Once that contract is broken, it's not a long way to the bottom for most.
FAIL. Try Interplanetary.
Listen to the Beastie Boys...
Interstellar would be a cool trip also, and more likely to turn up life than under 100 miles of ice on Europa. Of course there is that extra mileage charge on the rental, and the roaming fees would bankrupt you...
I was a big fan, and a game developer for the C64. Those were the days when a machine could be fully understood by an untrained person with a knack for programming. When the C128 came out, I was interested, especially in the 80-column screen and CP/M software compilers. But there were too many limits on the machine (no hard drive easily added, no real OS, etc.) and it didn't feel like enough of an advancement over the C64. My grandfather did buy one, and I had some time with his, but that never really sparked much either. My next machine would be the Amiga, and as soon as that became somewhat affordable for a college student (the A500), I never looked back.
This is all well and fine, until they herd us all into some kind of processing center and then hook us up like some kind of "D" cell in series to power the mastermind machine...
Richmond Science Museum, on the E&S DigiStar projector - we could play a space-war variant on the dome. No color, of course, but the resolution was pretty good if my memory serves me right. Plus the dials of the control panel were just about perfect for controls.
Drain the fuel, set it upright, patch it up, tow it to Atlantic City - Profit!
(Drop it Lake Mead - Profit!)
(Park it outside Boston - Lawsuit!)
Among my customer base? Yes, it's used internally. A lot of them are IT shops dealing with very old equipment, like 10-year-old PCs. Some of them have internal intranet apps that only work on IE6. It will be a while before those move.
I'll pop the cork when my customers get off IE6. Until then I need to sink development resources into maintaining and testing on IE6, no matter how painful it is.
Unfortunately my customers' IT departments are slow moving and not motivated to move quickly off XP and IE6. Most of them are understaffed and underfunded, and dealing with PCs that are sometimes more than 10 years old. I suppose they have more pressing problems, given that...
I think the fallacy in this argument is not that quality doesn't win out, but that quality isn't always important.
The problem is that the determination process is flawed.
I might make the decision that I need lesser quality (whatever that means) for an internal time-keeping application than I do for something customer-facing, such as my sales portal. The article is of course arguing that I shouldn't be making that decision based on initial cost but on longer-term factors; but on the management side of things, as long as I've got a fixed budget rooted in the short term, I can't weigh those factors equally. Like many financial equations, X dollars today vs. X dollars tomorrow is in play.
There are two types of fools:
1. The fools who trust in the optimization skills of the compiler/JIT compiler
2. The fools who trust in their own optimization skills
Yeah, but there are rules for them:
1. Don't optimize.
2. Don't optimize YET.
Rule 1 is for type 1 - and is generally the best case. Then, after rule 2 has expired, you can come along and make the improvement where it matters. Type 2 fools skip both rules and make a mess.
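The two rules above can be sketched in a toy Python example (the function names and the "hot spot" are invented for illustration): write the obvious version first, measure, and only then swap in the optimized version where the numbers say it matters.

```python
import timeit

# Rules 1 and 2: start with the obvious version and don't optimize yet.
def sum_of_squares_simple(n: int) -> int:
    total = 0
    for i in range(n):
        total += i * i
    return total

# Only after measurement shows this is a hot spot, replace it with a
# closed-form version: sum of i^2 for i in 0..n-1 = (n-1)n(2n-1)/6.
def sum_of_squares_fast(n: int) -> int:
    m = n - 1
    return m * (m + 1) * (2 * m + 1) // 6

# Measure before believing either version is "better".
n = 10_000
assert sum_of_squares_simple(n) == sum_of_squares_fast(n)
t_simple = timeit.timeit(lambda: sum_of_squares_simple(n), number=100)
t_fast = timeit.timeit(lambda: sum_of_squares_fast(n), number=100)
print(f"simple: {t_simple:.4f}s  fast: {t_fast:.4f}s")
```

The assertion is the important part: the "improvement" only counts if it provably computes the same answer, and the timing only counts if you actually ran it.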
Jack was a charismatic person with an infectious personality. He was always genuine, and had a passion for teaching astronomy. I was traveling and visiting various planetariums up and down the East Coast, with a final stop in Miami to visit the Space Transit. Jack made me feel very welcome and gave me a ton of his time explaining what made his planetarium special. Eventually I came to understand that it isn't the equipment (although that draws the public in initially), but the people that make these programs successful. Jack Horkheimer brought the wonder of the universe down to earth for many people, and I'm glad to have known him, even if only for a short while.
... that THAT didn't go on for too long and they got 'em in a timely manner - I mean if that had kept up, millions of machines could have been compromised! I say, good thing they had LOTS of people investigating so we could catch these crooks before the damage was done.
(Yes, for the impaired, that's sarcasm!)
Two years to track this down?! Give me a break...
Not completely true - the old 68000-series Macs had lots of different ROM revisions. Some worked with different versions of MacOS, but others didn't. The problem wasn't the ROMs, however - it was memory. Remember, back in '84-'87, 128K-512K was fairly standard, so if you used up a big chunk of that with OS code, you reduced the memory left for user applications and graphics. Later versions of AmigaOS could do tricks and map various ROM routines into RAM, and even map the entire ROM into faster RAM using the MMU, giving the machine a good speed boost in the process.
With mirrors! Seriously, I saw a "tank" 3D system back in the late 80's/early 90's hooked up to an E&S display system.