Comment Re:Easier to Analyze or Change == More Maintainable (Score 2) 247
I hope not.
... There are things it just isn't well suited for...
Like writing computer programs?
1972 called, they're still pissed you returned their 'write-only memory' joke in such poor condition.
I've been saying that for 15 years. I just hope I don't need to wait another 15.
but I go back far enough to remember the horrible spaghetti code that people used to write
So do I, though I still think Dijkstra was wrong about that. I used to get skewered for expressing that opinion, however. Not that it matters much; I've found most developers are a bit overzealous when it comes to defending their treasured folk-knowledge.
I was in before the OOP craze, like you were, and I thought it was just going to be a passing fad -- like countless fads before and after. I have no explanation for its sticking power, save the early popularity of Java and Microsoft's subsequent clone, C#. I figure it would have been dead before the new century had Sun and Microsoft not tried to cash in on it. It's a shame MS's ploy to fragment Java failed. It's one evil plan that might have done us some good!
Fortunately for us, it is weakening. Sacred cows are starting to look like the mistakes they always were. The hipster developers are even promoting composition over inheritance. (And not a moment too soon. I've seen a lot of talk about multiple inheritance lately. I thought we'd already learned our lesson about that!) A lot of young developers are even learning what modularity actually entails, and how OOP is inherently anti-modular. (It used to be a popular belief that OOP gave you modularity for free! It looks foolish in hindsight, I know, but that was the marketing buzz.) It gives me a bit of hope for the future.
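For anyone who hasn't run into the phrase: "composition over inheritance" just means holding a component and delegating to it instead of subclassing to acquire behavior. A minimal toy sketch (my own illustrative names, not anyone's real API):

```python
# Composition: Car *has an* Engine and delegates to it,
# rather than inheriting from some EngineVehicle base class.

class Engine:
    def start(self):
        return "engine running"

class Car:
    def __init__(self, engine):
        self.engine = engine  # composed, not inherited

    def start(self):
        # delegate to the component
        return self.engine.start()

car = Car(Engine())
print(car.start())  # -> engine running
```

The payoff is modularity: you can swap the composed part (an ElectricEngine, a mock for testing) without touching the class hierarchy at all.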
So I'll keep my fun Dijkstra quote, to lend support to the next generation who will cast off our mistakes. It looks like they're trending toward an imperative+functional era. It'll be interesting to see what comes out of that.
Yes, but it tends not to combust spontaneously.
Ah, so you're in on the conspiracy!
I considered that as well, and pulled down the picture from multiple sources.
I'm convinced that this is just a joke that I've missed. I've tried good displays, crappy displays, various lighting, brightness settings, backgrounds, room lighting, viewing angles, and probably something I've forgotten. I cannot get that dress to look white and gold.
It reminds me of "The Emperor's New Clothes".
Why are you telling me? I don't care. Neither, very likely, does Dijkstra. It's just a fun throwaway comment he made.
Is the accuracy of the origin of either the term or the concepts essential to the quote? No.
Relax. It's not worth the time.
No, you didn't. You just suck at reading.
Nowhere do I imply that the icons are appealing -- only that they're rendered with skill that exceeds that of the average six-year-old. If a six-year-old had produced those icons, I'd be very impressed. As they're presumably the work of a professional designer, they're absolutely awful.
Well, it's very basic logic. It just takes a long time to explain, in very painful detail, to someone without a background in formal logic. To see my perspective, try to explain something like 4=2+2 in a post under the assumption that anything you write could be challenged, regardless of relevance, by someone with no understanding of basic arithmetic. Oh, and without being able to type common operators. It'll be a pain in the ass, take forever, and is very unlikely to produce positive results.
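For a taste of what "painful detail" means here: even 2+2=4 unfolds into a chain of successor steps when you spell it out formally. A quick Lean sketch (the comments show the unfolding; this is just an illustration, not part of the argument above):

```lean
-- Unfolding 2 + 2 = 4 from the successor definition of addition
-- (a + succ b = succ (a + b), a + 0 = a):
--   2 + 2 = 2 + succ 1 = succ (2 + 1)
--         = succ (2 + succ 0) = succ (succ (2 + 0))
--         = succ (succ 2) = 4
example : 2 + 2 = 4 := rfl
```

And that's the easy version, with a proof assistant doing the bookkeeping. Now imagine doing it in a comment box, one challenged step at a time.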
I think you replied to the wrong post.
You can't choose to believe or not believe.
Really. Don't just take my word for it. Try it out.
Not really. The image I have is of an isolated village in the south seas fruitlessly building runways.
Is your job running? You'd better go catch it!