Try Ars Technica. Really.
About your signature, let me recommend you Ars Technica.
Stop spreading nonsense.
There's no such merge, and I challenge you to prove otherwise.
Corruption, on the other hand, is a cancer that no country can claim to be free of.
That is what SuperKendall said ("Even if you obfuscated the string
Don't get me wrong, please.
I have loved KDE since 1.x, and I've always hated GNOME since it shipped with RH 5.2.
But I've been waiting for KDE and the whole Linux desktop experience to be good, and 11 years have passed.
Today, I have embraced Mac OS X for my personal desktop, and *love* it: I'm not looking back any more. I use OpenBSD for most of my servers, of course; and for the PCs at work, I have succumbed to Ubuntu, a very customized Ubuntu that doesn't expose a whole desktop, just a dock with only the applications the users need for work.
I fail to see why an easy-to-use desktop on Linux is needed any more, because all three of my uses for a computer are already perfectly covered.
Not that I would reject using Linux as a desktop somewhere. Heck, I have used OpenBSD as a desktop, but only because I wanted to contribute to the project or learn its internals. And for my users, it's better for them and for the company/institution to forget that they have a "Personal Computer": they don't; the machine in front of them is just a tool to get their job done, and a whole desktop is overkill for that.
Wow, I'm surprised. But it actually explains some things.
I can't find a reference about it. Do you have a link?
I would add "powerful" to your points (which I guess are limited to the desktop):
* Mac OS X: usable and powerful (great UI + great foundation)
* Windows: just plain convenient, thanks to the size of the install base and people being familiar with it
* GNU/Linux: powerful, but not usable
That being said, I'm actually using all three of them at work:
* GNU/Linux for the people responsible for a few very specific tasks, for which Ubuntu has been customized.
* Windows for the yet-to-be-converted PCs, because of in-house systems or 3rd-party software that require Windows and are still waiting for an alternative.
* Mac OS X for people who know better. Which means the IT department.
.NET is definitely MORE powerful than Objective-C in a general-purpose sort of way (Objective-C might have more depth in specific targeted areas like GUI widgets, but the breadth of massive frameworks like C# and Java is truly vast)
Having more libraries doesn't make a language more powerful; in any case, it would make it more productive.
The hard link you have in your mind between Objective-C (or the libraries available to it, since you seem to treat them as the same thing) and GUI widgets is supported only by your ignorance.
Besides, I don't get what the big deal is about learning Objective-C. It's a true C superset, heavily inspired by Smalltalk. Ruby programmers would feel at home. Who wouldn't like to think in Smalltalk or Ruby while writing real-world solutions at close-to-the-metal speed?
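As a rough illustration of why Ruby programmers would feel at home (plain standard Ruby here, nothing Apple-specific), Ruby shares the Smalltalk-style model that Objective-C inherits, where a method call is a message sent to an object:

```ruby
# In the Smalltalk model that Objective-C inherits, a method call is a
# message sent to an object. Ruby exposes the same idea directly:
s = "hello"

a = s.upcase          # ordinary call syntax
b = s.send(:upcase)   # explicit message send, akin to [s upcase] in Objective-C

puts a == b           # prints "true"
```

The explicit `send` form makes the message-passing model visible, which is exactly the mental shift a C programmer makes when moving to Objective-C's bracketed message syntax.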
I'm honestly surprised at how ignorant and lazy the regular slashdotter has become over the years.
Any self-respecting geek should already be keeping up to date with Apple's advances, which are impacting, and will keep impacting, technology in the years to come.
If you people haven't noticed already, Apple has been consistently releasing libraries and server software as open source projects for the rest of us to pick up, use, and modify, under liberal licenses.
A friend of mine used to say (I can't remember exactly, so I'm paraphrasing):
* Microsoft wants all software to be theirs
* GNU wants all software to be free
* BSD wants all software to be better
And releasing GCD, gentlemen, is another masterstroke by Apple, just like WebKit, Bonjour, LLVM, and so on: sharing knowledge and advancing technology by merit, not by forcing it down your throat thanks to a monopoly you have been handed.
The term "block" is familiar to Ruby programmers. It's an old concept that Ruby made easy to use, and hence popular and actually useful.
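For readers who haven't met them, here's a minimal sketch of a Ruby block (plain Ruby; the `twice` method is just an illustrative name, not part of any library):

```ruby
# A block is an anonymous piece of code passed alongside a method call;
# the method runs it with `yield`. It is also a closure: it captures
# local variables from the surrounding scope.
def twice
  yield 1
  yield 2
end

results = []                     # captured by the block below
twice { |n| results << n * 10 }
p results                        # prints [10, 20]
```

The low syntactic cost is the whole point: this is an ordinary higher-order function, but it reads like built-in control flow, which is why blocks get used everywhere in Ruby code.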
And here's another lesson that OpenBSD, Apple, and Ruby have been putting to work without you guys noticing: any technology that is difficult to use will not be used, no matter how good it is, if it gets in your way; to be actually used and useful, a technology must be easy to deploy and unobtrusive.
Just remember SELinux and how many people simply disable it, no matter how good it is (which I don't think it is, but that's for another rant). Then compare it with the memory-protection technology OpenBSD has been implementing, which is unobtrusive and ready to use with no extra configuration. Same with Ruby blocks, which more programmers are using and a lot of software is benefiting from now, even though higher-order functions and closures have been around for ages.
Having Ruby-like blocks in C and Objective-C is so COOL; you must appreciate that if you consider yourself serious about programming. Apple has already submitted blocks for standardization. I believe MacRuby will benefit from this too: MacRuby is Ruby written in Objective-C, implementing Ruby classes as Objective-C classes and achieving remarkable speed by taking advantage of Objective-C and LLVM technologies.
Now, I want my late-'90s Slashdot back, please, where you could more easily find insightful and informative comments. There's a lot of garbage, and a lot of Microsoft apologists, nowadays.
Apple filed a series of official answers to queries from the Federal Communications Commission, and provided the answers publicly on its Web site. In the responses, Apple stated that Google Voice was, contrary to media reports, not rejected from the App Store, but remains under review. In addition, it stated that the software has been delayed solely by Apple.
+1 Sarcasm or +1 Insightful?