
Comment I certainly don't anymore (Score 1) 207

Don't get me wrong, please.

I've loved KDE since 1.x. I've always hated GNOME, ever since it shipped with RH 5.2.

But I've been waiting for KDE and the whole Linux desktop experience to be good, and 11 years have passed.

Today, I have embraced Mac OS X for my personal desktop, and *love* it: I'm not looking back. I use OpenBSD for most of my servers, of course. And for the PCs at work, I have succumbed to Ubuntu — a heavily customized Ubuntu that doesn't expose a whole desktop, just a dock with only the applications the users need for work.

I fail to see why an easy-to-use desktop on Linux is needed any more, because all three of my uses for a computer are already perfectly covered.

Not that I would reject using Linux as a desktop somewhere. Heck, I have used OpenBSD as a desktop — but only because I wanted to contribute to the project, or learn its internals. As for my users, it's better for them and for the company/institution to forget that they have a "Personal Computer": they don't; the machine in front of them is just a tool to get their job done, and for that, a whole desktop is overkill.


Bug In Most Linuxes Can Give Untrusted Users Root 281

Red Midnight and other readers brought to our attention a bug in most deployed versions of Linux that could result in untrusted users getting root access. The bug was found by Brad Spengler last month. "The null pointer dereference flaw was only fixed in the upcoming 2.6.32 release candidate of the Linux kernel, making virtually all production versions in use at the moment vulnerable. While attacks can be prevented by implementing a common feature known as mmap_min_addr, the RHEL distribution... doesn't properly implement that protection... The... bug is mitigated by default on most Linux distributions, thanks to their correct implementation of the mmap_min_addr feature. ... [Spengler] said many other Linux users are also vulnerable because they run older versions or are forced to turn off [mmap_min_addr] to run certain types of applications." The Register reprints a dialog from the OpenBSD-misc mailing list in which Theo de Raadt says, "For the record, this particular problem was resolved in OpenBSD a while back, in 2008. We are not super proud of the solution, but it is what seems best faced with a stupid Intel architectural choice. However, it seems that everyone else is slowly coming around to the same solution."

Comment Exactly my thoughts (Score 1) 342

I would add "powerful" to your points (which I guess are limited to the desktop):

* Mac OS X: usable and powerful (great UI + great foundation)
* Windows: just plain convenient, thanks to the size of the install base and the number of people familiar with it
* GNU/Linux: powerful, but not usable

That said, I'm actually using all three of them at work:

* GNU/Linux for the people who are responsible for a few very specific tasks, for which Ubuntu has been customized.
* Windows for the yet-to-be-converted PCs, because of in-house systems or 3rd-party software that require Windows and are still awaiting an alternative.
* Mac OS X for people that know better. Which means the IT department :)

Comment Re:Violates the developer agreement (Score 2, Interesting) 327

C# and .NET are definitely MORE powerful than objective C in a general purpose sort of way (objective C might have more depth in specific targeted areas like GUI widgets, but the breadth of massive frameworks like C# and Java is truly vast)

Having more libraries doesn't make a language more powerful; at best, it makes it more productive.

The hard link you have in your mind between Objective-C (or the libraries available to it, since you seem to treat them as the same thing) and GUI widgets is supported only by your ignorance.

Besides, I don't get what the big deal is about learning Objective-C. It's a true C superset, heavily inspired by Smalltalk. Ruby programmers would feel at home. Who wouldn't like to think in Smalltalk or Ruby while writing real-world solutions at close-to-the-metal speed?
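To make the Smalltalk/Ruby kinship concrete: both languages model method calls as messages sent to objects, which is exactly the Objective-C model too. A minimal Ruby sketch (the class and method names here are illustrative, not from any real API):

```ruby
# Ruby method calls are messages sent to a receiver, just as in
# Smalltalk and Objective-C.
class Greeter
  def greet(name)
    "Hello, #{name}!"
  end
end

g = Greeter.new

# The ordinary call and the explicit message send are equivalent;
# the latter is roughly [g greet:@"world"] in Objective-C syntax.
direct = g.greet("world")
sent   = g.send(:greet, "world")
```

Once you read `obj.method(arg)` as "send `obj` the message `method` with `arg`", the square-bracket syntax of Objective-C stops looking alien.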

Comment How ignorant and lazy you are (Score 4, Insightful) 342

I'm honestly surprised at how ignorant and lazy the average slashdotter has become over the years.

Any self-respecting geek should already be keeping up with Apple's advancements, which are and will be impacting technology in the years to come.

If you people haven't noticed already, Apple has been consistently releasing libraries and server software as open source projects, under liberal licenses, for the rest of us to pick up, use, and modify.

A friend of mine used to say (I can't remember exactly; paraphrasing):

* Microsoft wants all software to be theirs
* GNU wants all software to be free
* BSD wants all software to be better

And releasing GCD, gentlemen, is another master stroke by Apple, just like WebKit, Bonjour, and LLVM (the list goes on): sharing knowledge and advancing technology by merit, not by forcing it down your throat thanks to a monopoly you have been handed.

The term "block" is familiar to Ruby programmers. It's an old concept that Ruby has made easy to use, and hence popular and actually useful.
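For anyone who hasn't met them, a minimal sketch of what Ruby blocks look like (the method name `twice` is mine, purely for illustration):

```ruby
# A method that yields control to a caller-supplied block.
def twice
  yield 1
  yield 2
end

results = []
twice { |n| results << n * 10 }   # the block is an anonymous chunk of code

# The built-in iterators take blocks the same way:
squares = [1, 2, 3].map { |n| n * n }
```

The point is how little ceremony is involved: no function pointers, no named callback, just code handed straight to the method that needs it.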

And here's another lesson that OpenBSD, Apple, and Ruby have been putting to work without you guys noticing: any technology that is difficult to use, no matter how good it is, will not be used if it gets in your way; technology must be easy to deploy/use and unobtrusive to be actually used and useful.

Just remember SELinux and how many people simply disable it, no matter how good it is (which I don't think it is, but that's for another rant). Then compare it with the memory-protection technology OpenBSD has been implementing, which is unobtrusive and ready to use with no extra configuration. Same with Ruby blocks, which more programmers are using and from which a lot of software is now benefiting, even though higher-order functions and closures have been around for ages.
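Those "ages-old" closures are a one-liner in Ruby, which is exactly why people actually use them. A hedged sketch (`make_counter` is an illustrative name, not a library function):

```ruby
# A closure: the returned lambda captures the local variable `count`
# and keeps it alive across calls.
def make_counter
  count = 0
  -> { count += 1 }
end

c = make_counter
c.call  # => 1
c.call  # => 2
```

Lisp and Scheme had this decades ago; Ruby's contribution was making it so cheap to write that ordinary programmers reach for it daily.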

Having Ruby-like blocks in C and Objective-C is so COOL; you must appreciate that if you're serious about programming. Apple has already submitted it for standardization. I believe MacRuby will benefit from this too: it's Ruby written in Objective-C, implementing Ruby classes as Objective-C classes and achieving remarkable speed by taking advantage of Objective-C and LLVM technologies.
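GCD itself is a C API, but its flavor — enqueue a block, let a worker run it asynchronously and in order — can be sketched in Ruby with a thread draining a queue of lambdas. This is a rough analogy of my own, not the real dispatch API:

```ruby
# A toy serial queue in the spirit of GCD's dispatch_async:
# blocks are enqueued and executed in FIFO order on one worker thread.
queue  = Queue.new
output = Queue.new

worker = Thread.new do
  while (task = queue.pop)   # nil acts as a shutdown sentinel
    task.call
  end
end

# "Dispatch" two blocks asynchronously:
queue << -> { output << "first" }
queue << -> { output << "second" }

queue << nil    # shut the worker down
worker.join
```

The real GCD adds thread-pool management, priorities, and concurrent queues on top of this idea, but the core abstraction — "a block is a unit of work you can hand to a queue" — is the same.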

Now, I want my late-'90s Slashdot back, please, where you could more easily find insightful and informative comments. There's a lot of garbage, and a lot of Microsoft apologists, nowadays.

Comment Apple hasn't killed Google Voice (Score 1) 304

Apple hasn't killed Google Voice

Apple filed a series of official answers to queries from the Federal Communications Commission, and provided the answers publicly on its Web site. In the responses, Apple stated that Google Voice was, contrary to media reports, not rejected from the App Store, but remains under review. In addition, it stated that the software has been delayed solely by Apple.

Hardware Hacking

Apple Working On Tech To Detect Purchasers' "Abuse" 539

Toe, The writes "Apple has submitted a patent application for technologies that would detect device abuse by consumers, presumably to aid in determining the validity of warranty claims. 'Consumer abuse events' would be recorded by liquid and thermal sensors detecting extreme environmental exposures, a shock sensor detecting drops or other impacts, and a continuity sensor detecting jailbreaking or other tampering. The article also notes that liquid submersion detectors are already deployed in MacBook Pros, iPhones, and iPods. It does seem reasonable that a corporation would wish to protect itself from fraudulent warranty claims; however, the idea of sensors inside your portable devices detecting what you do with them might raise eyebrows even beyond the tinfoil-hat community."
