
Comment Re:Get Off My Lawn (Score 5, Insightful) 457

The more it changes, the more it stays the same, and the less I want anything to do with this industry anymore.

Why are you so jaded? If email works, then stick with it. It's not as if anyone is forcing you to follow the fashionistas. One of the beauties of Unix is that you could take a Unix programmer from the 1980s, drop him in 2014, and he'd still be productive. True, he wouldn't know anything about GUIs (which change with time), but the core has remained largely the same. The same can be said of all the core technologies around today.

Comment Re:I don't know, has he? (Score 2) 365

True. Even if they do lose the lock on the non-Apple desktop and laptop market, they may still be able to do an IBM and reinvent themselves.

We'll finally know that Microsoft is at the tipping point when Walmart, Future Shop, Best Buy, Staples, etc. all devote floor space to non-Mac, non-Windows laptops and desktops (e.g. Linux, Chromebooks) and it's common for businesses that currently buy both Macs and Windows machines to start buying these instead.

We're still a long way from that scenario. But unlike 5 years ago, this scenario seems likely within the next 5 years. Google in particular has done a fantastic job of breaking the Microsoft-Mac hegemony. Whether it will turn into a Microsoft-Mac-Google hegemony or there will be space for other competitors is another story.

Comment Re:Once again reinventing Fred Brooks (Score 1) 381

That's not totally correct. If there is too much essential complexity, it is generally possible to find a simple approximate solution that is good enough. You just need to go back to the client to verify it's acceptable given the alternative.

Case in point: NP-complete problems cannot be simplified, but it's nearly always possible to find a polynomial-time solution that is "good enough" for many cases.
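To make that concrete (my own hypothetical sketch, not from the original comment): vertex cover is NP-complete, yet a trivial greedy pass gives an answer guaranteed to be at most twice the size of the optimal cover.

```python
def greedy_vertex_cover(edges):
    """Classic 2-approximation: take any uncovered edge and add
    both of its endpoints to the cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# A square with one diagonal.
edges = [(1, 2), (2, 3), (3, 4), (4, 1), (2, 4)]
cover = greedy_vertex_cover(edges)

# Every edge touches the cover.
assert all(u in cover or v in cover for u, v in edges)
# Greedy returns {1, 2, 3, 4}; the optimum {2, 4} has size 2,
# so we're within the promised factor of two.
assert len(cover) == 4
```

Finding the exact minimum cover is intractable in general; the greedy answer is often "good enough" in exactly the sense the comment describes.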

Another case in point: the president of Walmart wanted to know the percentage of customers who entered the stores and bought items. The IT team came up with a sophisticated solution involving RFIDs and face recognition at the entrances to determine this tally. The president shook his head and instead asked the non-techie Walmart managers for a solution. They suggested "greeters" who would keep the tally and help people as a side benefit. While this solution is less accurate than the IT one, it's good enough for planning.

Comment Misrepresentation (Score 2) 201

Larry wasn't swiping at Oracle and Microsoft, any more than a person who is being picked on is bullying when he says "it's not fair".

As for negativity, it's not only here to stay, it is actually beneficial in some cases. Some companies add restrictions to their EULAs stating that you are forbidden from comparing their product to others (e.g. via benchmarking). I'm sorry, it might be "negative" to say one product is better than another, but that's irrelevant. People want the best value for their money, not just "a good enough deal".

Imagine how poor the Linux kernel's quality would be if Linus were too worried about offending contributors. Imagine where free software would be if Stallman weren't so negative about even the hint of proprietary software.

Comment Don't hold out much hope (Score 1) 109

They're already making automatic decisions that harm their searches. For instance, autocomplete is now useless since it automatically takes you to "I'm Feeling Lucky" and there's no way to turn it off. Worse yet, there are times when I autocomplete by accident and keep getting hit with "I'm Feeling Lucky".

Comment Most teachers are women.... (Score 1) 690

Speaking from personal experience, I was at the bottom of the class until I got a male teacher, at which point I went to the top of the class.

My explanation for this is simpler: men understand boys better than women do, and I'm sure the opposite is also true. I've also noticed that men tend to be more lenient on girls and women tend to be more lenient on boys. This applies to teachers and parents alike. The old clichés "daddy's girl" and "mommy's boy" are grounded in truth.

As a result, with female teachers, boys are allowed to slack off more and the more touchy-feely classroom will turn off many boys, whereas with male teachers, boys are challenged more and classes tend to be more business-like. Male teachers also tend to turn off girls (that's one reason there are fewer girls in some fields).

If you want to help boys who are falling behind, you either need to get the balance back to 50% of each gender or segregate the sexes.

Comment Re:To Be Fair (Score 1) 404

Difference for the sake of difference hurts everyone.

Just imagine if steering wheels were patented and each car vendor were forced to come up with a unique way to steer a car that no other vendor had come up with.

Not only would this be a waste of energy, it would also make getting a driving licence nearly impossible, and it would make driving someone else's car or buying another vendor's car practically impossible.

If something works, it *should* be copied. If it can be copied without much effort or it was implemented independently without knowledge of the patent, it has no business being patented. If it's being patented to lock people out of a technique, it has no business being patented.

Comment Re:It's not Optimism, (Score 1) 344

Not really a new idea. If you look at medieval maps of the world, you'd regularly see things like "Here be dragons" and other sorts of odd human-like creatures in the less explored areas of the map. Or look at Gulliver's Travels, which contains more than a few intelligent non-human creatures in faraway places. Yes, those were islands on Earth, but the sea voyage was the space travel of that age.

More significantly, Saint Thomas Aquinas dealt with this issue in the 13th century.

In short, there is no reason to believe that God is limited to just humanity and there are no implications to Catholic Theology if the universe is teeming with life.

Given that the universe is made for God's glory, it would be presumptuous of us to automatically assume that we're the only ones here, even if it ultimately turns out to be true. But given the size of the universe, this will likely remain an open question if we never achieve first contact.

Comment Re:AND it's no longer relevant. (Score 4, Informative) 243

I'm on 11.10 now after stalling at 10.10 until about a month ago. I figured GNOME 2.0 is on the way out so I'd eventually have to get used to some other environment.

I gave Unity a shot, but it was too slow. Unity 2D is pretty snappy and not too bad, but it's really meant only for people who run one application at a time. I don't, so it was always getting in my way. I couldn't stand Kubuntu, and Lubuntu felt awkward.

But Xubuntu is most definitely a viable option even if it is a step down from GNOME 2.x.

I would have settled on XFCE, but I discovered to my surprise that GNOME Shell with extensions gives you 95% of what GNOME 2.0 did and has almost the same look and feel. It's what I'm using now, and I'd rather move to Debian than give it up (if Ubuntu stopped supporting it).

So I'd suggest to take the plunge and upgrade. You have at least two viable options awaiting you.

Comment Let's look at the predictions (Score 2) 219

Here's my take:
(1) People power will come to life.

Hmmm, most people who use a PC or tablet (unless they're playing a game) tend to sit quietly in one location for hours on end. There's not a lot of opportunity to harness power there. It might be possible for such harvesting to power cell phones and iPods, but unless cell phones use significantly less power, this is a no-go.

(2) You will never need a password again. Biometrics will.....

Yes, immediately after voice recognition and AI take over. Biometrics might take over for informal use, but they're too flawed (either too many false negatives or too many false positives) for widespread use. It's much more likely that a personal S/KEY-type RFID token will become available.
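For anyone who hasn't run into S/KEY: it's a one-time-password scheme built on a hash chain. The server only ever stores the top of the chain, and each login reveals the previous link, which can't be forged without the seed. A minimal sketch (using SHA-256 instead of the MD4 of the original spec, purely for illustration):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def chain(seed: bytes, n: int) -> bytes:
    """Apply the hash function n times to the seed."""
    out = seed
    for _ in range(n):
        out = h(out)
    return out

# Server stores only the top of the chain: H^n(seed).
seed, n = b"secret-seed", 100
server_stored = chain(seed, n)

# Client authenticates with the previous link: H^(n-1)(seed).
otp = chain(seed, n - 1)

# Server hashes the OTP once and compares; on success it replaces
# its stored value with the OTP, so each password works exactly once.
assert h(otp) == server_stored
server_stored = otp
```

A replayed OTP fails the next check because the server's stored value has moved one step down the chain.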

(3) Mind reading is no longer science fiction.

It's no longer science fiction today, but even if it becomes cheap enough, our minds are too scattered for this to be the primary mode of input.

(4) The digital divide will cease to exist. Mobile phones will make....

Many parts of the world live on less than a dollar a day, have virtually no infrastructure, and have virtually no need for technology that doesn't directly contribute to the bottom line (i.e. surviving). The digital divide will be around for years to come.

(5) Junk mail will become priority mail.

This might come true, but it would be priority mail for mail services that want to gain extra income, not for users.

Okay, let's assume that everyone plays by the rules of this smart feature (and that there's enough gold at the end of the rainbow to end world hunger).

Smart junk mail is the modern equivalent of Microsoft Clippy. Yes, Clippy tried to be helpful, and it often did provide users with valuable information, but it was still hated precisely because it was unsolicited.

This is not to say that junk mail can't be made valuable. Mail could be sorted into three bands by mail providers: "Regular Mail", "Smart Mail", and "Junk Mail". It would have to be something that depends on the mail providers, not solely the mail publishers, since we can't trust the latter. For most people, smart mail would be ignored unless they were looking for a deal; you could then call on it as a supplementary knowledge base.

Comment Re:So... (Score 1) 387

> Einstein didn't even believe them. They were ultimately proven
> true as technology advanced to the point that the relevant
> experiments became possible.

True, but until Einstein's theories were tested, they were just one mathematical model among many.

I think one problem many people have is the assumption that a model has anything to do with reality. It doesn't necessarily. There are plenty of weird models that work out but are not realistic. For instance, nonstandard analysis has shown that using infinitesimals and infinities is valid for solving real physical problems, even though infinitesimals and infinities do not exist (see Heisenberg and cosmology).

Similarly, quantum theory doesn't necessarily posit that light is both a wave and a particle. It simply posits that light is what it is; it does not correspond to our real-world understanding of either waves or particles. Both are merely approximations that our feeble brains grasp for intuition's sake.

Comment Re:Voting? (Score 1) 225

True, but it's the nature of the problem.

Before the periodic table, the elements were grouped by non-essential properties like boiling point, conductivity, colour, etc. The problem with such groupings is that different people group things differently, so some kind of vote or fiat declaration is the only way to settle on definitions.

When the periodic table was created, there was finally a grouping based on essential properties, so no such voting was required.

If you want to avoid this problem, you have to come up with a similarly non-arbitrary measure. For instance, the difference between a black hole and a really heavy star is pretty clear and depends only on universal constants. The difference between a Planet and a Dwarf Planet is not (i.e. it's possible to turn a Planet into a Dwarf Planet by adding debris in the path of its orbit). A better definition of a planet would be that it is massive enough to hold a stable satellite (whether or not such a satellite actually exists), while a dwarf planet isn't.
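The black-hole threshold really is that clean: a body is a black hole exactly when it fits inside its Schwarzschild radius, r = 2GM/c², which involves nothing but the mass and two universal constants. A quick back-of-the-envelope check (my own illustration, not part of the original comment):

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # mass of the Sun, kg

def schwarzschild_radius(mass_kg: float) -> float:
    """Radius below which a body of this mass is a black hole."""
    return 2 * G * mass_kg / C**2

r_sun = schwarzschild_radius(M_SUN)
# The Sun would have to be squeezed to roughly 3 km to become one.
assert 2.9e3 < r_sun < 3.1e3
```

No committee vote is needed to decide whether something is a black hole; the orbit-clearing criterion for planets has no comparably crisp formula, which is exactly the complaint above.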

With galaxies, we need to do something similar. I don't know enough to propose such a definition, but if we don't come up with a real one, we'll end up with the Pluto situation, where a different vote will cause a lot more fuss than makes sense.

Comment Re:A Few Logical Problems (Score 1) 431

I think the author of the article has a vague idea that the interface and its use will be scalable.

Essentially, the tablet should function on its own, but if you wanted to, you could hook in a real physical keyboard (as some models currently allow) and possibly an external monitor (as nearly all laptops allow).

If you think about it, there's little need to have several devices when one will do. A decade and a half ago, IBM demoed a device (I believe it was called "the cube") which took this to the extreme. The idea was that all your data lived on a memory stick in a common format. If you plugged the stick into a PDA, the PDA would treat it as its main device memory. If attached to a laptop, the memory stick would be the main hard drive, and ditto for a desktop. As long as the PDA, laptop, and desktop organized the memory stick and its metadata in a compatible way, the transition should be seamless. The concept failed because this seamlessness is hard to achieve when Windows is in the picture. Power constraints were also a factor (PDAs need low-power memory; desktops need fast memory).

But we are getting to the stage where this vision is possible, except that the memory and monitor together will be the common device. I would be very surprised if this weren't the default in 5-10 years.
