Comment Re:So, a design failure then. (Score 1) 165

It depends on your design goals.

In Asimov's story universe, the Three Laws are so deeply embedded in robotics technology they can't be circumvented by subsequent designers -- not without throwing out all subsequent robotics technology developments and starting over again from scratch. That's one heck of a tall order. Complaining about a corner case in which the system doesn't work as you'd like after they achieved that seems like nitpicking.

We do know that *more* sophisticated robots can be designed with more subtle ethical systems -- which is another sign of a robust fundamental design. The simplistic ethics is what subsequent designers get "for free" when they use an off-the-shelf positronic brain to control a welding robot or bread-slicing machine.

Think of the basic positronic brain design as a design framework. One of the hallmarks of a robust framework is that easy things are easy and hard things are possible. By simply using the positronic framework the designers of the bread slicing machine don't have to figure out all the ways the machine might slice a person's fingers off. The framework takes care of that for them.
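
To make the framework analogy concrete, here's a minimal sketch (in Swift, with entirely made-up names -- nothing here comes from Asimov or from any real robotics API) of how a framework can enforce a safety invariant so that everything built on top of it gets the check for free:

```swift
// Hypothetical "framework" layer: every command is routed through a
// safety check that the application designer never has to write.
class PositronicBrain {
    // Subclasses describe *what* they want to do...
    func plannedAction() -> String {
        return "idle"
    }

    // ...but only the framework decides whether it may happen.
    private func wouldHarmHuman(_ action: String) -> Bool {
        // Stand-in for the First Law analysis baked into the framework.
        return action.contains("finger")
    }

    // The single public entry point; the check cannot be bypassed.
    final func run() {
        let action = plannedAction()
        guard !wouldHarmHuman(action) else {
            print("Refused: \(action)")
            return
        }
        print("Executing: \(action)")
    }
}

// The bread-slicer's designer writes only this; the safety logic is inherited.
class BreadSlicer: PositronicBrain {
    override func plannedAction() -> String {
        return "slice the loaf"
    }
}

BreadSlicer().run()   // prints "Executing: slice the loaf"
```

Easy things (the bread slicer) stay easy; harder things (a more subtle ethical system) mean overriding more of the framework, but they remain possible.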

Comment Re:Article shows fundamental lack of understanding (Score 2) 183

They won't see people switching to Swift uniformly. There are trillions of lines of code written in Objective-C, and programmers already know it and are comfortable with it. There are no tools for migrating code from Objective-C to Swift, much less the hodgepodge of mixed C, Objective-C, and sometimes C++ that quite frequently occurs in real-world apps. So for the foreseeable future, you'd end up just adding Swift to your existing apps, which means you now have three or four languages mixed in one app instead of two or three, and one of them looks completely different from the others. I just don't see very many developers seriously considering adopting Swift without a robust translator tool in place.
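
For what it's worth, the mixing itself is mechanically trivial; that's exactly why people will add Swift rather than migrate to it. A rough sketch of what "adding Swift next to Objective-C" looks like (NSString stands in for any Objective-C class already in the project, and the LegacyParser name in the comment is made up purely for illustration):

```swift
import Foundation

// Your own Objective-C classes become visible to Swift once their headers are
// listed in the target's bridging header, e.g.:
//     // MyApp-Bridging-Header.h
//     #import "LegacyParser.h"   // hypothetical existing Objective-C class
//
// Foundation classes need no bridging header at all; NSString is used here
// only to show Swift calling straight into Objective-C code:
let legacy: NSString = "text from the Objective-C side"
print(legacy.length)                          // -[NSString length]
print(legacy.appending(" ... now in Swift"))  // -[NSString stringByAppendingString:]
```

The Objective-C code doesn't go anywhere; Swift just gets piled on next to it.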

I do, however, expect to see Swift become the language of choice for new programmers who are coming from scripting languages like Python and Ruby, because it is more like what they're used to. In the long term, they'll outnumber the Objective-C developers, but the big, expensive apps will still mostly be written in Objective-C, simply because most of them will be new versions of apps that already exist.

BTW, Apple never really treated Java like a first-class citizen; it was always a half-hearted bolt-on language. My gut says that they added Java support under the belief that more developers knew Java than Objective-C, so it would attract developers to the platform faster. In practice, however, almost nobody ever really adopted it, so it withered on the vine. Since then, they have shipped and subsequently dropped bridges for both Ruby and Python.

Any implication that Swift will supplant Objective-C like Objective-C supplanted Java requires revisionist history. Objective-C supplanted C, not Java. Java was never even in the running. And Objective-C still hasn't supplanted C. You'll still find tons of application code for OS X written in C even after nearly a decade and a half of Apple encouraging developers to move away from C and towards Objective-C. (Mind you, most of the UI code is in Objective-C at this point.) And that's when moving to a language that's close enough to C that you don't have to retrain all your programmers.

Compared with the C to Objective-C transition, any transition from Objective-C to Swift is likely to occur at a speed that can only be described as glacial. IMO, unless Apple miraculously makes the translation process nearly painless, they'll be lucky to get rid of Objective-C to any significant degree before the dawn of the next century. I just don't see it happening, for precisely the same reason that nine years after Rails, there are still a couple of orders of magnitude more websites built with PHP. Unless a language causes insane amounts of pain (e.g. Perl), people are reluctant to leave it and rewrite everything in another language just to obtain a marginal improvement in programmer comfort.

Comment Re:The protruding lens was a mistake (Score 2) 425

I don't think you've really grasped Apple's design sensibility. Job one for the designers is to deliver a product that consumers want but can't get anywhere else.

The "camera bulge" may be a huge blunder, or it may be just a tempest in a teapot. The real test will be the user's reactions when they hold the device in their hand, or see it in another user's hand. If the reaction is "I want it", the designers have done their job. If it's "Holy cow, look at that camera bulge," then it's a screw-up.

The thinness thing hasn't been about practicality for a long, long time; certainly not since smartphones got thinner than 12mm or so. There have always been practical things they could have given us other than thinness, but what they want is for you to pick up the phone and say, "Look how thin they made this!" The marketing value of that is that it signals you've got the latest and greatest device. There's a limit, of course, and maybe we're at it now. Otherwise we'll be carrying devices in ten years that look like big razor blades.

At some point in your life you'll probably have seen so many latest and greatest things that having the latest and greatest isn't important to you any longer. That's when you know you've aged out of the demographic designers care about.

Comment Re: Apple not in my best interests either (Score 1) 183

No, they're saying Apple switched because GCC's core wasn't designed in a way that made it easy to extend the Objective-C bits in the way that Apple wanted. And that could well be part of it -- I'm not sure.

But I think a bigger reason was that Apple could use Clang to make Xcode better, whereas GCC's parsing libraries were A. pretty tightly coupled to GCC (making it technically difficult to reuse them) and B. under a license that made linking them into non-open-source software problematic at best.

Comment Re:Severe (Score 1) 148

I'm not in the Netherlands, but I would not classify the weather we had for the period December through March as a "winter". A year earlier there was freezing rain and all sorts of obnoxious stuff, but last year was just autumn with shorter days. This year has to be colder.

Comment Re:Severe (Score 1) 148

You have a point there. We also had loads of rain. Luckily we coped well with it -- no floods here, which is quite an achievement for a country half of which lies below sea level. But snow is something we haven't handled well since the 1990s.

Comment Re:Where the pessimism comes from. (Score 5, Insightful) 191

I'd argue that we do try to write about the future, but the thing is: it's pretty damn hard to predict the future. ...
The problem is that if we look at history, we see it littered with disruptive technologies and events which veered us way off course from that mere extrapolation into something new.

I think you are entirely correct about the difficulty of predicting disruptive technologies. But there's an angle here I think you may not have considered: the possibility that the cultural values and norms of the distant future alone might be so alien to us that readers wouldn't identify with future people or want to read about them and their problems.

Imagine a reader in 1940 reading a science fiction story which accurately predicted 2014. The idea that there would be women working who weren't just trolling for husbands would strike him as bizarre and not very credible. An openly transgendered character who wasn't immediately arrested or put into a mental hospital would be beyond belief.

Now send that story back another 100 years, to 1840. The idea that blacks should be treated equally and even supervise whites would be shocking. Go back to 1740. The irrelevance of the hereditary aristocracy would be difficult to accept. In 1640, the secularism of 2014 society would be distasteful, and the relative lack of censorship would be seen as radical (Milton wouldn't publish his landmark essay Areopagitica for another four years). Hop back to 1340. A society in which the majority of the population is not tied to the land would be viewed as chaos, positively diseased. But in seven years the Black Death will arrive in Western Europe. Displaced serfs will wander the land, taking wage work for the first time in places where they find labor shortages. This is a shocking change that will resist all attempts at reversal.

This is all quite apart from the changes in values that have been forced upon us by scientific and technological advancement. The ethical issues discussed in a modern text on medical ethics would probably have frozen Edgar Allan Poe's blood.

I think it's just as hard to predict how the values and norms of society will change in five hundred years as it is to accurately predict future technology. My guess is that while we'd find things to admire in that future society, overall we would find it disturbing, possibly even evil according to our values. I say this not out of pessimism, but out of my observation that we're historically parochial. We think implicitly like Karl Marx -- that there's a point where history comes to an end. Only we happen to think that point is *now*. Yes, we understand that our technology will change radically, but we assume our culture will not.

Comment Where the pessimism comes from. (Score 5, Insightful) 191

The pessimism and dystopia in sci-fi doesn't come from a lack of research resources on engineering and science. It mainly comes from literary fashion.

If the fashion with editors is bleak, pessimistic, dystopian stories, then that's what readers will see on the bookshelves and in the magazines, and authors who want to see their work in print will color their stories accordingly. If you want to see more stories with a can-do, optimistic spirit, then you need to start a magazine or publisher with a policy of favoring such manuscripts. If there's an audience for such stories, it's bound to be feasible. There are a thousand serious sci-fi writers for every published one; most of them are dreadful, it is true, but there are sure to be a handful who write the good old stuff, and write it reasonably well.

A secondary problem is that misery provides many things that a writer needs in a story. Tolstoy once famously wrote, "Happy families are all alike; every unhappy family is unhappy in its own way." I actually think Tolstoy had it backwards; there are many kinds of happy families. Dysfunction, on the other hand, tends to fall into a small number of depressingly recognizable patterns. The problem with functional families from an author's standpoint is that they don't automatically provide something that he needs for his stories: conflict. Similarly, a dystopian society is a rich source of conflicts, obstacles and color, as the author of Snow Crash must surely realize. Miserable people in a miserable setting are simply easier to write about.

I recently went on a reading jag of sci-fi from the 30s and 40s, and when I happened to watch a screwball comedy movie ("His Girl Friday") from the same era, I had an epiphany: the worlds of the sci-fi stories and the 1940s comedy were more like each other than they were like our present world. The roles of women and men, the prevalence of religious belief, the kinds of jobs people did, what they did in their spare time -- the future of 1940 looked an awful lot like 1940.

When we write about the future, we don't write about a *plausible* future. We write about a future world which is like the present or some familiar historical epoch (e.g. Roman Empire), with conscious additions and deletions. I think a third reason may be our pessimism about our present and cynicism about the past. Which brings us right back to literary fashion.

Comment Severe (Score 3, Insightful) 148

The Netherlands had a very warm winter last year, so this one must be quite severe compared to the last one. It would be nice to have a thick layer of snow again for a change. The last time that happened was already a few years ago.

Comment Re:Will continue to be developed for other platfor (Score 2) 330

And you know what Mojang's opinion means at this point? Absolutely NOTHING. They can't tell their new owner to honor their intended promises, even if it were written into the deal. All Microsoft has to do is replace the boss with someone willing to change the company on Microsoft's behalf and POOF! It's happened with every other developer bought out thus far that came out and said they were told/promised nothing would be changing.

Depends on how good their lawyers are. If they write into the contract a term saying that all rights revert to the original authors should the new owner break those promises, then yes, they can force the new owner to honor them.

Comment Yep, music sales dropped from '99 to 2009 .... (Score 1) 610

That's also the time frame when MOST people I know became disinterested/disenchanted with the new music coming out, and reverted to listening to older material instead.

I'm not saying the ease of "pirating" music with digital tools doesn't contribute to the loss of music sales. It MAY (but the ease of BUYING tracks has increased exponentially too, and the cost of distribution has dropped to nearly zero -- so I'm not sure).

But quite frankly, we've regularly witnessed new trends sweep through popular music, and we're long overdue for one here in the 2000's. As just a random few I can think of off the top of my head? We had the "rise of the alternative girl bands" (Bjork, Sarah McLachlan, Poe, Fiona Apple, PJ Harvey, Mazzy Star, etc. etc.) in the 90's. We had the brief burst in popularity of ska and neo-swing type music (Mighty Mighty Bosstones, Cherry Poppin' Daddies, etc.). Obviously, we had the huge effect of the Seattle grunge scene. Before that, we saw a rise in popularity of "modern country" and line-dancing, the era of Heavy Metal in the 80's, and a period where rock/rap fusion was popular. So what's really happened along these lines in the 2000-2014 time period?

Comment This is NOT a problem.... User stupidity is.... (Score 1) 610

Your music library WILL contain all of the stuff you choose to put in it. That's not going to change, because that's pretty much the POINT of it!

What we've got here are a bunch of whiny people who dislike U2, throwing fits over the fact that the band's latest album is now part of their collection despite not wanting it there. Well..... so what? How does this really affect you in a negative way, in the grand scheme of things? You never have to add a U2 song to a custom playlist. It doesn't delete any of the music you already have, or prevent you from adding something new that you want. It cost you absolutely nothing. And because of the way iTunes works, you don't even have to use any disk space keeping the downloaded tracks on your Mac or iOS device. You can delete them all, and it just leaves a "marker" in the cloud saying you have the ability to download them any time.

Heck, if THAT is so intolerable? Consider exporting your music library to a standard format like MP3 (iTunes gives you the ability to make an MP3 version of any of your songs by right clicking on them, even) - and use a different program as your music manager. You could still purchase new stuff via iTunes if you wanted, and just export a copy to the player you actually use.

As I understand it, this whole "promotion" cost Apple hundreds of millions of dollars to pull off -- and was likely only something negotiated courtesy of the recent acquisition of Beats and the inside connections they had with the music industry. I really don't think you're going to see this happening regularly.

Comment Re:When the cat's absent, the mice rejoice (Score 5, Insightful) 286

Well, I'd be with you if the government were poking around on the users' computers, but they weren't. The users were hosting the files on a public peer-to-peer network, where you essentially advertise that you've downloaded the file and are making it available to the world. Since both those acts are illegal, you don't really have an expectation of privacy once you've told *everyone* you've done it. While the broadcasting of the file's availability doesn't prove you have criminal intent, it's certainly probable cause for further investigation.

These guys got off on a narrow technicality. Of course technicalities do matter; a government that isn't restrained by laws is inherently despotic. The agents simply misunderstood the law; they weren't violating anyone's privacy.
