
Submission + - Ethical trap: robot paralysed by choice of who to save (newscientist.com) 1

wabrandsma writes: From New Scientist:

Can a robot learn right from wrong? Attempts to imbue robots, self-driving cars and military machines with a sense of ethics reveal just how hard this is

In an experiment, Alan Winfield and his colleagues programmed a robot to prevent other automatons – acting as proxies for humans – from falling into a hole. This is a simplified version of Isaac Asimov's fictional First Law of Robotics – a robot must not allow a human being to come to harm.

At first, the robot was successful in its task. As a human proxy moved towards the hole, the robot rushed in to push it out of the path of danger. But when the team added a second human proxy rolling toward the hole at the same time, the robot was forced to choose. Sometimes, it managed to save one human while letting the other perish; a few times it even managed to save both. But in 14 out of 33 trials, the robot wasted so much time fretting over its decision that both humans fell into the hole.

Winfield describes his robot as an "ethical zombie" that has no choice but to behave as it does. Though it may save others according to a programmed code of conduct, it doesn't understand the reasoning behind its actions. Winfield admits he once thought it was not possible for a robot to make ethical choices for itself. Today, he says, "my answer is: I have no idea".

As robots integrate further into our everyday lives, this question will need to be answered. A self-driving car, for example, may one day have to weigh the safety of its passengers against the risk of harming other motorists or pedestrians. It may be very difficult to program robots with rules for such encounters.
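A rough sense of how that failure mode can arise, as a toy sketch in C (my own construction, not Winfield's actual controller): a rescuer that always heads for whichever proxy is currently closest to the hole, where pushing back only slows that proxy's slide and switching targets forfeits any rescue progress.

    /* Toy model: with two proxies in play, the always-help-the-most-
     * endangered rule flip-flops between targets and loses both. */
    #include <stdio.h>

    int main(void) {
        double dist[2] = {10.0, 10.3}; /* each proxy's distance from the hole */
        int prog[2] = {0, 0};          /* consecutive ticks spent on each proxy */
        int saved[2] = {0, 0};

        for (int t = 0; t < 200; t++) {
            int tgt = -1; /* live proxy in most danger */
            for (int i = 0; i < 2; i++)
                if (!saved[i] && dist[i] > 0 && (tgt < 0 || dist[i] < dist[tgt]))
                    tgt = i;
            if (tgt < 0) break; /* everyone is saved or lost */

            for (int i = 0; i < 2; i++) {
                if (saved[i] || dist[i] <= 0) continue;
                if (i == tgt) {
                    dist[i] -= 0.5;                   /* pushing slows the slide */
                    if (++prog[i] >= 4) saved[i] = 1; /* 4 straight ticks to save */
                } else {
                    dist[i] -= 1.0;                   /* unattended proxy rolls on */
                    prog[i] = 0;                      /* switching resets progress */
                }
            }
        }
        for (int i = 0; i < 2; i++)
            printf("proxy %c: %s\n", 'A' + i, saved[i] ? "saved" : "fell in");
        return 0;
    }

With a single proxy these same rules save it easily; with two nearly tied proxies, this run loses both, which is the dithering failure seen in the trials.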

Comment Re:If there was only one viable choice ... (Score 1) 159

I switched to DuckDuckGo and haven't looked back. They used to be noticeably worse in results quality, but Google has gone a long way downhill. Occasionally I don't find things with DDG and try Google. When I do, I have to wade through pages of totally irrelevant stuff to find that there are no matches, whereas at least DDG tells me straight away that it can only find half a dozen possibly-relevant things. I especially like the way DDG integrates with a number of domain-specific search engines.

Comment Re:Replacement Organs (Score 1) 75

I appreciate the offer, but I'm really not qualified. My interest is of the avid armchair variety. As I understand it, the dialysate is the key to making it work. Previous experiments achieved some removal of urea, but it wasn't adequate or it caused electrolyte imbalances. In all forms of dialysis, the dialysate is something that could easily be mixed up at home, were it not for the requirement that solutions for hemodialysis or peritoneal dialysis be sterile.

Comment Re:What for? (Score 5, Interesting) 183

I maintain the GNUstep / Clang Objective-C stack. Most people who use it now do so in Android applications. A lot of popular apps have a core in Objective-C with the Foundation framework (sometimes they use GNUstep's on Android, more often they'll use one of the proprietary versions that includes code from libFoundation, GNUstep and Cocotron, but they almost all use clang and the GNUstep Objective-C runtime). Amusingly, there are actually more devices deployed with my Objective-C stack than Apple's. The advantage for developers is that their core logic is portable everywhere, but the GUIs can be in Objective-C with UIKit on iOS or Java on Android (or, commonly for games, GLES with a tiny bit of platform-specific setup code). I suspect that one of the big reasons why the app situation on Windows Phone sucks is that you can't do this with a Windows port.
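As a minimal sketch of that split (the names are mine, not from any shipping app): the logic lives in one portable C-family file, and each platform only needs a thin shim - here, the Android/JNI side.

    /* core.c - portable application logic, shared by every platform */
    int core_compute_score(int taps) { return taps * 10; }

    /* android_shim.c - the only Android-specific code; a Java class
     * calls this through JNI. On iOS the same core would be called
     * directly from an Objective-C UIKit controller. */
    #include <jni.h>

    JNIEXPORT jint JNICALL
    Java_com_example_game_Bridge_computeScore(JNIEnv *env, jobject self, jint taps) {
        return core_compute_score((int)taps);
    }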

It would be great for these people to have an open source Swift that integrated cleanly with open source Objective-C stacks. Let's not forget that that's exactly what Swift is: a higher-level language designed for dealing with Objective-C libraries (not specifically Apple libraries).

Objective-C is a good language for mid-1990s development. Swift looks like a nice language for early 2000s development. Hopefully someone will come up with a good language for late 2010s development soon...

Comment Re:If there was only one viable choice ... (Score 2) 159

It wasn't just about interface. People tend to forget how search engines did an absolutely horrible job of intelligently ranking the sites you wanted to see.

I find it pretty easy to remember - I go to Google today.

The UI was what made me switch both to Google originally and from it some years later. When I started using Google - and when Google started gaining significant market share - most users were on 56Kb/s or slower modem connections. AltaVista was the market leader and they'd put so much crap in their front page that it took 30 seconds to load (and then another 20 or so to show the results). Google loaded in 2-3 seconds. AltaVista's results would have had to be a lot better than Google's to be worth that wait, and they weren't. I switched away from Google when they made the up and down arrow keys in their search box behave differently to every other text field in the system.
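For a sense of what changed on the ranking side, here it is in miniature: a toy power-iteration PageRank over a made-up four-page web (what Google actually ran was, of course, far more than this):

    /* Rank pages by incoming links rather than keyword counts. */
    #include <stdio.h>

    #define N 4
    int main(void) {
        /* link[i][j] = 1 if page i links to page j (invented graph) */
        int link[N][N] = {{0,1,1,0},{0,0,1,0},{1,0,0,0},{0,0,1,0}};
        double rank[N], next[N], d = 0.85; /* standard damping factor */
        for (int i = 0; i < N; i++) rank[i] = 1.0 / N;

        for (int it = 0; it < 50; it++) {  /* power iteration */
            for (int j = 0; j < N; j++) next[j] = (1 - d) / N;
            for (int i = 0; i < N; i++) {
                int out = 0;
                for (int j = 0; j < N; j++) out += link[i][j];
                for (int j = 0; j < N; j++)
                    if (link[i][j]) next[j] += d * rank[i] / out;
            }
            for (int j = 0; j < N; j++) rank[j] = next[j];
        }
        for (int j = 0; j < N; j++) printf("page %d: %.3f\n", j, rank[j]);
        return 0;
    }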

Comment Re: Government is a crappy investor (Score 2) 64

My 'precious electronic toys' use about a tenth of the power that the ones I was using a decade ago for the same purpose did. Even lighting power consumption has dropped. My fridge, freezer and washing machine are the big electricity consumers in my home - efficiency has improved there, but nowhere near as fast as for gadgets.

Comment Re:Tricky proposition (Score 1) 64

There's a lot more to government than military intelligence gathering and law enforcement (although it would be a good idea for someone to remind most current governments that those are two things, not one). And most government projects end up spending insane budgets. This isn't limited to the US. It amazes me how often government projects to build databases that store a few million records and serve a few tens to a few thousand queries per second (i.e. the kind of workload that you could run with off-the-shelf software on a relatively low-spec server) end up costing millions. Even with someone designing a pretty web-based GUI, people paid to manually enter all of the data from existing paper records, and 10 years of off-site redundancy, I often can't see where the money could have gone. Large companies often manage to do the same sort of thing.
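For scale, here's that workload as a minimal sketch (assuming SQLite as the off-the-shelf software and an invented single-table schema; build with cc bench.c -lsqlite3):

    /* Load a few million rows, then time random point lookups. */
    #include <sqlite3.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void) {
        sqlite3 *db;
        if (sqlite3_open(":memory:", &db) != SQLITE_OK) return 1;
        sqlite3_exec(db, "CREATE TABLE records(id INTEGER PRIMARY KEY, payload TEXT);",
                     NULL, NULL, NULL);

        sqlite3_exec(db, "BEGIN;", NULL, NULL, NULL);
        sqlite3_stmt *ins;
        sqlite3_prepare_v2(db, "INSERT INTO records VALUES(?, ?);", -1, &ins, NULL);
        for (long i = 0; i < 2000000; i++) {   /* "a few million records" */
            sqlite3_bind_int64(ins, 1, i);
            sqlite3_bind_text(ins, 2, "some record data", -1, SQLITE_STATIC);
            sqlite3_step(ins);
            sqlite3_reset(ins);
        }
        sqlite3_finalize(ins);
        sqlite3_exec(db, "COMMIT;", NULL, NULL, NULL);

        /* Time 100,000 random point lookups by primary key. */
        sqlite3_stmt *q;
        sqlite3_prepare_v2(db, "SELECT payload FROM records WHERE id = ?;", -1, &q, NULL);
        clock_t t0 = clock();
        for (int i = 0; i < 100000; i++) {
            sqlite3_bind_int64(q, 1, rand() % 2000000);
            sqlite3_step(q);
            sqlite3_reset(q);
        }
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
        printf("%.0f queries/second\n", 100000 / secs);
        sqlite3_finalize(q);
        sqlite3_close(db);
        return 0;
    }

Even a low-spec machine should sustain thousands of queries per second this way, which is the point: the database itself isn't where those millions could plausibly have gone.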

The one thing that the US does well in terms of tech spending is mandate that the big company that wins the project should subcontract a certain percentage to small businesses. A lot of tech startups have got their big breaks from this rule.

Comment Re:This may be the way to escape from Comcast (Score 1) 418

This was true for me when the modem they gave to me failed to bootstrap. I was charged for the guy to come out, debug the problem, and then swap the modem because it was defective from the factory.

But my only other internet option is AT&T and that's it.

So basically it's a duopoly in a city with millions of people.

We need the city to lay the lines and then allow the cable companies to compete for customers on those lines like we do for our electric lines.

Comment Re:why? (Score 1) 182

Add to that, about 10-20% of the population get motion sick using the kind of VR in Oculus Rift (myself included - I can use it for 2-5 minutes, depending on the mode). It's ludicrous to imagine building a school that would exclude 20% of the potential pupils on some random criterion. You might as well make schools that didn't let in gingers...

Comment Re:intel atom systems keep 32 bit systems around (Score 1) 129

Apple already ships 64-bit ARM chips and a lot of other vendors are racing to do so. The Android manufacturers that I've spoken to want 64-bit for the same reason that they want 8-core: it's a marketing checkbox, and they don't want to be shipping a 32-bit handset when their competitor is marketing 64-bit as a must-have feature. ART is in the top 10 worst-written pieces of code I've had to deal with and is full of casts from pointers to int32_t (not even a typedef, let alone intptr_t), but it should get a 64-bit port soon.
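The bug class in question, in miniature (a hypothetical example, not ART's actual code):

    /* Storing a pointer in an int32_t works by accident on 32-bit targets
     * and breaks on 64-bit ones; intptr_t is the correctly-sized typedef. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        int x = 42;
        int *p = &x;

        int32_t bad = (int32_t)(intptr_t)p; /* truncates the upper 32 bits */
        intptr_t good = (intptr_t)p;        /* always wide enough for a pointer */

        printf("original:     %p\n", (void *)p);
        printf("via int32_t:  %p\n", (void *)(intptr_t)bad); /* usually mangled on 64-bit */
        printf("via intptr_t: %p\n", (void *)good);          /* always round-trips */
        return 0;
    }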

Comment Re:The ones I witnessed... (Score 1) 129

64-bit is here for a while. A lot of modern '64-bit' CPUs only support 40-bit physical addresses, so are limited to 'only' 1TB of RAM. Most support 48-bit virtual addresses (the unused top bits must be copies of bit 47, so all 1s or all 0s depending on whether you've got a kernel or userspace address), limiting you to 'only' 128TB of userspace virtual addresses. If RAM sizes continue to double once every year, then it takes another year to use each bit. We currently have some machines with 256GB of RAM, so are using 38 bits; 64 bits will last another 26 years. RAM increases have slowed a bit recently though. 10 years ago, you always wanted as much RAM as possible because you were probably swapping whatever you were doing. Now, most computers are happy with 2GB for programs and the rest for buffer cache. As SSDs get faster, there's less need for caching, but there might be more need for address space as people want to be able to memory map all the files that they access...
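A quick check of that arithmetic (my numbers, same one-doubling-per-year assumption):

    /* How many address bits today's RAM needs, and how long until a
     * doubling-per-year growth curve exhausts a 64-bit space. */
    #include <stdio.h>

    int main(void) {
        unsigned long long ram = 256ULL << 30; /* 256GB, as above */
        int bits = 0;
        while ((1ULL << bits) < ram)
            bits++;                            /* ceil(log2(ram)) */
        printf("256GB needs %d address bits\n", bits);        /* 38 */
        printf("at one doubling per year, 64 bits last %d more years\n",
               64 - bits);                                    /* 26 */
        return 0;
    }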
