Comment Re:So when will the taxi drivers start protesting? (Score 1) 583

Yes, that sounds acceptable...but it's an edge case, and stacked up against saving thousands of lives, it is unimportant. Besides, once both the bus and the taxi are autonomous, the obvious next step is some form of communication protocol that lets the taxi "tell" the bus "I have a passenger for you"; the bus's AI decides whether to wait or not and informs the taxi of its decision. The communication can happen over larger distances than the audible range of a horn, doesn't distract anyone else, and even more people could "make the bus" than do now, all more safely too. Win-win!
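Just to make the handshake concrete, here's a hypothetical sketch (Python; every name and number is invented, no real transit protocol is implied):

    from dataclasses import dataclass

    @dataclass
    class HoldRequest:
        taxi_id: str
        eta_seconds: int   # how long the taxi needs to reach the stop
        passengers: int

    def bus_decides(request: HoldRequest, schedule_slack_seconds: int) -> bool:
        """The bus agrees to wait only if holding keeps it within its schedule slack."""
        return request.eta_seconds <= schedule_slack_seconds

    # The taxi sends the request; the bus replies with its decision.
    will_wait = bus_decides(HoldRequest("taxi-42", eta_seconds=45, passengers=1),
                            schedule_slack_seconds=60)
    print("bus will wait" if will_wait else "bus will not wait")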

Comment Re:Ubisoft and PCs... (Score 1) 123

One issue I have with it is that it seems like every guard you run into on a mission so far is literally a "bad guy". These are guys working security for a major corp, and it feels like every one of them has something negative in their profile (child pornographer, drug addict, arsonist). Granted, I've only done one combat-oriented mission so far, so maybe it's unique to that mission. I'm not sure how many people would agree with me, but I think seeing profiles like "Father of two", "Soup Kitchen Volunteer", "College Dropout" would give at least some players pause in how they would handle the situation. Overt combat or stealth? Do I really want to kill a retired kindergarten teacher? Then again, given another recent discussion here on /., I'm probably just weird.

I am sooooo with you on that one. Maybe someone at Ubisoft has a thing against security guards? Anyway, I would definitely think twice about hurting that "Soup Kitchen Volunteer" guard, for sure!

Comment Re:22.7 and pi (Score 1) 80

It is pretty interesting!

I always use 355/113 as my "super quickie fractional representation" of pi. It is accurate out to 6 decimal places, which makes it useful enough for most purposes. 22/7 only gets us to 2 decimal places, unfortunately.
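Just to put numbers on it, here's a quick sketch (Python) checking both fractions against pi; the error figures below are computed, not quoted from anywhere:

    import math

    # Compare the two classic fractional approximations of pi mentioned above.
    for num, den in ((22, 7), (355, 113)):
        approx = num / den
        error = abs(approx - math.pi)
        print(f"{num}/{den} = {approx:.8f}  (error {error:.2e})")

    # 22/7    = 3.14285714  (error 1.26e-03)  -> matches pi to 2 decimal places
    # 355/113 = 3.14159292  (error 2.67e-07)  -> matches pi to 6 decimal places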

Comment Re:Google's algorithm is not a neural network (Score 5, Interesting) 230

Just to back up what James Clay said, I took a course from Sebastian Thrun (the driving force behind the Google cars) on programming robotic cars, and no neural networks were involved, nor were they mentioned with regard to the Google car project. As far as I can tell, if the LIDAR says something is in the way, the deterministic algorithms attempt to avoid it safely; if it can't be avoided safely, the car brakes and halts. That's it. Maybe someone who actually worked on the Google car can comment further?
Does anyone know of any neural networks used in potentially dangerous conditions? This study: www-isl.stanford.edu/~widrow/papers/j1994neuralnetworks.pdf states that accuracy and robustness issues need to be addressed when using neural network algorithms, and gives a baseline of more than 95% accuracy as a useful performance metric to aim for. That makes neural nets useful for things like auto-focus in cameras and handwriting recognition for tablets, but it means that using a neural network as the primary decision-maker to drive a car is perhaps something best left to video games (where it has been used to great success) rather than real cars with real humans involved.
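For what it's worth, here is a toy sketch (Python, names made up; this is emphatically not the actual Google code) of the kind of deterministic "avoid if safe, otherwise brake and halt" rule described above:

    def plan_action(obstacle_detected: bool, safe_avoidance_path: bool) -> str:
        """Pick a high-level action for the next control cycle."""
        if not obstacle_detected:
            return "continue"         # nothing in the way per LIDAR
        if safe_avoidance_path:
            return "steer_around"     # a safe avoidance manoeuvre exists
        return "brake_and_halt"       # no safe way around: stop

    # Example: LIDAR reports an obstacle, the planner finds no safe path around it.
    print(plan_action(obstacle_detected=True, safe_avoidance_path=False))  # brake_and_halt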

Comment Re:Measuring Disinterest (Score 1) 255

A car that is approaching you from behind with no sign of slowing down is not tail-gating; it is about to run into you. Tail-gating is when a vehicle behind you maintains less than the appropriate distance but is going the same speed you are. Is English not your first language, or are you deliberately misinterpreting what I've said because you like to argue?

How the fuck do you tell the difference between "a car approaching you from behind" and "a car tail-gating you"? A tail-gating car approaches you from behind, then... doesn't run into you. There is no way for me to tell them apart as the driver in front: I can't read their minds. Mind you, I've never seen this mythical "run you over if you don't speed up" behaviour, but I've been tail-gated before (it occasionally happens when you obey speed limits; dangerous assholes like to crowd you). I leave my cruise control set for the speed limit, and they decide not to bump into me.
The safe distance between you and the car in front of you is supposed to be between 3 and 4 seconds in ideal conditions (rough distances are sketched below), longer if the road is wet/icy/snowy or the vehicle in front is lighter than yours (like following a motorcycle, or if you drive a truck, etc.). The Smith System says 4 seconds, but the gov't of Canada claims 3 under perfect conditions: http://www.gov.pe.ca/photos/or...
It is the responsibility of the driver BEHIND to maintain this distance, unless they are in another lane and passing. Only the driver following can do this; the driver in front has no safe way to maintain the distance: driving over the speed limit to try to maintain a safe gap is both dangerous and futile.
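For anyone who wants that rule of thumb in metres rather than seconds, here's a quick sketch (Python); the speeds are just examples I picked:

    def gap_metres(speed_kmh: float, gap_seconds: float) -> float:
        return speed_kmh / 3.6 * gap_seconds   # km/h -> m/s, then times the time gap

    for speed in (50, 80, 100):
        print(f"{speed} km/h: {gap_metres(speed, 3):.0f}-{gap_metres(speed, 4):.0f} m")
    # 50 km/h:  42-56 m
    # 80 km/h:  67-89 m
    # 100 km/h: 83-111 m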

The rest of your rant is based on the concept of "at fault", which is a legal definition, and isn't what I wrote to begin with.

No, it's not based on just the legal definition of "at fault", which is why I mentioned driver safety instructors and insurance companies alongside the lawyers. However, the legal definition should be good enough, as it is written that way for a very good reason. The person following has complete control over the situation; it is their decisions, and only theirs, that result in the accident. Even in your bogus "unholy steamroller that will not slow down and is going to run you over" scenario, there are just two possible ways to avoid the accident:
1) The car ahead speeds up. This is problematic: they are now driving above the speed limit, which is both illegal and more dangerous than:
2) The car following SLOWS THE FUCK DOWN to the speed limit and backs off to the appropriate distance (3-4 seconds) until it can get into a clear passing lane and then blithely speed off.
Now, you claim #2 is not valid because the asshat coming up from behind is a murderous psychotic who can't take his foot off the accelerator, but I don't see that as a reason to say that #1 is what we should all do in every instance. That is a pure logical fallacy.

Now, which of these is the least dangerous? Which of these is taught as the correct response in driving schools? Which of these does the law say is legal? Which of these do we want both humans and robots doing? Now imagine if we programmed robot cars to "just speed up" when someone behind them started tail-gating...no, sir, that is not a good answer. That is an awful answer. If you can't see why that is an incorrect way to program a robotic car to respond, we are at an impasse. I don't think you need any special education beyond a safe driver's course to understand why, but might I suggest "Probabilistic Robotics" by Thrun et alii if you are interested in the study of programming autonomous cars and the logic/math behind it all. It's heavy reading, but is currently the gold standard in academic texts on the subject. You may find it just as good reading as Risks Digest.
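If it helps, here's what that policy might look like in robot-car terms; a minimal sketch of the response argued for above, with made-up names, not anyone's actual planner:

    def respond_to_tailgater(current_speed_kmh: float, speed_limit_kmh: float,
                             front_gap_s: float, min_gap_s: float = 3.0) -> tuple:
        """Return (target_speed, target_front_gap_s) for the next planning cycle."""
        target_speed = min(current_speed_kmh, speed_limit_kmh)  # never exceed the limit to appease a tail-gater
        target_gap = max(front_gap_s, min_gap_s + 1.0)          # leave extra room ahead so braking can be gentle
        return target_speed, target_gap

    # Tail-gated at 100 km/h in a 100 zone with a 3-second gap ahead:
    print(respond_to_tailgater(100, 100, 3.0))   # (100, 4.0) -- hold speed, grow the gap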

Comment Re:Measuring Disinterest (Score 1) 255

Incorrect. Patently absurd, and completely ridiculous. They need to be more reliable than humans AND fail in passive and safe ways AND interact with the human traffic around them in safe and predictable ways.

No, they just need to be more reliable than humans. That's it, that's all. If they lower accidents/fatalities then they are the best choice; there is no need for them to be perfect, merely better than the alternative.

  If your "more reliable" autonomous vehicle, while going the speed limit, detects another vehicle approaching from the rear at +5 relative velocity and showing no signs of slowing down, and it does NOT increase its speed to prevent the impending accident, then YOUR vehicle is wrong and has failed to protect its occupants by taking a simple preventative measure. And if you think that going 5 over is "unsafe speed" at 70, then you aren't experienced enough at driving to actually be doing it yourself.

That is patently ridiculous. The driver behind is always at fault in a rear-end accident (ask your insurance agent, or driving instructor, or an attorney; they will all give you the same answer). I don't speed up for transport trucks that approach me at +20 km/h over the speed limit; why should I? Then *I* could potentially be at fault for causing an accident whilst driving too fast! Is that what you do, speed up when the guy behind you starts tail-gating? THAT IS DANGEROUSLY STUPID! Me, I tap the brakes if he gets too close (not actually engaging the brakes, just touching the pedal to turn on the brake lights). That usually wakes up the moron approaching from behind and gets them thinking about what would happen if I had to brake suddenly.
To make sure you understand, imagine the simple scenario you propose: I am driving the speed limit, you are approaching from behind at +5 mph over the limit, and a deer jumps in front of my car. I slam on the brakes, stay in my lane, and come to a safe stop, barely touching the deer (so far, no accident). You also brake, but ram into me from behind. Now there is an accident, and I posit you are 100% at fault and I am 0% at fault. Agreed? Further, I posit that had the Google car been autonomously driving behind me, there would have been *no* accident...and that is the Google car of today, not the fully autonomous cars we will have in a few years, which will likely be able to handle tricky driving conditions (snow is a problem currently, not so much because of traction, but mostly because of visibility).
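And a back-of-the-envelope sketch (Python) of why I'd bet on the Google car in that deer scenario: same speed, same braking, only the reaction time differs. All the numbers here are illustrative assumptions, not Google's figures:

    def stopping_distance_m(speed_kmh: float, reaction_s: float, decel_ms2: float = 7.0) -> float:
        v = speed_kmh / 3.6                              # convert to m/s
        return v * reaction_s + v * v / (2 * decel_ms2)  # reaction distance + braking distance

    human = stopping_distance_m(100, reaction_s=1.5)   # typical human reaction time
    robot = stopping_distance_m(100, reaction_s=0.2)   # assumed sensing/processing latency
    print(f"human: {human:.0f} m, autonomous: {robot:.0f} m")
    # human: ~97 m, autonomous: ~61 m -- the shorter reaction distance is the margin
    # that lets the car behind stop before the bumper in front.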
