Comment Re:Their tech doesn't work (Score 3, Insightful) 93

Humans drive into stationary emergency vehicles all the time. Self-driving cars don't need to be perfect; they just need to be a bit better than humans. In the end it'll come down to insurance companies*, and they'll just look at the numbers. At some point insurers will start charging you extra if your car has a steering wheel and pedals fitted.

*) In sane legal systems, where a robotaxi manufacturer can't be sued for millions merely because someone died; the plaintiff would have to show gross negligence. Beyond that, liability would lie with the robotaxi operator or owner, through their insurance, just as it does now.

Comment Re:Told you (Score 3, Interesting) 307

Instead of the clunky ICE drivetrain with an electric motor bolted on, I'd rather have the reverse: a fully electric drivetrain (with a full-size battery), combined with a small petrol generator that can be switched on to extend the vehicle's range. The advantages: much simpler construction, and the engine can be kept small and light and always run at its optimum RPM, saving fuel. A few cars with range extenders exist already, and when enabled they can almost double the car's range, but sadly most offerings pair the extender with a shitty little EV battery that doesn't have much range to begin with.
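
A quick back-of-envelope sketch of why that works; every figure here is an assumption I picked for illustration, not the spec of any real car:

    # Rough range estimate for a series-hybrid ("range extender") EV.
    # All numbers are illustrative assumptions, not real-car specs.

    battery_kwh = 60.0         # assumed usable battery capacity
    consumption_kwh_km = 0.18  # assumed average consumption (kWh per km)
    tank_litres = 20.0         # assumed petrol tank feeding the generator
    gen_kwh_per_litre = 3.0    # assumed electrical yield per litre: petrol
                               # holds ~9 kWh/l, and a small engine held at
                               # its optimum RPM can convert roughly a third

    ev_range = battery_kwh / consumption_kwh_km
    extender_range = (tank_litres * gen_kwh_per_litre) / consumption_kwh_km

    print(f"Battery-only range: {ev_range:.0f} km")
    print(f"Generator adds:     {extender_range:.0f} km")
    print(f"Total:              {ev_range + extender_range:.0f} km")

With those numbers the tank delivers about as many usable kWh as the battery itself, so the generator roughly doubles the range, in line with what existing range extenders manage.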

Comment Re:They already know (Score 1) 28

They might even have conducted their own experiments already and set the lights accordingly. To maximize profits, of course, not to encourage responsible gambling. The casino manager's reaction to "It is possible that simply dimming the blue in casino lights could help promote safer gambling behaviors" would have been "MOAR BLUE!"

Comment Re:have interviews ever tested the right thing? (Score 3, Interesting) 85

Data structures and algorithms are used in interviews not because you'll be expected to implement sorts and linked lists on the job, but because they demonstrate that the candidate can implement and troubleshoot software from a spec, using material anyone with a CS or coding background should be familiar with: testing for skill rather than knowledge. By the same token, it makes sense to test candidates on the use of AI assistants, if they'll be using those in their work. But rather than letting them produce textbook algorithms with prompts, I'd give them a prompt that produces known-faulty output, then ask them to test the result, figure out what is wrong with it, and fix the code and/or the prompt, something like the sketch below.
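
As a sketch of what such an exercise could look like (the function, the planted bug, and the test cases are all my own invention, standing in for real assistant output):

    # The candidate receives "assistant-generated" code with a planted bug,
    # writes tests to expose it, then fixes the code and/or the prompt.

    def binary_search(items, target):
        """Faulty 'assistant output': return the index of target in a
        sorted list, or -1. Planted bug: the initial upper bound
        excludes the last element."""
        lo, hi = 0, len(items) - 2   # BUG: should be len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            if items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    # The candidate's job: probe edge cases until one fails, then explain why.
    for xs, t, want in [([1, 3, 5, 7], 5, 2),  # interior element: ok
                        ([1, 3, 5, 7], 7, 3),  # last element: returns -1
                        ([], 1, -1)]:          # empty list: ok
        got = binary_search(xs, t)
        status = "ok" if got == want else "FAIL"
        print(f"search({xs}, {t}) -> {got}, expected {want}: {status}")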

Comment Hopefully this applies to the comment section too. (Score 2) 200

I'm not hoping for more trash talk and hate speech in the comments, but the censorship in the comment section has gotten ridiculous. Perfectly reasonable and polite comments get nixed, and some people (including myself) have noticed what looks like a personal ban on certain (fairly innocent) words: when those people use such a word, the post is consistently removed, while other people are allowed to use it. All in all very strange.

Comment Re:"Respecting copyright" != "Ethically" (Score 1) 100

True, but "Gandalf vs Predator" doesn't fall under Fair Use. Copyright doesn't just protect the literal work; through derivative-works rights it also covers the characters and settings in it. Personally I think that's going too far. Does that mean that others can profit off your creativity? Yes... and it's been like that since the dawn of time. That's how culture has always worked.

Comment Re:"Respecting copyright" != "Ethically" (Score 3) 100

I’ve no illusions about which side the AI companies are on. But what I’m afraid of is that the issue of AI training will be misused by “Big Content” (for lack of a better word) to further restrict fair use, and to raise the barrier to entry for new commercial content creators. That’s what they have always done.

Personally I am not a fan of copyright allowing creators to retain control over the use of their work, whether they are commercial creators or creators releasing under a license like the GPL. That goes especially for derivative works and moral rights. I think copyright should be limited to exactly what the name says: the right to copy, or to forbid copying. The author controls when and how his work may be copied, so that he can derive an income from selling copies if he so desires, but for no other purpose. Derivative works should be allowed as long as the derivative is enough of an original work in its own right, inspired by the original rather than copying large parts of it verbatim. If anyone wants to write “Harry Potter and the Temple of Doom” or film “Gandalf vs Predator”, fine by me.

Where that leaves AI, I’m not sure. The results from AI prompts sometimes seem to be inspired original works, and at other times contain recognisable snippets of someone else’s work.

Comment Re:Here's what's going to happen. (Score 2) 15

making data safety and privacy the default, not the other way around.

Yes, but it's already hard to convince legislators and the general public of the necessity. People still cling to the "I have nothing to hide" mentality (which may be true, but they still have plenty worth protecting).
But once personal data becomes an economic asset that people can earn a little money from, convincing them to value their privacy will be even harder. And legislators will use a monetization scheme as an excuse to loosen privacy laws.
