Comment Re: Has a point (Score 1) 188

Wow, you know shit about ML.

You don't need to see the Disney movies, learn every Ariana Grande song or read every Harry Potter book to be able to fold proteins. You don't need to infringe any copyright.

And Google offers a service to websites. If websites don't want their content in Google, they can disallow that with robots.txt, the same file that the AI companies are ignoring.
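For example, a site that wants to stay in normal search results but keep AI-training crawlers out could put something like this in its robots.txt (the user-agent tokens below are the commonly published ones for OpenAI's and Google's training crawlers, shown purely as an illustration):

    # Block AI-training crawlers, allow everything else
    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: *
    Allow: /

Of course that only works if the crawler actually honors robots.txt, which is the whole point being argued here.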

Comment Re: LOL (Score 1) 51

As a tool it's pretty bad if you take five minutes to think about it. Computers are supposed to be reliable; when they make mistakes, it's the exception.

When "A.I." makes an error we invent terms that make it seem like it wasn't an error: hallucination.

Of course, if you knew how LLMs work, you would know that those errors are built in: the models cannot work without them, no matter how much training data you feed into them.

Comment Re: LOL (Score 1) 51

Do you know that when you ask it for a report, it will give you false information at some point, and the rate at which it gives you false information will increase over time? And if you're OK with that, why do you even need the report in the first place?

"Hallucination" is an AI marketing term for false information, i.e. an error.
