Comment Re: ok? (Score 1) 59
Check the number where? I'm Google? With ChatGPT? The same ones that told them the number?
We use Google to check the phone numbers!
"conversion free" LOL
Good one
I hope you were just joking
Windows 95 didn't have that many floppies.
Office 95 on the other hand...
It's 100% wage theft if they intend, like they say, to replace the workers that created the works used to train the models.
Wow, you know shit about ML.
You don't need to see the Disney movies, learn every Ariana Grande song or read every Harry Potter book to be able to fold proteins. You don't need to infringe any copyright.
And Google offers a service to websites. And if websites don't want their content in Google, they can disallow that using robots.txt. The same file that all the AI companies are ignoring.
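For reference, a minimal robots.txt sketch that opts out of AI crawlers while staying in Google; GPTBot and CCBot are user-agent tokens those crawlers document, but treat the exact names as assumptions you'd verify per crawler:

```
# Block OpenAI's crawler from the whole site
User-agent: GPTBot
Disallow: /

# Block Common Crawl (a frequent AI training data source)
User-agent: CCBot
Disallow: /

# Everyone else (including Googlebot) may crawl normally
User-agent: *
Allow: /
```

The whole point of the complaint above is that robots.txt is voluntary: nothing enforces it, so a crawler that ignores it faces no technical barrier.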
You learn, just like we used to and what smart people still do.
Because in the first case there's barely any damage but in the second case you can easily put in danger the original business.
The streamer doesn't make money off the streamed video. And its livelihood also doesn't depend on streaming every creation ever.
As a tool it's pretty bad if you take 5 minutes to think about it. Computers are reliable and when they make mistakes it's an exception.
When "A.I." makes an error we invent terms that make it seem like it wasn't an error: hallucination.
Of course if you know how LLMs work you would know that those errors are features and they cannot work without them regardless of how much training data you feed into them.
Do you know that when you ask for a report it will give you false information at some point, and that the rate at which it gives you false information will increase over time? And if you're OK with that, why do you even need the report in the first place?
Hallucination is AI marketing term for false information or error.
You haven't seen anything yet. This still isn't the true cost of it.
The greatest con of these recent years is calling wrong results "hallucinations".
Bitch, your tool is defective, it gives wrong results/answers, it doesn't "hallucinate".
The "hallucination" is a feature of an LLM, because it was made to invent text, not to solve problems.
We often have 100km/h winds in Brandenburg. Never heard of any wind turbine collapsing.
Cool, now, tell me, how do you prevent somebody or services spamming you with negative reputation?
Or people buying positive reputation?
You just invented new problems.
What does freedom of speech mean? In the US you are driven out of the country for participating in protests.
What does the right to bear arms mean? The US has the most armed population in the world, but you can't do shit against tanks, armoured vehicles and UAVs.
"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian