
Comment Re:Why doesn't this exist? (Score 1) 52

That's not how LLMs work. They can only extrude what's in their training data, and they have no way to tell whether a case is relevant. This is one of many reasons why LLMs are of limited usefulness. They don't encode knowledge, just words. They don't reason, in spite of OpenAI's claims. They're a clever trick, and have some uses, but they aren't a universal solution to knowledge work, nor are they a path towards that goal.

Comment Re:Why doesn't this exist? (Score 1) 52

It might exist. Extracting the cites from a brief is trivial (formats are standardized), and then verifying that they exist should (magic word!) be straightforward. What that can't do is verify that a citation is relevant. I would imagine that if anyone were closely reading the briefs, one generated by an LLM would stick out like a sore thumb, if you know what to look for. But lawyers and judges are busy, so they don't do that. That's why so many ChatGPT-written briefs are showing up, and it's only going to get worse.
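To make the "trivial" part concrete, here's a minimal sketch, assuming reporter-style citations like "347 U.S. 483 (1954)". The regex and the citation_exists stub are my own illustrative assumptions, not a real verification pipeline; an actual checker would query a citator like Lexis, Westlaw, or CourtListener.

```python
import re

# Rough pattern for reporter-style citations, e.g. "347 U.S. 483 (1954)"
# or "123 F.3d 456 (9th Cir. 1997)". Real briefs have many more forms
# (statutes, short cites, id./supra), so this is illustrative only.
CITE_RE = re.compile(
    r"\b(\d{1,4})\s+"                   # volume
    r"([A-Z][A-Za-z0-9.\s]{0,15}?)\s+"  # reporter abbreviation, e.g. U.S., F.3d
    r"(\d{1,5})"                        # first page
    r"(?:\s+\(([^)]*\d{4})\))?"         # optional court/year parenthetical
)

def extract_citations(brief_text: str):
    """Return (volume, reporter, page, court_year) tuples found in the text."""
    return [m.groups() for m in CITE_RE.finditer(brief_text)]

def citation_exists(volume: str, reporter: str, page: str) -> bool:
    """Stub: a real checker would look the cite up in a case database here."""
    raise NotImplementedError

sample = "As the Court held in Brown v. Board of Education, 347 U.S. 483 (1954), ..."
print(extract_citations(sample))  # [('347', 'U.S.', '483', '1954')]
```

Existence checking is the easy half; relevance is the part that still needs a human actually reading the cases.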

Comment Re:So something I don't think anyone is asking (Score 1) 52

No, it's not a search engine. It's a probabilistic algorithm that predicts the most likely word to follow what it has generated so far, extruding synthetic text in the shape of a response to the prompt. The most important thing to realize here is that it doesn't encode knowledge, just the probabilities of what word comes next. There are a vast number of interconnections in the training data, which is why it can often extrude text that resembles a correct result. What LLMs don't have is the ability to verify information against an outside source, which is why you can ask one to only give you cases in the Lexis database and still get completely fictitious cases.
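A toy version of that loop makes the point. Everything below is invented for illustration; a real model conditions on the whole context and derives its probabilities from billions of learned parameters rather than a hand-written table, but the mechanism of sampling the next word from a distribution is the same.

```python
import random

# Toy "model": for each previous word, the probabilities of the next word.
# A real LLM learns these from training data and conditions on the entire
# context, not just one word; this table is purely illustrative.
NEXT_WORD = {
    "<start>":   {"the": 0.6, "a": 0.4},
    "the":       {"court": 0.5, "case": 0.5},
    "a":         {"court": 0.3, "case": 0.7},
    "court":     {"held": 0.8, "dismissed": 0.2},
    "case":      {"held": 0.4, "dismissed": 0.6},
    "held":      {"<end>": 1.0},
    "dismissed": {"<end>": 1.0},
}

def generate(max_tokens: int = 10) -> str:
    """Sample one word at a time; note there is no fact lookup anywhere."""
    out, word = [], "<start>"
    for _ in range(max_tokens):
        dist = NEXT_WORD[word]
        word = random.choices(list(dist), weights=list(dist.values()))[0]
        if word == "<end>":
            break
        out.append(word)
    return " ".join(out)

print(generate())  # e.g. "the court held"
```

Fluent output, zero knowledge: the table only encodes which words tend to follow which. Scale that up and you get plausible-looking case citations with no database behind them.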

Comment Re:Why did they think it's worth it? (Score 2) 12

Cisco used to do a ton of acquisitions like this: ones that look pointless but serve a function, which is to get pre-screened engineers and to stymie a potential competitor in a field they might want to enter in the future. Of course, Altman being Altman, they massively overpaid for this one. Effectively they are paying $3B to acquire 191 employees, not all of whom will be kept. That works out to around $15.7M per employee, before layoffs. By contrast, Cisco used to pay around $100M per company, and inflation isn't THAT bad. Also, Cisco had enormous profits and swimming pools full of cash to do this with. OpenAI loses $5B a year; you could heat a city of a million people with all the cash it burns.
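The arithmetic, for anyone checking (the $3B price and 191-employee headcount are from the story; the ~$100M per company is my recollection of the Cisco-era going rate):

```python
deal_price = 3_000_000_000   # reported purchase price, $3B
headcount  = 191             # reported employees acquired

print(f"${deal_price / headcount / 1e6:.1f}M per employee")  # $15.7M per employee

# For comparison: at roughly $100M per company, the same money
# would have bought about 30 Cisco-style acqui-hires.
print(deal_price / 100_000_000)  # 30.0
```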

Comment Re:Due Dilligence (Score 1) 42

What it means is that the party line from Meta is that it's full of lies, but they know it's all true. Hence blanket accusations instead of refutation. Bosworth proving once again that tech executives are at least out of touch, if not complete idiots. I long for the days of press releases instead of social media tirades, too.

Comment Probably an excuse to fire senior devs (Score 2) 32

I would really like to see the results broken down by experience level. For juniors? Sure, and that gain has been replicated in many studies. I want to see how these tools work for developers with 5+ years of experience. Or are they just firing every experienced dev and replacing them with bootcamp grads armed with an AI coding tool, since they're so much more productive now?
