Comment This is just an asshole trying to project a thing (Score 1, Interesting) 42

Namely that LLM-type AI has taken over and hence AI must be really great and valuable and the future and whatnot.

The truth looks a bit different. LLMs are still incapable as hell and will continue to hallucinate, because there is no way to fix that. Keeping LLMs updated gets harder and harder due to AI slop. Nobody is making any profit on LLMs; these things just burn money like crazy. And there are very few somewhat working use-cases, and those come with really big caveats.

Comment Re:"Science" has the same problem, thank you RFKjr (Score 1) 109

No, they cannot. LLMs cannot do logical reasoning, and that is a requirement for any type of fact-checking or verification. All they can do is correlation, and that is not enough.

While using non-LLM AI or non-AI tools is a possibility for fact-checking (in simple cases like the one here), you overlook that basically the only advantage of LLMs is that they are comparatively easy to create. Creating the "other tools" that would be needed here would be a lot more expensive and a lot more work than creating general LLMs. Hence, for most cases, these "other tools" do not exist.

Comment Re:It Never Ceases to Amaze Me (Score 1) 109

I do not "think so". I am just referring to research results. You, on the other hand, have nothing. LLMs cannot fact-check, period.

Robotics just exploits the fact that you can train whatever model you use pretty well, and you can use _other_ systems there to install guardrails; those other systems come with physics models and can do (limited) fact-checking. Those other systems are not LLMs, though. LLMs cannot fact-check in robotics either.

Comment Re:Bad enough when they required python (Score 1) 64

They just keep adding more requirements which bloat the base install. This is the opposite of the right direction.

Indeed. KISS is the basis of all solid engineering and complexity is the death of it. I do not think the people at the wheel are aware of that. Probably a lack of experience and insight, which would be bad on a whole other level.

Comment Re:Rust is not just memory safety (Score 1) 64

As long as it is clearly understood that this is an experiment, that is fine. Fully committing, only to find out that Rust does not cut it, would be a bad mistake. There are several serious problems with Rust, one being the lack of a spec, another being that it is hard to learn (hence fewer maintainers). And "used wisely" requires that wisdom to be present and in use.

I will be watching this, and it will be interesting to see how it turns out. Just keep "There is no silver bullet" in mind.

Comment Re:Rust Apocalypse (Score 1) 64

Good point. Rust also has the problem that it is really hard to learn and requires a lot of skill and experience to master. That makes the risk of Rust dying prematurely a lot more real.

On the other hand, while I am not convinced that Rust really makes code more secure (the incompetent will just make harder-to-find mistakes, and attackers can now use AI to help find them), the fact that it is hard to learn may have that effect. The most serious problem we have in the software space is tons of incompetent and semi-competent "coders".
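To make that distinction concrete, here is a minimal sketch of my own (the discounted() helper is hypothetical, not from any real project): the borrow checker rejects a memory error like a use-after-move at compile time, but an ordinary logic mistake compiles and runs without complaint.

    // Memory error: the commented-out line is rejected at compile time.
    fn main() {
        let s = String::from("hello");
        let moved = s;
        // println!("{}", s); // error[E0382]: borrow of moved value: `s`
        println!("{}", moved);

        // Logic error: compiles and runs, Rust has no opinion about it.
        println!("{}", discounted(200, 10)); // prints 20, but 180 was intended
    }

    // Hypothetical helper: meant to return the price after a percentage discount,
    // but actually returns the discount amount itself.
    fn discounted(price: u32, percent: u32) -> u32 {
        price * percent / 100 // should be `price - price * percent / 100`
    }

That is the class of bug Rust removes by construction; everything else still depends on the programmer.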

Still, "standardizing" on a supposedly "secure coding" language that hilariously does not even have a full specification is probably not a good idea.

My take is that all this "move to Rust" is half-assing things and misses the point. That never goes well.
