Comment Re:Confused (Score 2) 49

Make piracy the better alternative in every way and people will pirate.

Make a legal transaction the better alternative in every way (ok, except money) and people will use that.

It really is so simple, even an executive should be able to understand it.

Comment does it, though? (Score 1) 106

"We Politely Insist: Your LLM Must Learn the Persian Art of Taarof"

While that might be an interesting technical challenge, one has to wonder why. Just because something is "culture" doesn't mean it should be copied. Slavery was part of human culture for countless millennia, to the point where we haven't even gotten around to updating our "holy books", which all treat it as something perfectly normal. That's how normal slavery used to be.

(for the braindead: No, I'm not comparing Taarof to slavery. I'm just making a point with an extreme example.)

The problem is unintended consequences. To teach an LLM Taarof, you have to teach it to lie, to say things whose intended meaning differs from what the words say, and to hear something different from what the user actually said. Our current level of AI already has enough problems as it is. Do we really want to teach it to lie and to misread, just because some people made that part of their culture?

Instead of treating LLMs like humans, how about just treating them as the machines they are? I'm pretty sure the Persians don't expect their light switches to haggle over whether to turn on the light or not, right? I stand corrected if light switches in Iran only turn on after toggling them at least three times, but I don't think so. In other words: This cultural expectation only extends to humans. Maybe just let the people complaining know that AIs are not actually human?

Comment Re:Overwrought (Score 2) 63

This does not appear to be holding up in practice, at least not reliably.

It holds up in some cases, not in others, and calculating an average muddles that.

Personally, I use AI coding assists for two purposes quite successfully: a) more intelligent auto-complete and b) writing a piece of code using a common, well understood algorithm (i.e. lots of sources the AI could learn from) in the specific programming language or setup that I need.

It turns out that it is much faster, and almost as reliable, to have the AI do that than to find a few examples on GitHub and Stack Overflow, check which ones are actually decent, and translate them myself.

Anything more complex than that and it becomes a coin toss. Sometimes it works, sometimes it's a waste of time. So I've stopped doing it; coding it myself is faster, and the result is better than what I get from babysitting an AI.

And when you need to optimize for a specific parameter - speed, memory, etc. - you can just about forget AI.

Comment smoke and mirrors (Score 4, Interesting) 63

Hey, industry, I've got an idea: If you need specific, recent, skills (especially in the framework-of-the-month class), how about you train people in them?

That used to be the norm. Companies would hire apprentices, train them in the exact skills needed, then at the end hire them as proper employees. These days, though, the training part is outsourced to the education system. And that's just dumb in so many ways.

Universities should not teach the flavour of the moment, because by the time people graduate, it may already have shifted elsewhere. Universities teach the basics and the thinking needed to grow into nearby fields. Yes, thinking is a skill that can be trained.

Case in point: When I was in university, there was one short course on cybersecurity, and yet that's been my profession for over two decades now. There were zero courses on AI, and yet there are whitepapers on AI with me as a co-author. And of the seven programming languages I learnt in university, I have never used a single one of them professionally, and only one privately (C, of course. You can never go wrong learning C. If you have a university diploma in computer science and they didn't teach you C, demand your money back). Ok, if you count SQL as a programming language, it's eight, and I did use that professionally a few times. But I consider none of them a waste of time. Ok, Haskell maybe. The actual skill acquired was "programming", not any particular language.

Should universities teach about AI? Yes, I think so. Should they teach prompt engineering for ChatGPT 4? Totally not. That'll be obsolete before the students even graduate.

So if your company needs people who have a specific AI-related skill (like prompt engineering) and know a specific AI tool or model - find them or train them. Don't demand that other people train them for you.

FFS, we complain about freeloaders everywhere, but the industry has become a cesspool of freeloaders these days.

Comment uh... wrong tree? (Score 1) 75

"When the chef said, 'Hey, Meta, start Live AI,' it started every single Ray-Ban Meta's Live AI in the building. And there were a lot of people in that building,"

The number of people isn't the problem here.

The "started every" is.

How did they not catch that during development and find a solution? I mean, the memes where a TV ad triggers Alexa and orders 10 large pizzas are a decade old now.
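One known mitigation for the "every device in earshot answers" problem is wake-word arbitration: each device that hears the phrase reports how strongly it heard it, and only the best-scoring device responds. The sketch below is a minimal illustration of that idea, modeled loosely on Amazon's "Echo Spatial Perception"; the device names and scores are made up, and this is not a claim about how Meta's glasses actually work.

```javascript
// Hedged sketch of wake-word arbitration: every device that detects
// the wake word submits a confidence score, and only the device with
// the highest score is allowed to respond. All ids/scores are
// hypothetical examples.
function arbitrate(detections) {
  // detections: { deviceId: wakeWordConfidence }
  let winner = null;
  let best = -Infinity;
  for (const [id, score] of Object.entries(detections)) {
    if (score > best) {
      best = score;
      winner = id;
    }
  }
  return winner;
}

// The chef's own glasses hear the phrase loudest; nearby demo units
// hear it faintly and would stay silent.
const heard = { "chefs-glasses": 0.91, "demo-unit-3": 0.42, "demo-unit-7": 0.37 };
console.log(arbitrate(heard)); // prints "chefs-glasses"
```

In a real deployment the devices would need a shared channel (cloud or local network) to exchange scores within a short window before responding, which is exactly the coordination step that appears to have been missing here.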

Comment innovation is - sadly - dead at Apple (Score 1) 81

the company has, in the pursuit of easy profits, constrained the space in which it innovates.

Quite so. How many years has it been since something really new came out of Cupertino? Granted, Apple is more profitable than ever, but the company clearly shows what happens when you put a supply-chain expert in the CEO chair.

The really sad part is that there's nobody ELSE, either. Microsoft hasn't invented anything ever, Facebook and Google are busy selling our personal data to advertisers, and who else is there who can risk a billion on an innovation that may or may not work out?

Comment Re:Missing the obvious (Score 1) 15

Apple fans already have a heartrate sensor on their wrist, they don't need one from the ear.

That's wrong. I stopped wearing wrist watches 25 years ago and haven't looked back a single day. I don't want shit on my wrist. Try living without one for a year and you'll realize why. It's hard to express in words; it's like having a chain removed.

Headphones, on the other hand, I use occasionally: for phone calls, or for music on the train, plane, etc. And especially for the plane: if the noise cancellation comes close to my current over-the-ear Bose, I'd take them on the two-day business trips where I travel with hand luggage only and space is at a premium.

Do I want a heartbeat sensor? No idea; I don't care. But if it has any use, then at least for me it's not a duplication. I'm pretty sure many, many Apple users don't have a smart watch.

Comment "fake" - you don't say ! (Score 1) 83

So he claims that social media - the platform where everyone pretends to be happier, more active, better looking, more interesting, more travelled, etc., etc., etc. - feels "fake"?

Man.

Next he's going to tell us that artificial sweeteners might not taste natural.

Seriously, though, social media has been the domain of bots for at least a decade. Even people who actually write their posts themselves use bots to cross-post to all the different platforms and at "optimal" times. Nothing on social media is not fake. Well, maybe your grandmother's photo album because she doesn't know Photoshop exists.

Comment Re:You ARE the weakest link (Score 1) 47

Amateur-level procedures have really run their course and do not cut it anymore.

Do you want to bet on the percentage of Fortune 500 companies that use amateur-level procedures for their prod systems?

"Above 50%" seems like a guaranteed win to me.
"Above 75%" is where I start to think "maybe not that high". But I fear I'm giving them too much credit.

Comment malware delivery system (Score 1) 47

But the "m" in npm always stood for "malware", did it not?

The npm ecosystem is deeply flawed. Look at some of the affected repositories: many of them are just a few lines of code, yet over a hundred other packages depend on them. At least half of them have no reason to exist, and many were last updated years ago.

We have an ecosystem where seemingly every individual function has its own package. That is just ridiculous. It is modularization driven to its absurd extreme. It's why you add one package to your project and it pulls in a hundred dependencies.
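To make the point concrete, here is the kind of single-function "package" being described, written inline rather than installed from npm (it is modeled on real micro-packages such as `is-odd`, but this code is an illustration, not the published module). The entire package is one function, yet modules this small have historically had hundreds of dependents, and every one of them is a link in the supply chain that has to be trusted and monitored.

```javascript
// Hedged illustration: a complete "npm micro-package" - one exported
// function, a few lines of code. Pulling something this trivial in as
// a dependency trades a couple of lines of your own code for another
// maintainer, another repo password, and another update channel an
// attacker could compromise.
function isOdd(n) {
  if (!Number.isInteger(n)) throw new TypeError("expected an integer");
  return Math.abs(n % 2) === 1;
}

console.log(isOdd(3)); // true
console.log(isOdd(4)); // false
```

Each such dependency also carries its own transitive dependencies, which is how adding one package to a project can pull in a hundred others.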

And the more tiny packages there are, the larger the attack surface and the smaller the fraction that can be monitored for malware injection and other problems. I wouldn't be at all surprised if some of these packages are never updated again and carry the malware forever, simply because the only dev with the password to the repo has since died or moved on to other things in his life.

Comment Re:scary (Score 1) 91

We have all but removed them already. In many kills these days, a human presses a button and that's about it.

But yes, removing them entirely removes that last bit of accountability. Next time a drone slaughters a market place full of civilians with no terrorist anywhere in sight, we won't even have someone to put on trial.

Well, we can try with the LLM making the decision. I'm sure it'll apologize a lot and invent a number of threats to justify its actions, if current AI is any indication.
