
Comment Re:Low Effort Race-Baiting (Score 1) 60

Or maybe they chose stores with high shoplifting rates, i.e. something relevant to their goal instead of relevant to an ideological propagandist at Reuters

That still doesn't explain, let alone justify, the profiling of innocent customers. That's how you lose customers.

But we hear your rant, loud and clear. You do you, bro.

Comment Re:Nothing done about larceny... (Score 1) 60

Shoplifting, and the lack of interest with governments enforcing laws is why the US has no malls anymore

Dude, not even. How can you say there are "no malls anymore"? In what cornfed hole do you live?

, or any retail spaces that are fun or pleasant to go to

I really wonder where you live.

Being banned and trespassed on site should be the consequences for shoplifting. Don't like it, don't steal from stores.

Cool rant, bro. It has nothing to do with the specific technical problem discussed in the article.

Comment Re:State of the art (Score 1) 70

I've met people that were deeply misguided yet I didn't come out of those conversations thinking that governments aren't necessary, that plastic is dangerous, that redheads don't have souls, or that people as in general do have souls.

Good for you. I have no clue how this is relevant to my post.

I can tolerate a significant amount of bad data.

Any system of sufficient complexity (a human included) must tolerate some percentage of bad data. Depending on the task at hand, however, the acceptable margin of error will vary.

Tell me your requirements, and I might tell you the margin of error needed to meet those requirements efficiently.

Why should we build LLMs that require perfect input data

I never made the claim that we require perfect input data. Only that we need to have a way to curate it.

These are two distinct propositions (from both technical and epistemological points of view.) I don't intend to engage in a debate where these two are conflated.

Comment Re:State of the art (Score 1) 70

Everyone's been training their AIs on massive piles of unverified data, which is a pathologically bad idea

Thank you! I totally agree with you. Training these systems with piles of unverified data is akin to throwing shit at the wall and selling whatever sticks.

We need a way to curate the data at scale. We don't have an answer to that problem; I don't think we even know how to have a technical conversation about how to do it. But this is a start (culling a large percentage of the images.)

Not great. Perhaps not even good. But we have to start somewhere and see how we can make it better.

Comment Re:State of the art (Score 1) 70

There's no viable answer here that doesn't involve tossing a huge percentage of their training images.

Until better solutions exist, this is an acceptable trade-off for most systems. Either you throw away a large percentage of the training images (specifically, those containing a child and potential nudity), or, as in this case, you end up throwing away the entire data set.

We just don't have a good answer yet. But that doesn't (nor should it) preclude us from taking a hit (in lost training data) in order to save as much as possible.

And if excluding these images is so unacceptable, we must ask ourselves what the heck we are doing here, what kind of problems are we trying to solve that require the presence of these images in the training data sets?

Comment Re:State of the art (Score 1) 70

When humans themselves can have trouble distinguishing between a picture a parent takes of their toddler playing in the bathtub and actual pornographic content, how do you expect AI to do it?

Most humans can make this distinction, and it is certain that the images found in the data set are not of babies in bathtubs. These were images of sexual abuse. I am not sure what the confusion is here.

Now, there is a way to train an AI for this... but it involves training it with actual images of sexual abuse. Horrifically, law enforcement agencies maintain data sets of these images (which is how they track perpetrators.)

So, in theory, an AI system could be trained to recognize them. There are legal implications here, however, because the mere possession of such material (outside the storage of evidence by law enforcement) is a crime in many countries (certainly in the USA.)

It doesn't have to be perfect (no AI system is). It simply needs to flag images as potentially problematic, so that a LEO, a social worker, or someone with the legal power to view them can confirm whether they are.

Additionally, a producer of a data set is free to exclude certain images out of an abundance of caution, like an image of a baby in a bathtub (that would be at the producer's discretion.)

This is a solvable problem, at least to a good-enough degree, but it is one that is socially and legally difficult to tackle (for the reasons I stated above.)

Comment Re:Just like California (Score 2) 106

The water problems will fix themselves. The places that overuse the slow replenishing aquifers will dry up, and after that no amount of political screaming or prayer to whatever god you worship will make the water reappear. If local resources get used up, the humans will adapt or migrate. Just like any other species of animal.

I don't think anyone doubts the physical inevitability of water problems fixing themselves. The question is how to fix the inevitable food production crash (followed either by inevitable economic hardship or by famine).

If local resources get used up, the humans will adapt or migrate. Just like any other species of animal.

Unfortunately, past the size of small bands, people can't migrate in large numbers, not in the hundreds of millions as in this specific case. They aren't wildebeest. A water crash inevitably causes a food production crash, and a food production crash of this magnitude can cause devastating famines. One of the reasons India never fully recovered its economic might after a century of British rule was the devastating famines caused (in great part) by ill-conceived colonial policies of the time.

I grew up in the middle of a civil war, and observed (and went through) displacement. It's just not possible for large numbers of people to migrate, not without casualties.

Just to grasp the scale of things: the so-called "child" migration crisis of the mid-2010s was triggered by years of unprecedented droughts in Central America caused by El Niño. The region had historically been food self-sufficient, but El Niño and climate change altered that.

Now, imagine South Asia (not just India, but India and Pakistan), 10-20 times over (due to the difference in population sizes.)

The potential for upheaval from a water crisis in India is on another level entirely, just by the sheer number of people who could be affected (not just farmers, but also the hundreds of millions of consumers who depend on their food production.)

PS. I'm not aiming this at you. This is just a general observation that people might miss the serious implications of this.

Comment Re:Privately held is the only way to go... (Score 1, Troll) 61

Privately held companies are the only ones which are allowed to do much in the market and make disruptive change. Anything held by public shareholders will get mired in having to show earnings for this quarter with shareholders threatening to sue if a firm charges off some revenue for R&D or even fixing bugs.

Yep.

Some dumbasses demonize private equity because... feelings, but there's nothing inherently more moral or effective about public companies. Going public is simply a way to access more equity (via shares.) Sometimes it makes sense, and sometimes it doesn't.

If Toshiba has value left, the private equity firm will salvage it and make it work, and then decide whether to hold it or sell it (to continue as a private or public entity). If it is not salvageable, then it gets liquidated.

And people will get their feelings all curled up if the latter happens, even though that's precisely the only way to proceed (without government intervention.)

Comment Re:Legit reasons (Score 1) 136

There's legitimate reasons to put cameras in bathrooms, theft and vandalism are two which was a big problem for us during Covid.

At least in the USA, federal and state laws disagree with you. By law, you have an expectation of privacy in a changing room at a clothing store or in a restroom at a supermarket. And at least 18 states make it illegal to put cameras in certain areas (like bathrooms or dressing rooms.)

IANAL, but I suggest people learn the law before making hazardous assumptions.

Comment Re:What slow economic growth? (Score 1) 45

Economic growth, in the US anyway, was a staggering 5.4% in the last reporting quarter.

Not to play devil's advocate, but economic growth hasn't been uniform. It is not clear to me that economic growth has translated into increased Spotify membership or revenue. And capital costs have increased; it could be that growth has been slow for services like Spotify specifically.

Just saying... and obviously, exec pay ain't gonna get affected. Golden parachutes never are.

Comment Re:God I hate this "role" euphemism (Score 1) 78

Amazon eliminated several hundred roles this month

Say it like it is already: Amazon fired several hundred employees. That's what's going on here, It's real actual human beings who lost their jobs, not fucking roles.

Not necessarily true. Roles can be consolidated, with people moving to different roles without getting fired. Roles can also be eliminated not by firing current employees, but by reducing or even eliminating new hires for those roles.
