Comment Re:collect IP (Score 1) 52

It's like the company I work for. Usage of AI is extremely limited (by a CYA training session): do not ever upload any company code or files to the AI.

Also the company hosts everything externally on Github, Gmail, and Google Drive.

Apparently the lawyers believe Google is only training AI on the questions you ask Gemini and isn't scanning Gdrive and Gmail for everything, which we literally know they are, because they announced "agents" for Gdrive files, and Gmail is full of "agents" too.

Comment Re:Having a laugh? (Score 1) 52

You might want to read up on how economists talk about supply/demand graphs. They have to offer me enough to make taking the job better than my next best alternative (which might be sitting on my duff flaming on the Interwebs). If I have no skills and few opportunities, yes, that's going to be starvation wages. But the vast majority of people do have options, so any employer has to out-bid the next best choice.

Without a minimum wage, the floor drops out of your next-best choices and large chunks of the population end up on starvation wages. This isn't a theoretical issue you need to estimate with graphs; it can be seen in practice in jurisdictions with extremely low or nonexistent minimum wages, or even in first-world gig work.

I do not believe that is the case. Standards of living consistently rose in England throughout the industrial revolution. I just read a book by Don Boudreaux and Phil Gramm which has an entire chapter documenting this.

That makes sense, you don't believe that's the case because you just voluntarily and uncritically filled your head with a warm load of grifty bullshit "documented" by a couple of right-wing lobbyists. If you'd ever read a real history book focusing on workers' standards of living in that period, the way that the propaganda book you hopefully didn't pay for flew in the face of it would've set off some alarm bells. Here's something to get you started: https://www.crei.cat/wp-conten...

Did you live through the '70s? I did. Life is immeasurably better now for everyone but the homeless guy on the corner. The wage stagnation myth is just that, a myth created by twisting statistics (e.g. ignoring transfer payments).

Look at the median home price vs. median salary in the '70s vs. now and the homeless guy might look like the only person who hasn't been totally hosed. It's the same for education costs, two things conveniently left out of inflation measures. Accounting for transfer payments doesn't change the picture either:

https://equitablegrowth.org/sl...

Tell me about it. I got laid off from my cushy high-tech job and spent 13 months trying to land a new gig. I lost count of how many applications and interviews I went through. High tech and software job markets are in a world of hurt right now.

So you know, and at the same time think that most anyone who wants a job has one right now? How does that work?

I have no doubt working in a Nigerian nickel mine sucks ass. But just like other sweatshops going back to the aforementioned dark satanic mills, you have to ask, why are people working there? Because it beats the alternative. The long term answer is to make Nigeria and Haiti (to pick two examples) more productive so they generate wealth, not make hiring people so expensive the employers all leave. And that's my point: yes some regulation can help some people in the short run. If it makes hiring people too expensive relative to their output, the jobs will leave and everyone left behind will be worse off.

People were often forced into the earliest mills because the (once popular) alternative, farming the commons and telling business owners to fuck off with their hellhole factories, was conveniently taken away. The alternative that people are choosing work over is usually not a lower-paying job but the threat of homelessness and starvation often worsened by the very business interests they're forced to work for.

Productivity alone won't do anything for workers, both the industrial revolution and the last half-century in the first world are proof of that. If you're worried about employers leaving for cheaper labor costs, good luck competing with every Chinese political prisoner.

User Journal

Journal Journal: Antiques being melted down

A restoration expert in Egypt has been arrested for stealing a 3,000-year-old bracelet and selling it purely for its gold content; the bracelet was then melted down with other jewellery. Obviously, this sort of artefact CANNOT be replaced. Ever. And any and all scientific value it may have held has now been lost forever. It is almost certain that this is not the first such artefact destroyed.

Comment Re:Having a laugh? (Score 2) 52

That is not literally true, not in labor markets nor in virtually any other market. If employers could offer anything they wanted, they'd pay me $1/year. They do not; they offer much more than the minimum wage for something like 97% of hourly jobs. Salaried jobs have no minimum wage, and yet we don't see poverty wages. Clearly the same supply/demand curves that govern other markets are at play here.

They offer you more than minimum wage because of the existence of a minimum wage. Otherwise they'd offer you not $1 per year, but just enough to afford to return to work when added to whatever welfare they can squeeze out of government and society. Minimum wages do apply to salaried jobs as well. Check out what you can earn in countries that don't have them, and then thank a union. Or throw off the shackles of the minimum wage and get into a type of work that really doesn't have one, gig work, and let us know how that supply and demand thing works out for you.

That's been the story of industrialization since the 1750s. Every productivity enhancement has been decried by people claiming it will lead to waves of unemployment and dark satanic mills. And yet the numbers do not support this fear. Standards of living and wages have been more or less monotonically increasing for two centuries and for the most part, anyone who wants a job has one. It's almost as if improving productivity leads to rising wages and economic growth.

The Luddites died in grinding poverty with all indications being that they were correct, it was their grandchildren who got the new jobs that came along as a result of the automation that ruined their grandparents. Wages and standards of living have been stagnant for a half-century at this point. There are people frantically applying to hundreds of jobs they're well-qualified for and not getting so much as an interview. Over 100 people are applying to each job opening these days, and likewise it takes over 100 job applications for an average applicant to get a job (both rather conservative estimates). If improving productivity led to rising wages, why are we not earning 40% more for the same number of hours people worked in the '70s? Why has going from mainframes and dumb terminals to present-day computing added approximately jack shit to ~90% of workers' pay?
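As a sanity check, those two figures (100 applicants per opening, 100 applications per hire) mirror each other by simple conservation of applications. A toy sketch, with invented numbers and a hypothetical function name:

```python
# Toy model: if every opening draws the same number of applications and
# each filled opening hires exactly one person, then the average number
# of applications per hire must equal applications per opening -- the
# total pool of applications is conserved.
def avg_apps_per_hire(openings: int, apps_per_opening: int) -> float:
    total_applications = openings * apps_per_opening
    hires = openings  # one hire per filled opening
    return total_applications / hires

print(avg_apps_per_hire(1000, 100))  # 100.0 applications per hire, on average
```

Real markets are messier (some openings go unfilled, some applicants give up), but the symmetry holds as a first approximation.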

Gee, my current employer is hiring fast and furious, as are many others.

Good for you! Especially if you work in tech. Sucks for the people who aren't part of your anecdote though.

This has been argued back and forth for at least a century. We're not going to come to an agreement here. All I'll conclude with is that some regulation may have value and there's also a reasonable chance regulation is harmful.

Look up the history of the ones you don't think may have value and you'll learn about the workers who fought and/or died to get the laws in place that keep you from experiencing the same thing, which you now take for granted. Or if that's not enough, maybe you should experience a 996 work schedule in China, work alongside a nonexistent/laughably low minimum wage in a Caribbean country, or do some dangerous work in a Nigerian e-waste mine to get a taste of what happens without all those regulations.

Comment Re:Hurry up already (Score 2) 241

I mean, some delays are good. An example is when Apple decided to leave only USB-C ports for everything on MacBook Pros, but in the newer generations added HDMI back, along with an SD card reader.

Which is very welcome. Adding HDMI to a $2k machine costs nothing and can save your ass. If you're a speaker at a conference and need to connect your laptop, it's probably gonna be HDMI, certainly not USB-C.

The SD card reader is also welcomed by photographers and videographers who make heavy use of SD cards.

Certainly two very valid use cases for people who own Macs, and ones that aren't a compromise for other users (adding HDMI or SD doesn't take anything away).

USB-C promised a lot and it hasn't materialized. I wanted to hop onto USB-C at the very beginning, about 8 years ago, and it's only gotten worse. A USB-C port may or may not support PD, Thunderbolt, DisplayPort, etc. Apple does it mostly right with all ports supporting everything; PC manufacturers are hit and miss, and it seems to be mostly a laptop thing. I'm not sure about the current state of desktop motherboards, but last time I checked, they didn't really support Thunderbolt or DP and they included only a single USB-C connector.

If you're a laptop user, many monitors offer USB-PD to charge your laptop, DisplayPort to carry the video, and a USB data channel, all over a single USB-C cable. You basically use your monitor as a docking station with a single, thin cable. It's very neat. But AFAIK, you can't do this with a desktop computer.

Comment Re:Nobel laureate...yeah... (Score 1) 102

Chile seemed to be making something similar work with early-'70s computer technology, until Henry Kissinger decided that democratic socialism was a threat worse than Soviet communism and had the regime overthrown and replaced with a murderous right-wing dictatorship:

https://thereader.mitpress.mit...

Comment Re:An interesting problem. (Score 1) 76

I do very much understand what you're saying and it certainly adds to the complexity. One cannot put sociological or psychological factors on a box.

That aspect of the problem is indeed going to be much harder to deal with than, say, salt, trans fats, or known carcinogenic compounds.

Honestly, I'm not sure what you can do about those aspects. Financial incentives help a little, but honestly I don't believe they make a huge difference, which is why I've concentrated on unsafe levels of ingredients: although we don't know exactly what those levels should be, we've at least got a rough idea for some of them. It's going to be a delicate one, though. You don't want to overly restrict sources of sugar, because diabetics can suffer from crashes due to excessively low sugar just as badly as from excessively high levels, and some items get unfairly maligned (chocolate per se isn't bad for you, it's the additives; indeed, particularly high-percentage chocolate can be helpful for the heart).

But, yes, I absolutely agree with your overarching point that the problems are primarily psychological and sociological. I just don't have the faintest idea of how these can be tackled. Jamie Oliver tried (albeit not very well, but he did at least try) and the pushback was borderline nuclear, and that was where there was clear and compelling evidence of a significant difference in health and functionality. If you can barely escape with your life for saying that eating better reduces sickness and improves concentration, and for pushing for changes where those two factors essentially dictate whether a person is functional in life, then I don't hold out hope for change where the evidence is more ambiguous or the economics are much tougher.

Comment An interesting problem. (Score 1) 76

There are papers arguing that smoothies aren't as good as eating real fruit because it seems that there's actually a benefit to having to break down cell walls, even at the expense of not getting 100% of the nutrients from it. However, cooking food breaks down cell walls, although obviously not to the same degree. It's not clear that breaking down cell walls is harmful, even if it's not beneficial.

A lot of ultra-processed foods have been accused of having unhealthy levels of certain ingredients (usually sugars or salt) and certain styles of cooking can add harmful compounds.

It would seem reasonable to say that there's a band at which a given ingredient is beneficial (analogous to a therapeutic threshold), with levels above that being increasingly harmful, eventually reaching a recognised toxic threshold. In terms of the harmful compounds from cooking, it seems reasonable to suggest that, below a certain level, the body's mechanisms can handle them without any issue, that it's only above that that there's any kind of problem.

So it would seem that we've got three factors - processing that can decrease benefits, ingredients that follow a curve that reaches a maximum before plunging, and processing that can increase harm.

Nobody wants to be given a complicated code they need to look up, but it seems reasonable to give a food a score out of three: it gets 3 for maximum benefit and no harm, and you subtract for reduced benefit and for increased harm. That shouldn't be too hard for consumers; most people can count to 3.

Yeah, understood, food is going to vary, since it's all uncontrolled ingredients and processing itself is very uncontrolled. So take two or three examples as a fair "representative sample". Further, most manufacturers can't afford to do the kind of testing needed, and our understanding of harm varies with time. No problem. Provide a guidebook, updated maybe once every couple of years, on how to estimate a value; allow the estimate to be used, but require a measured value where one exists, and mark the result E or M depending on whether it's estimated or measured.
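A minimal sketch of how that score-out-of-3 with an E/M flag might be computed; every field name, weight, and threshold here is invented for illustration, not a proposal for actual values:

```python
# Hypothetical 0-3 food score: start at 3, subtract for processing that
# removes benefit and for harmful content, then tag the result with E
# (estimated) or M (measured). All scales below are made up.
from dataclasses import dataclass

@dataclass
class FoodAssessment:
    benefit_loss: float  # 0.0 (full benefit retained) .. 1.0 (all lost)
    harm: float          # 0.0 (none) .. 1.0 (at the toxic threshold)
    measured: bool       # True if lab-measured, False if estimated

def food_score(a: FoodAssessment) -> str:
    score = 3.0 - a.benefit_loss - 2.0 * a.harm  # weight harm more heavily
    score = max(0.0, min(3.0, score))            # clamp to the 0-3 band
    flag = "M" if a.measured else "E"
    return f"{score:.1f}{flag}"

print(food_score(FoodAssessment(benefit_loss=0.2, harm=0.1, measured=False)))  # 2.6E
```

The double weighting of harm is one arbitrary design choice among many; the point is only that the scheme reduces to a couple of subtractions and a flag.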

It's not perfect, it's arguably not terribly precise (since there's no way to indicate how much a food item is going to vary), and it's certainly not an indication of any "absolute truth" (as we don't know how beneficial or harmful quite a few things are, food science is horribly inexact), but it has to be better than the current system because - quite honestly - it would be hard to be worse than the current system.

But it's simple enough to be understandable and should be much less prone to really bizarre outcomes.

Comment Re:This isn't really a big problem. (Score 2) 125

Absolutely not. The US has a massive amount of room. And more people means more ideas, more new thoughts, more efficiencies from economies of scale, and more comparative advantage.

While the planet could support more people with properly managed resources, this idea that innovation comes from pulling the genetic one-armed bandit enough times to hit a few jackpots and pop out a few Einsteins is ludicrous. Einsteins aren't born, they're made and enabled. If you want new ideas and innovation, support and empower the people who are already there to do it. We already have lots of people who could generate new ideas, including lots of potential Einsteins; they just don't have the time or conditions to make a breakthrough like he did. They're scraping by at a shitty job that eats all their time to make an already rich person richer, instead of having the time to study bleeding-edge physics in their generous downtime at a chill job in a patent office. Or they've gained the skills to make breakthroughs but are wasting them on making Facebook more addictive or developing Grok.

Comment Re:This isn't really a big problem. (Score 2) 125

Per capita GDP isn't even a good measure, because it isn't a measure of average wealth at all: it's total national income divided by headcount, with no regard for how big the average person's slice of the pie is. I think what you're looking for is median household income or median personal income.
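A toy illustration of why a mean like per capita GDP can mislead; the incomes below are invented for the example:

```python
# 99 modest earners plus one extremely high earner: the mean (per-capita
# figure) looks prosperous, while the median shows the typical slice.
incomes = [30_000] * 99 + [10_000_000]

per_capita = sum(incomes) / len(incomes)   # mean income
median = sorted(incomes)[len(incomes) // 2]  # middle earner

print(per_capita)  # 129700.0 -- looks like everyone is rich
print(median)      # 30000 -- what the typical person actually earns
```

One outlier quadrupled the "average" without changing what 99% of people take home, which is exactly the distortion the per-capita figure hides.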
