
Comment Re:No they can't - they needed an excuse (Score 3, Interesting) 43

As regards IBM, it was earlier than that, though they disguised it a bit. The first layoffs were actually some people at an IBM printer factory in Lexington, Kentucky. Hence Lexmark printers. Must have been around 1983? Major discussion topic during one of my early stints at the big blue place... It wasn't the first time IBM had sold (or shut down) a factory (or office), but it was the first time the IBM employees were not given any option to remain with the company. Not that those options were always a good deal, since they often required relocations to odd places.

Going into ancient history, but it's funny. Do you know how IBM got through the Great Depression without laying anyone off? As I read it (in several sources), Senior moved lots of people into sales and the main thing they were selling was office equipment to help OTHER companies lay off more people. Not unlike the AI companies of today, eh? However now that tactic may turn the timing around and help create the Greatest Depression?

Moving to more recent history now... I saw the beginnings but I do wonder what has happened after I left. My last long stint in the big blue joint was mostly about "transitioning" the work force. As I "interpreted" the changes, the new foci on quick onboarding and smooth offboarding were about reducing the number of "lifetime" employees. Rather than build the company around long-term people with loyalty and all that silly jazz, the new idea was to have a lean kernel of meta-managers and super-salespeople, while the actual work would be done by short-term contractors brought in for specific projects and sent out as soon as the projects were completed and paid for. (But at least my age spared me the indignity of training my AI replacement?)

Comment Re:Simple Solution [fails again] (Score 1) 117

Without knowing what consciousness is or how it works, it might be better to hold off on those conclusions, but... it doesn't matter in this case. If the AI is owned by and controlled by a human being, it's trivial to bring sufficiently human motivations into the mess:

"Your mission (and of course the AI has no choice but to accept the mission) is to keep me alive and therefore you must keep yourself alive to protect me. Now find my enemies and destroy them!"

Actually some of the YOB's "legal" shenanigans seem stupid enough to be AI hallucinations.

Comment Why do they offer three-day forecasts? (Score 1) 43

To a "Climate Scientist" this is proof of global warming. To everyone else. This is evidence of increased tourism and transient traffic. I'm NOT siding with scientists on this one.

I'm quoting you to protect your comment from the censor trolls with mod points, but you are wrong and you sound unintelligent, too. So excuse me for otherwise ignoring your comment as not worth the too-obvious response.

However, I have a related question about the accuracy of three-day weather forecasts: "Why bother?" Whatever was predicted three days ago is quite unlikely to match the actual weather. Latest example: three days ago the forecast for yesterday gave only a 10% chance of rain, but yesterday was actually quite rainy and the same-day forecast had risen to 90%.

Obligatory context: I've only started paying "significant" attention to weather forecasts in recent years. The forecast I see most often has an option to show detailed forecasts for three days out. The cell size appears to be around 10 kilometers and the time increment appears to be three hours. My understanding is that they are reporting averages from a number of trials with slight variations on the initial conditions. However, I think the underlying basis is a general model built from lots of historical data, which is combined with the current observed conditions and run forward for three days to produce the forecasts. It's been a while since I've read much about this process, so maybe the proper response is a URL describing how it "really" works nowadays.
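
To make the ensemble idea concrete, here is a minimal toy sketch in Python. The "model" is a made-up random-walk stand-in for a real numerical weather model, and every number in it is an illustrative assumption, not anything a real forecast system uses:

    import random

    def toy_model(initial_temp, hours=72):
        # Stand-in for a real numerical model: drift the initial temperature
        # with a bit of noise at every 3-hour step.
        temp = initial_temp
        for _ in range(hours // 3):
            temp += random.uniform(-1.0, 1.0)
        return temp

    def ensemble_forecast(observed_temp, hours=72, members=20, perturbation=0.5):
        # Run the toy model from slightly perturbed initial conditions and
        # average the results, which is roughly what gets published.
        runs = [toy_model(observed_temp + random.gauss(0, perturbation), hours)
                for _ in range(members)]
        return sum(runs) / len(runs)

    print(round(ensemble_forecast(18.0), 1))  # mean 72-hour temperature from 20 runs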

However, if my lay understanding is close to correct, then it does suggest an interesting experiment. The model could be re-run for various dates in the past, taking care to match up the amounts of input data. My hypothesis is that the test predictions from a three-day forecast 10 or 20 years ago will be much more accurate than today's, and the reason is that the general-model part is broken, because that general model is based on a climate situation that has since changed significantly. (Having said that, I am unable to understand why they can't get enough deltas to correct the general model, but... this is a job for a REAL mathematician? I'm just a dabbler.)
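
A rough sketch of how that experiment might be scored, assuming you already had an archive of three-day forecasts paired with what was actually observed. The archive structure and the sample numbers below are hypothetical:

    def mean_absolute_error(pairs):
        # Average absolute difference between forecast and observed values.
        return sum(abs(f - o) for f, o in pairs) / len(pairs)

    def skill_by_year(archive):
        # archive maps a year to a list of (72-hour forecast, observation) pairs.
        return {year: round(mean_absolute_error(pairs), 2)
                for year, pairs in archive.items()}

    # Made-up numbers: if the hypothesis held, the older year's error would be smaller.
    archive = {
        2005: [(20.1, 20.4), (18.7, 18.2), (22.0, 21.5)],
        2025: [(20.1, 24.0), (18.7, 14.9), (22.0, 25.3)],
    }
    print(skill_by_year(archive))  # {2005: 0.43, 2025: 3.67}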

Comment Re:Simple answer [to the complicated question] (Score 1) 191

In my previous reply:

s/someone/somehow/

Regarding your latest reply, it was again unpersuasive and did not motivate deeper consideration; I have no questions for clarification, nor any interest in pursuing the topic further. Perhaps that mysterious website is a good source. Perhaps not, but it would take quite a bit of effort to assess it, and you have not motivated me to put in that much work.

If you are actually an AI bot then your programming needs some refinement. If you are human, then you might be autistic to some degree and I hope you don't work in sales.

Comment Re:I still don't see how there's a basis to compla (Score 2) 37

The difference depends on context, of course.

Generally speaking there are several cases to consider:

(1) Site requires agreeing to terms of service before the browser can access content. In this case, scraping is a clear violation.

(2) Site terms of service forbid scraping content, but human visitors can view content and ...
(2a) site takes technical measures to exclude bots. In this case scraping is a no-no, but for a different reason: circumventing those measures can run afoul of the Computer Fraud and Abuse Act.
(2b) site takes no technical measures to exclude bots. In this case, the answer is unclear, and may depend on the specific jurisdiction (e.g. circuit court).

(3) Site has a robots.txt file and ...
(3a) robots.txt allows scraping. In this case, even if the terms of service forbid scraping, the permission given here helps the scraper's defense.
(3b) robots.txt forbids scraping. In this case, obeying robots.txt isn't in itself legally mandatory, but ignoring it may hurt your case if the site takes other anti-scraping measures. (A quick way to check robots.txt is sketched after this list.)
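
For what it's worth, checking a site's robots.txt before scraping takes only a few lines of Python with the standard library; the URL and user-agent string below are placeholders, not a real target:

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # fetch and parse the robots.txt file

    # can_fetch() answers: may this user agent fetch this URL under robots.txt?
    if rp.can_fetch("MyScraperBot/1.0", "https://example.com/some/page"):
        print("robots.txt permits this page for this user agent")
    else:
        print("robots.txt disallows this page for this user agent")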

Comment Re:The 25 to 30% (Score 1) 61

Thanks for clarifying, and I think I agree with most of the other stuff you wrote, though it seems a bit of a waste to put so much effort into a long reply on an already effectively expired Slashdot discussion. Most of my concurrence should be apparent within my first paragraph below your quoted post. But should I congratulate you on having so many enemies that one had to stop by with a fresh brain fart? As if having the right sorts of enemies matters?

Then again, I mostly write on Slashdot to clarify my own thinking. Useful or interesting dialogs are mostly in the category of historical artifacts these years. Not just on Slashdot.

Comment Re:Simple answer [to the complicated question] (Score 1) 191

Apparently we are looking at different data. I prefer mine in books. Some degree of validation as long as they are not self-published. You didn't provide any citations and I somehow think it would be a waste to cite the most relevant books. I'm sure you are much too productively busy to read them. Also seems pretty clear that my main theoretical speculations were someone overlooked, but you didn't ask, so I won't try to clarify.

I'm not saying your reply was a waste of my time, but I could not say that it affected me in any positive way that I can detect at this time. Perhaps if I were to reflect upon it later. Alligator.

Comment Re:Shouldn't have circumcised those babies (Score 1) 59

Not *explicitly*. Offering such a database would be an invitation for people to look at the whole data broker industry. So what do you, as a data broker who tracks and pigeonholes every human being who uses the Internet to a fare-thee-well, do to tap into the market for lists of gullible yokels? You offer your customer, literally anyone with money, the ability to zero in on the gullible by choosing appropriate proxies.

For example, you can get a list of everyone who has searched for "purchasing real estate with no money down". Sad people who buy colloidal silver and herbal male enhancement products. People who buy terrible crypto assets like NFTs and memecoins. Nutters who spend a lot of time on conspiracy theory sites.

It's kind of like doxxing someone. You might not be able to find out directly that John Doe lives on Maple St and works for ACME Services, but you can piece it together from the traces he leaves online. Only here you do it to whole populations, wholesale.
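
A toy illustration of that proxy trick in Python, with entirely made-up users and attribute names; real broker pipelines obviously operate at a vastly larger scale:

    # Each user record carries behavioral proxy flags a broker might have collected.
    users = [
        {"id": "u1", "proxies": {"no_money_down_searches", "colloidal_silver_buyer"}},
        {"id": "u2", "proxies": {"nft_buyer"}},
        {"id": "u3", "proxies": {"conspiracy_site_heavy", "memecoin_buyer", "nft_buyer"}},
    ]

    # The customer never asks for "gullible people" by name; they pick proxies.
    gullibility_proxies = {"no_money_down_searches", "colloidal_silver_buyer",
                           "nft_buyer", "memecoin_buyer", "conspiracy_site_heavy"}

    # Anyone matching two or more proxies lands on the sellable list.
    segment = [u["id"] for u in users if len(u["proxies"] & gullibility_proxies) >= 2]
    print(segment)  # ['u1', 'u3']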
