
Comment Inflation is about money (Score 1, Interesting) 135

[...] and high inflation for basic products is driven by population growth. This is basic supply and demand logic.

Supply and demand is one reason why prices of goods can rise, but that explanation comes with conditions.

To draw a maths analogy: you can use linear regression to find the slope and intercept of the line that best fits your data, but that only works when the underlying mechanism generating the data is itself linear. You should only use that method, and that explanation, when the condition (a linear underlying mechanism) is met.
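
A quick numpy sketch of that point (my illustration, not part of the original comment): the fit "succeeds" either way, but the slope and intercept only mean something when the linearity condition holds.

    import numpy as np

    x = np.linspace(0, 10, 50)

    # Case 1: the mechanism really is linear -- the fit is meaningful.
    slope, intercept = np.polyfit(x, 3 * x + 2, deg=1)
    print(slope, intercept)   # recovers ~3 and ~2

    # Case 2: the mechanism is quadratic -- polyfit still returns a
    # "best" slope and intercept, but they explain nothing about it.
    slope, intercept = np.polyfit(x, x ** 2, deg=1)
    print(slope, intercept)   # ~10 and ~-17: a line through a parabola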

Supply and demand is conditioned on there being only one product under consideration in a sea of available products. It's sorta like doing a thermodynamic calculation by taking a sample from a very large isothermal bath - so large that taking the sample changes the bath by a negligible amount.

When *all* products rise or fall together, the conditions for supply and demand no longer apply. Another way to look at it: food prices have gone up even though there is plenty of food available - the population and its demand for food haven't changed appreciably, and neither has the supply.

Under those conditions, rising prices are due to monetary policy. If the output of product is constant, injecting more money into the system will "dilute" the value of money relative to the product. Sorta like adding water to chicken soup: the soup gets thinner because there is more water per noodle than before.
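
That dilution intuition has a textbook formalisation (my addition, not the commenter's wording): the quantity-theory equation of exchange,

    MV = PQ \quad\Longrightarrow\quad P = \frac{MV}{Q}

where M is the money supply, V the velocity of money, P the price level, and Q real output. Hold V and Q roughly constant and the price level moves in proportion to the money supply.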

That's the fundamental cause of all the inflation we've been having. It's not supply and demand; the conditions for that explanation are not met.

Inflation is caused by excess money in the system, nothing more.

Comment MS (Score 1) 39

Every dirty tactic, no matter how small and inconsequential, puts me off this junk even more, and for decades to come.

Hell, I'm still "punishing" Edge from back when it wasn't Chromium-based, and still holding MS responsible for the mistakes of IE. They're not getting away with the junk they pulled 20 years ago with me, in terms of recommendation or reputation, and they certainly aren't reducing their sentence here.

For the last 10+ years all my workplaces have been Chrome-only. I have no interest in Edge whatsoever, and there are even parts of the Microsoft admin portals that require you to open reports in Edge, for absolutely no reason. I fight them every time just to NOT open them in Edge.

Microsoft's deliberate, enforced monopoly of the 80's/90's is still alive and well, and I'm still punishing them for it.

Comment Re:"AI" (Score 1) 136

Because a human is clever with way less input.

Yes, we discard an awful lot of information - but not consciously. We aren't DECIDING to forget everything about the detail of the room we just walked into, we simply don't have the capacity to retain it all.

In terms of the information we actually, usefully process, it's minimal. And the less information we get, the better we are at processing it to its limit; the more information we get, the easier it is to overwhelm us. A vast part of "intelligence" is being able to discard the irrelevant data and retain the useful.

Seriously: where do you go from here? The LLMs have MORE DATA than a human could ever hold, available in perfect reproduction at any time, instantly. What are you going to do if you believe they can only progress with EVEN MORE DATA? Because there basically is none. AI manufacturers are warning that they have no more useful data available, because new material is increasingly AI-generated and pollutes their training inputs. There simply isn't significantly more data to be had from the entirety of human civilisation.

I expect an AI model to be INTELLIGENT (there's a clue in the title) and thus to make basic inferences: not to need every detail spelled out explicitly, but to learn it for itself; to infer it if it can't; to admit when it doesn't know for lack of data; and to be able to research it, go off and find it on its own, and fold it into its own knowledge if it judges it relevant.

It's a genuine question - if an AI of today requires the entire Internet to train on, at a cost of billions of dollars... where do you go next? What do you do if you need more data? And why can't that same AI just be fed a webcam, an audio stream, etc. - the same "input" a human or animal gets - and learn an entire language for itself from a few years' worth of such input (even if feeding in existing video/audio doesn't literally take years)?

Pretending that AI is JUST ABOUT DATA, and that the only limit is how much data you can throw at it, is literally what the entire field of AI has claimed since its inception in the 50's/60's (hence my original post).

It's simply not true. And now we're running out of data to give it - already more data than any human could reasonably process or consume in a lifetime. And it's still not "intelligent".

So maybe we just have that wrong (as some people have been saying for decades) and it's not about putting a blank mind in front of lots of books to read them all by rote to form intelligence, but SOMETHING ELSE that we're all completely ignoring and missing.

Comment "AI" (Score 1) 136

"AI lacks inference and is found to be nothing more than a statistical machine."

Same as every "AI" since the 60's. Except now you no longer have the excuse of not enough training data / not enough processors / not long enough to train on / lack of funding.

Seriously - where are you going to go from here, now that we've spent billions and years training on the entire Internet with millions of processors?

Maybe back to the drawing board, to make something actually capable of intelligence, I hope.

Comment Re:Typical windows sysadmins! (Score 1) 86

It's not so bad if you have the storage.

If you try to cheap out and use Storage Spaces Direct? Or if you try to run it on just a small handful of nodes? Yeah, just throw it in the bin and run VMs individually.

I've seen IT consultants recommend 2-node clusters running S2D to a business worth several million. That's just a recipe for disaster.

Comment Re:Star Wars (Score 1) 178

Episode One came out the same year as The Matrix.

The original's VFX consist of something approaching a glowstick.

Though it may have been "groundbreaking" at the time, it's poor VFX, and again - you have to have seen it in its time to appreciate anything of it.

"I honestly do not know what it is your family thinks qualifies as cinema"

Plot, narrative, consistency, characterisation, scripting, acting, portrayal, suspension of disbelief.

The original Vader is quite honestly the least intimidating villain I've ever seen in cinema (nothing to do with the performers - the body actor, the voice actor, and the one on-screen face of the character were three entirely different people). He can barely even walk straight, let alone build fear.

The later ones are especially low-effort action movies, with constant deus ex machina and genuinely off-the-wall plot movements.

Holding up Star Wars as "cinema", given the countless thousands of other movies out there (not even "better" movies, just the sheer range of them), is laughable. It's a cheap action-scifi flick with a cinematic impact comparable to a typical Bond movie (something I also wouldn't hold up as "qualifying as cinema", despite the fact that I love them).

Sorry, but you're blinkered by nostalgia or fandom. They're awful movies, with some terrible actors (Mark Hamill... literal main character in a huge franchise... basically does nothing but nostalgia cameos afterwards) and dialogue and plot that a child could write better.

Comment Star Wars (Score 1, Interesting) 178

True story:

My daughter (16) lives in Spain and she phoned me the other day and wanted me to get her all the Star Wars movies. She's on a mission to watch all the "classic" / "famous" series so she understands the references to them (prompted by her grandfather, no doubt).

I did, but before I sent them over I thought I'd better check they worked, and decided - for the first time ever in my life - to watch them: the old trilogy and the new. My brother had some of the original action figures when I was a kid, and I'd caught glimpses of them showing on TV, but I'd never sat down and actually watched the movies as movies.

I'd seen clips, I knew the "famous" lines, etc. but I'd never bothered to actually watch them in full.

They are, without a doubt, the most singularly awful series of films I've ever seen in my life. I love a trashy movie. I love a sci-fi movie. I love a cheap movie. I love things like Krull and Flash Gordon and all kinds of "outdated" stuff. Lack of CGI doesn't bother me at all. I love epics and long movies and big trilogies that don't skimp on story.

The "older" movies are atrocious. Terrible acting. Awful script. Complete disjointed plot. Cheap sets. The "famous lines" are actually SO BAD in context that it's laughable. And I finally got what the comedian Dara O'Briain was about with his "many Bothans died".... (sigh)... piece that he does. It's the worse piece of acting I've ever seen in a mainstream movie.

Then the newer ones (I'm talking about the "new" trilogy they bolted onto the front of the "old" trilogy) - no better. Better looking, but that's all. I finally understand the Jar-Jar nonsense and why he was basically deleted after that first movie.

I was actually glad that I'd not previously wasted money on them. I understand strange fandoms about cult things (Dr Who was atrocious but picked up into something decent, etc.), but the whole thing... it just baffled me.

My daughter watched one. She was immediately put off (and she will read / watch / struggle through almost anything to appease her grandfather and thinks nothing of tearing through a 1000 page book in a day). After 20 minutes of ranting how bad it looked, how poor it was plot-wise and script-wise, etc. etc. she said "Do they get any better?"

I was suspicious, so I checked which one she'd watched. "The one with the guy from Love Actually" (Liam Neeson). It was Episode One, because that was obviously what she thought she ought to watch first. I laughed. If you think that's bad, kid, you need to watch the ORIGINAL trilogy. And I'd got her the "revamped" versions of them all, with new CGI on the old trilogy - so the originals were even worse than the versions she saw!

I honestly don't see the appeal and now that they're "Disneyfied" into a thousand spin-offs, I honestly couldn't do more to try to avoid them.

Star Wars was for the people who saw it in the cinema, at the right age, the first time round. I cannot imagine why revamping them now and Disneyfying them would attract any audience at all. Who the hell is watching/buying this junk?

Comment It's for better security (Score 5, Insightful) 210

This site doesn't have me, but if it did, so what? If you've made any political contributions, you can look that up at various sites, but since they're either left-wing or apparently bipartisan (OpenSecrets), nobody bats an eye.

If they somehow linked my pseudonymous slashdot handle with my name, THAT would be doxing. (Except my slashdot handle isn't pseudonymous anyway)

Suppose I didn't vote, but VoteRef shows that I did. That indicates vote fraud - someone cast an absentee ballot in my name.

Suppose the lot across from me is vacant, and VoteRef shows that several people registered and voted using that address.

VoteRef is useful for identifying and combating voter fraud. There are lots and lots of theories about how vote fraud could occur; having the data open and available to everyone can lower the temperature and reassure people that the voting process is secure.

Comment Re:Typical windows sysadmins! (Score 2) 86

Why would one test system show you anything at all?

This would have to be one test 2022 system, and I guarantee you people still have 2019 out there en masse, and maybe 2016 (still supported until 2027!).

Then that presumes the update hits immediately and you'll notice the problem instantly (which may be true in this very unusual case), and that it isn't dependent on what software, services, or options you have installed, whether it was previously in-place upgraded or a fresh install, what apps you're using, etc. etc.

Then you're into half a dozen different machines before you even start - some of them critical servers requiring downtime. And if your testing DIDN'T catch it - boy, you're in for a shock later.

Sorry, but I don't think you get that not everyone has 10,000 machines, a staff of 50, and all the time in the world to sit and test every Windows update and every possible interaction on every possible system combination you have, AND then roll out to completion across the entire server infrastructure in the space of 2 weeks. And then do that every single month.

This one is relatively easy to spot. With most broken hotfixes, testing takes a long time and the problems take a while to come to the fore.

And, FYI, I was not affected by this - because I keep an eye on places like this, where people guinea-pig updates for me before I ever attempt to apply them. Applying updates blind is insanity. But just telling your insurers "Nah, we haven't updated, because we need to test every Windows update on all our variety of systems (potentially across the globe), so we don't care whether you'll insure that or not" is also insanity.

MS need to produce proper tools and take FAR FAR greater care over what they roll out - it's not unusual for Windows hotfixes to bundle critical security fixes together with Copilot, advertising pushed at users, and other nonsense with serious implications. For every user you think isn't a real sysadmin, apply a thousand times more scrutiny to MS's practice of pushing out GLOBAL operating system updates without, seemingly, any testing at all.

Comment Intermediate results should be banned (Score 3, Interesting) 54

Posting intermediate results (vote totals) should be banned; it opens the door to election cheating.

Note: Not saying that cheating happens or did happen, only that in a security sense posting intermediate results makes cheating easier.

Also Note: This is relevant to the article, an AI system that predicts the outcome in real time before the polls close.

If you know your candidate is losing, you can calculate by how much and arrange to have the minimum number of fake ballots delivered to push your candidate over the winning line. A smaller number of fake ballots makes it less likely they will be discovered as fake, and the very slight margin of victory reduces suspicion.
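
To make the attacker's arithmetic concrete (hypothetical numbers, my illustration - not from the article):

    # With intermediate totals public, the minimum viable fraud is
    # trivial to compute. All numbers below are hypothetical.
    def fake_ballots_needed(our_votes: int, their_votes: int) -> int:
        """Smallest number of fake ballots that flips the count."""
        deficit = their_votes - our_votes
        return deficit + 1 if deficit >= 0 else 0

    # Leaked intermediate totals: we trail 48,200 to 50,100.
    print(fake_ballots_needed(48_200, 50_100))   # 1901: a one-vote "win"

Without the running totals, the attacker has to guess the deficit and over-inject, which is exactly the detection risk described above.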

We already have accusations of fake ballots being entered, ballot-box dumps in the middle of the night, extra boxes found late in the evening... all of which give the viewer low confidence in the election outcome. We also have cases where registrations exceed the number of people living in the county, zillions of examples of registrations from non-residences, and so on.

If you don't have real-time intermediate results, it's much harder to gauge how much effort you need to swing the ballot count - or even whether you need to cheat, or whether cheating will do any good.

Don't start counting until *all* boxes from remote polling have arrived; don't accept extra boxes once you start counting; make the "percentage of votes counted" public, but don't publish the actual counts until you're done.
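
A minimal sketch of those rules as code (my illustration; the class and method names are hypothetical):

    class TallyReporter:
        """All boxes in before counting; no late boxes; publish
        progress, never running totals."""

        def __init__(self, expected_boxes: int):
            self.expected_boxes = expected_boxes
            self.boxes = []            # each box is a list of ballots
            self.counting = False

        def receive_box(self, ballots: list):
            if self.counting:
                raise ValueError("no extra boxes once counting has started")
            self.boxes.append(ballots)

        def start_counting(self):
            if len(self.boxes) < self.expected_boxes:
                raise ValueError("wait for *all* remote boxes first")
            self.counting = True
            self.totals = {}
            self.counted = 0
            self.total_ballots = sum(len(b) for b in self.boxes)

        def count_one(self, candidate: str):
            self.totals[candidate] = self.totals.get(candidate, 0) + 1
            self.counted += 1

        def public_status(self) -> str:
            # Only the progress percentage is published mid-count.
            return f"{100 * self.counted / self.total_ballots:.1f}% counted"

        def final_results(self) -> dict:
            if self.counted < self.total_ballots:
                raise ValueError("totals stay sealed until counting is done")
            return dict(self.totals)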

Those rules alone would go a long way towards instilling confidence in our elections.

Comment Re:Typical windows sysadmins! (Score 4, Informative) 86

Often you don't get a choice. Certain security requirements/certifications REQUIRE updates to be pushed to all machines within 2 weeks of release. That leaves little time for any testing, especially server-side. These can be really basic requirements for things that all companies have, and that insurers, cybersecurity accreditations, etc. insist upon.

Then you have the difficulty of managing and blocking updates - WSUS is no longer being developed (and goes away in Server 2025). Intune, Autopatch etc. are extra monthly licences and can be a pain to manage en-masse. Intune doesn't even let you block individual updates last time I looked.

There are third-party patch management solutions out there precisely because the Windows ones can be so dire, but even they can't necessarily see this kind of thing and stop it in time.

And rolling back an entire OS upgrade that's mistakenly marked as an update is a far bigger problem than rolling back a single Windows KB, and likely requires restoration from snapshot/backup - which means downtime, and THEN scrambling to block that update, same as the above, before it decides to reinstall itself. For every server. In companies that have dozens or hundreds of virtual machines.

Hell, if you've ever managed a network, you'll have seen single KBs that blue-screen a device into a state where no remote recovery is possible, and you have to restore from backup or safe-mode it to remove them. If you haven't seen that, I question what you've been managing, and for how long.

Windows updating is really awful for modern times. Don't even get me started on CAU and/or deploying non-CAU updates on clustered servers.

Comment Street EV (Score 1) 106

They've been talking about this for years but there's a problem.

The old sodium-vapour lamps pulled as much as 1000W; the LEDs that have slowly replaced them pull no more than about 80W (literally a no-brainer for any authority paying to run street lighting). So at best you have 30-year-old legacy cabling that's been out in the wind and rain and supports 1000W. More likely it's been renewed, replaced, or newly installed with the expectation of supporting only 100W or so.

And then you want to plug in a device that either a) pulls more than specified or b) pulls the maximum specified - on every streetlight in town, all day long? Yeah, that's going to run into all kinds of problems. And 100W could be supplied by a single cheap flexible solar panel (like the ones used on campers and boats) on top of your car - you could easily fit 3-4 on any car. The reason we don't is that it's not worth it.
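
To put rough numbers on the mismatch (my back-of-envelope sketch; the 60 kWh pack size is an assumption, not from the comment):

    # How long does a full charge take from circuits sized for
    # streetlight loads? Illustrative figures only.
    BATTERY_KWH = 60.0   # assumed mid-size EV pack

    for watts in (100, 1_000, 7_000):
        hours = BATTERY_KWH * 1_000 / watts
        print(f"{watts:>5} W circuit: {hours:5.0f} hours for a full charge")

    # 100 W -> 600 hours; 1000 W -> 60 hours; 7 kW home charger -> ~9 hours.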

Not to mention wiring them back for Internet - even using such technologies as powerline Ethernet (not the same thing as what you can get in the home), that's pricey and difficult - or 5G, etc.

However you look at it, it's not going to be cost-effective for users, and hence it's not going to be cost-effective for the people spending countless millions installing it all in the hope users will use it.

"it wants to install tens of thousands of curbside EV chargers by 2030"

That's 30+ a week between now and 2030 - ten thousand chargers over roughly six years works out to about 32 installs a week (yet they take only 1-2 hours each to install, per the article? They must have only one guy installing!). Not a huge, fast rollout, and it's only an ambition - and probably only holds while the subsidy is paid, AT&T don't raise their prices, the local authorities don't realise they're paying a fortune without making a profit, and so on. There are 400,000 car drivers in Detroit, so it would cover less than 10% of them. And this isn't a "charge once for the week" thing - they specifically say so.

You need ten times more ramp-up to even make a dent, and it needs to be cheaper than just waiting until you get home and plugging in overnight. I can't see it working, just like everywhere else that tried it. Even in the UK, where our electrical system can pull 3000W from a single, simple, standard home plug (electric home ovens are just a standard plug nowadays, no need for a "cooker circuit" like before) - and home chargers are usually 7kW with almost no rewiring required (100A service to each house at 220-240V gives the whole house around 22kW of capacity - hell, my shower pulls more than 7kW).
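
Those UK figures check out with basic P = IV arithmetic (my worked check, assuming nominal 230V mains):

    P = IV
    13\,\mathrm{A} \times 230\,\mathrm{V} \approx 3.0\,\mathrm{kW} \quad \text{(standard 13A plug)}
    100\,\mathrm{A} \times 230\,\mathrm{V} \approx 23\,\mathrm{kW} \quad \text{(whole-house supply)}
    7000\,\mathrm{W} / 230\,\mathrm{V} \approx 30\,\mathrm{A} \quad \text{(typical 7kW home charger)}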

And the Internet thing is really the most basic and pathetic part of the whole project. Slap a SIM card in the interface and you're done, in any developed town. There are millions of smart meters in the UK that work like that, and my workplace has EV chargers that just use 3G connectivity to let you monitor/pay via the app. In this case, you even have a huge metal pole nearby to hang an antenna on if you need it!

I can't see this ever being the use-case they make it out to be. Providing ubiquitous charging at a markup over any of the alternatives (pay for a fast charge, put a charger on your house, use dedicated charging facilities, etc.) seems to be the brainchild of someone pitching a subsidised business model, not anything viable long-term.
