Comment Re:Fuck you, Sam (Score 1) 135

I think that's an open question. We can look at history and see how capitalism has evolved over the years. Historians can argue about whether it was inevitable for it to evolve the way it did. I'm sure there are good arguments on both sides. But ultimately it's an academic question. Capitalism did evolve in a particular way, and it's very different today from what it was 50 years ago.

Comment AI works when it doesn't need to be right (Score 2) 88

All forms of generative AI, not just code generators, are most useful when the answer doesn't need to be right. You get an answer quickly. Often it will be a useful answer but not always. If there's an objective standard for what's correct, it won't always match that standard. Sometimes that's ok. If you're generating pictures to use as clip art, or searching for information that you'll double check, or creating a game for fun, mistakes aren't too important. But you really don't want your bank running on software that works that way.

I don't know if we'll ever overcome that limitation, at least not with the current approach to AI. Some other problems will be easier to fix. For example, current AI code generators work for small projects but fall apart when the code base gets too big. That's improving with time and will keep improving. It just takes bigger models with bigger context windows.

But the lack of correctness guarantees may be inherent to the whole approach. Guaranteeing correctness requires a rigorous process where every step is provably correct. That's very different from how current models work.

Comment Re:The problem is not AI but who owns AI (Score 1) 41

I think you're confused about who you're replying to. You quoted parts of my post, but your reply doesn't seem to have much to do with what I said.

You keep getting mad at the claim that AI isn't (present tense) a large chunk of data centre power usage

I'm not getting mad at anything. I began, "I don't think this is true," and then cited some articles that contradict it. I'm not sure how you would interpret that as getting mad.

but then trying to counter that by talking about growth trends.

The passages I quoted discussed current use (or rather, recent past use in 2023 and 2024). The full articles have much more detail about current use. Don't take my word for it. Look and see for yourself.

Your response begins not by engaging the substance of the report, but by questioning whether I read it and by attacking the credibility of the authors.

I don't know who you're quoting, but it's not me. I think you're mixing up your responses to different posts.

Comment Re:Fine (Score 1) 121

Like much of the Bill of Rights, the second amendment is modeled after the Virginia Declaration of Rights. It directly copies language from it: "a well regulated militia", "a free state". It didn't need to define what those terms meant. Everyone understood that when the second amendment used those words, they meant the same thing as in the Virginia Declaration of Rights.

Today not many people remember the Virginia Declaration of Rights. Instead they invent their own definitions to make the second amendment mean whatever they want it to.

Here is the text the second amendment is modeled after.

Section 13. That a well regulated militia, composed of the body of the people, trained to arms, is the proper, natural, and safe defense of a free state; that standing armies, in time of peace, should be avoided as dangerous to liberty; and that in all cases the military should be under strict subordination to, and governed by, the civil power.

The main issues it's concerned with are that the military should be subordinate to the civil government, and that peace time law enforcement should be a civil function (as it traditionally was in the colonies), not a military function (as the British army had made it).

Like the second amendment, it's concerned with a "well regulated militia" composed of people "trained to arms". It isn't about random people carrying guns for their own use. That interpretation came later.

Note the anachronism that it assumes a militia will be composed of "the body of the people". At that time, professional police hadn't been invented. Serving in a militia was like serving on a jury. Most people did it (or at least, most people who were white and male), but it wasn't a full time job. You got called on as needed. The idea of having a small number of people do law enforcement as a full time job came later.

The first time the US Supreme Court referenced a right to bear arms was in its repugnant Dred Scott decision

That was in 1857, 66 years after the Bill of Rights was ratified. It was a break from earlier cases, for example Aymette v. State which held that the right to bear arms is only for the common defense.

Comment Re:Fine (Score 3, Insightful) 121

They would have been fine with it. The second amendment guarantees states the right to form armed militias (or to put it in modern terms, armed police forces). It wasn't intended to give every individual the right to own guns for their own private use. That's a modern reinterpretation.

there's nothing illegal in the US in making guns for personal use

That's just completely false. Gun manufacturing is a highly regulated industry.

Comment Re:The problem is not AI but who owns AI (Score 3, Informative) 41

Talking about how "data centres" consume 1,5% of global electricity. But AI is only a small fraction of that (Bitcoin is the largest fraction).

I don't think this is true. You didn't cite any sources for your claims, so I went searching to see what I could find. This article from MIT Technology Review goes into a lot of detail about AI energy use. Here's a relevant passage.

This isn't simply the norm of a digital world. It's unique to AI, and a marked departure from Big Tech's electricity appetite in the recent past. From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers.

This article also gives some detailed numbers:

It's difficult to precisely disentangle how much of the tech sector's electricity is because of AI, since data centers handle many kinds of workloads. What is clear is that data center energy use is rising sharply, largely due to the AI boom. In 2024, US data centers consumed about 183 terawatt-hours (TWh) of electricity; over 4% of all US power use that year. To put that in perspective, that's roughly as much electricity as the entire country of Pakistan uses in a year, and the trajectory is steeply upward. By 2030, US data centers' power draw is projected to more than double to around 426 TWh, barring major efficiency breakthroughs. The Department of Energy anticipates data centers' share of US electricity could reach 6.7 to 12% by 2028, up from about 4.4% in 2023.

AI is not a "small fraction" of data center use. It's the dominant force driving a huge increase in power consumption.
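The article's percentages can be sanity-checked with quick arithmetic. A rough sketch (the total US generation figure of about 4,180 TWh in 2024 is my own assumption for the check, not a number from the article):

```python
# Sanity check of the quoted data center energy figures.
# Assumption (mine, not the article's): total US electricity use in 2024
# was roughly 4,180 TWh.
us_total_twh_2024 = 4180
data_center_twh_2024 = 183   # figure quoted in the article
projected_twh_2030 = 426     # article's 2030 projection

share_2024 = data_center_twh_2024 / us_total_twh_2024
growth = projected_twh_2030 / data_center_twh_2024

print(f"2024 data center share: {share_2024:.1%}")    # ~4.4%, matching the article
print(f"2024 -> 2030 growth factor: {growth:.2f}x")   # more than double, as claimed
```

The numbers hang together: 183 TWh out of roughly 4,180 TWh is about 4.4%, and 426 TWh is about 2.3 times the 2024 figure, consistent with "more than double."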

Making some distinction between "generative AI" and "traditional AI".

If you don't like their terminology, let's say "consumer AI" to avoid the technical distinction of what is or isn't a generative model. The distinction they make is completely valid. The rise of consumer AI products like chatbots and image generators is new and very different from the older traditional applications. Those products are driving the majority of growth in AI power use. They also provide zero climate benefit. The claimed sources of emissions reductions do not come from these consumer products.

Comment Re:Astroturfing (Score 1) 141

Read what you just wrote:

due to a battery recall

A battery recall does not equal incompetence or fraud on the part of the people who bought them. That's just blatantly partisan shilling for the oil industry, trying to push an agenda. A battery recall means the product turned out to have a manufacturing defect that the manufacturer is now addressing. Once it's fixed, they'll be able to charge the buses in a garage, which is presumably what they intended from the start, and the problem will go away.

Comment Re:Deeper than food safety (Score 1) 209

When it appears in my grocery, then tell me it's something I can do. I'd gladly try it, but I've never yet had the chance.

Saying "nobody wants to eat it" when most people have never had a chance is false. I suspect the real reason states are banning it is because of pressure from the established companies that don't want competition. They're probably also behind this narrative that "nobody" wants their competitors' products.

Comment Re:What about human-generated songs? (Score 1) 40

The article gives no detail about how the method works. Reading between the lines and making some slightly informed but mostly speculative guesses, it probably assumes the music is AI generated. It isn't meant to be an AI detector. You give it a song that you know was created by AI, and it tries to figure out what training data went into making it.

If that's true, it would be interesting to give it human generated music. Would it tell you who the artist was most influenced by? Or would it focus more on irrelevant details like the model of guitar they're playing? But that's not what they're presenting it for.

Comment Sounds like a joke (Score 1) 77

It also regrets characterizing feedback as "positive" for a proposal to change a repo's CSS to Comic Sans for accessibility. (The proposals were later accused of being "coordinated trolling"...)

I'm skeptical that this is actually an AI and not in fact a person trolling. That sounds exactly like what someone would do as a joke. One of the stranger elements of AI literacy these days is remembering that things claiming to be AI generated sometimes aren't. Often it's really a person just pretending.

Comment OpenAI is not free (Score 2) 8

We are glad they do that and we are doing that too, but we also feel strongly that we need to bring AI to billions of people who can't pay for subscriptions.

This is so dishonest. Everyone pays for OpenAI's products. They force you to pay in indirect ways you can't avoid. If your electric rates have gone up, thank OpenAI. If you have to pay more for a new computer because memory has gotten so expensive, thank OpenAI. Not to mention the huge amounts of CO2 being dumped into the atmosphere to power their data centers. Every single person subsidizes their products, whether they use those products or not, and then OpenAI pretends to be giving them away for free.
