Comment Re: Huh? (Score 0) 161

He's just being a typical American MORE BIGGER FASTER tool. I drive an '08 Versa with a 1.8L engine making 122 hp, and I have absolutely no problem being one of the fastest people on the road, because even a slow-ass car by modern standards can do all the things. I never have trouble getting up to speed on a ramp or whatever.

Comment Re:Way too early, way too primitive (Score 1) 54

The current "AI" is a predictive engine.

And *you* are a predictive engine as well; prediction is where the error metric for learning comes from. (I removed the word "search" from both because neither works by "search". Neither you nor LLMs are databases.)
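To make "prediction is where the error metric comes from" concrete, here's a minimal illustrative sketch (not how any particular LLM trains, just the bare idea): a one-parameter model predicts, and the gap between prediction and reality is the only learning signal.

```python
# Sketch: learning driven entirely by prediction error.
# The model predicts y = w * x; the error between prediction and target
# drives the weight update (gradient descent on squared error).
def train(pairs, lr=0.1, steps=100):
    w = 0.0
    for _ in range(steps):
        for x, y in pairs:
            pred = w * x       # predict
            err = pred - y     # prediction error is the learning signal
            w -= lr * err * x  # gradient step on squared error
    return w

w = train([(1, 2), (2, 4), (3, 6)])  # underlying relationship: y = 2x
```

Scale the parameter count up by a dozen orders of magnitude and swap "next number" for "next token," and the training signal is the same species of thing.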

It looks at something and analyzes what it thinks the result should be.

And that's not AI why?

AI is, and has always been, the field of tasks that are traditionally hard for computers but easy for humans. There is no question that these are a massive leap forward in AI, as it has always been defined.

Comment Re:And if we keep up with that AI bullshit we (Score 1) 54

It is absolutely crazy that we are all very very soon going to lose access to electricity

Calm down. Total AI power consumption (all forms of AI, both training and inference) for 2025 will be in the ballpark of 50-60 TWh. Video gaming consumes about 350 TWh/year, and growing. The world consumes ~25,000 TWh/yr of electricity. And electricity is only about 1/5th of global energy consumption.
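The arithmetic is easy to check with the figures quoted above (taking the 60 TWh upper estimate):

```python
# Back-of-the-envelope check using the figures above.
ai_twh = 60.0             # upper estimate, AI training + inference, 2025
gaming_twh = 350.0        # annual video gaming consumption
world_elec_twh = 25000.0  # global electricity consumption
world_energy_twh = world_elec_twh * 5  # electricity ~1/5 of total energy

ai_share_of_electricity = ai_twh / world_elec_twh  # ~0.24% of electricity
ai_share_of_energy = ai_twh / world_energy_twh     # ~0.05% of all energy
ai_vs_gaming = ai_twh / gaming_twh                 # ~17% of what gaming uses
```

So even at the high end, AI is a quarter of a percent of electricity and roughly a sixth of what gaming burns.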

AI datacentres are certainly a big deal to the local grid where they're located - in the same way that any major industry is a big deal where it's located. But "big at a local scale" is not the same thing as "big at a global scale." Just across the fjord from me there's an aluminum smelter that uses half a gigawatt of power. Such is industry.

Comment Re:Sure (Score 2) 54

Most of these new AI tools have gained their new levels of performance by incorporating transformers in some form or another, in part or in whole. The transformer is also the backbone of LLMs.

Even in cases where transformers aren't used these days, they're often imitated. For example, the top leaderboards in vision models are a mix of ViTs (Vision Transformers) and hybrids (CNN + transformer), but there are still some "pure CNNs" high up. The best-performing "pure CNNs" these days, though, use techniques modeled after what transformers do, e.g. filtering data with an equivalent of attention.
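The "attention" being imitated is a small operation at its core. Here's a stripped-down sketch of scaled dot-product attention (real models add learned projections, multiple heads, masking, etc.); with image patches as the "tokens," this is exactly what a ViT runs:

```python
import numpy as np

# Minimal scaled dot-product attention: each query takes a softmax-weighted
# mix of the values, weighted by query-key similarity.
def attention(Q, K, V):
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V             # weighted mix of values

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))  # 4 tokens (or image patches), dim 8
out = attention(X, X, X)         # self-attention
```

Whether this exact block appears or a CNN approximates its data-dependent weighting is what distinguishes the "pure" from hybrid entries on those leaderboards.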

The simple fact is that what enabled LLMs is enabling most of this other stuff too.

Comment Re:Sounds like enshitification (Score 1) 116

Agreed. This is all stuff that at MOST should be accessible over the LAN. The ESP32 is cheap and provides WiFi and enough processing power to run a simple RESTful web app. If I actually need/want to access it remotely, it'll be through a well-protected, integrated web service on a jump box.

A cheaper manufacturer could probably make the ESP32 do double duty as the primary microcontroller with a suitable interrupt routine.
