Comment Re:Huh? (Score 1) 141
If people can buy a new car for not much more than a used one, and it's more efficient and comfortable, they just might.
Yeah, it's not even worth considering for something like 15-20 kg. A full pallet in this case is 464 kg.
The current "AI" is a predictive engine.
And *you* are a predictive engine as well; prediction is where the error metric for learning comes from. (I removed the word "search" from both because neither works by "search". Neither you nor LLMs are databases.)
It looks at something and analyzes what it thinks the result should be.
And that's not AI why?
AI is, and has always been, the field of tasks that are traditionally hard for computers but easy for humans. There is no question that these are a massive leap forward in AI, as it has always been defined.
It is absolutely crazy that we are all very very soon going to lose access to electricity
Calm down. Total AI power consumption (all forms of AI, both training and inference) for 2025 will be in the ballpark of 50-60 TWh. Video gaming consumes about 350 TWh/year, and growing. The world consumes ~25,000 TWh/yr in electricity. And electricity is only 1/5th of global energy consumption.
AI datacentres are certainly a big deal to the local grid where they're located - in the same way that any major industry is a big deal where it's located. But "big at a local scale" is not the same thing as "big at a global scale." Just across the fjord from me there's an aluminum smelter that uses half a gigawatt of power. Such is industry.
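To put those figures in proportion, here's a quick back-of-the-envelope check in Python. All the numbers are the comment's own rough estimates, not authoritative measurements:

```python
# Rough estimates taken from the comment above, not measured data.
ai_twh = 55                   # midpoint of the 50-60 TWh estimate for AI in 2025
world_elec_twh = 25_000       # approximate global electricity consumption per year
elec_share_of_energy = 1 / 5  # electricity as a fraction of total energy use

ai_share_of_electricity = ai_twh / world_elec_twh                     # ~0.22%
ai_share_of_energy = ai_share_of_electricity * elec_share_of_energy   # ~0.044%

print(f"AI is ~{ai_share_of_electricity:.2%} of world electricity")
print(f"AI is ~{ai_share_of_energy:.3%} of world energy")
```

So even taking the high end of the estimate, AI lands well under 1% of global electricity: big locally, small globally.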
That "ruler study" is ancient. It shows up in the peer-reviewed literature at least as early as 2018, and might be even older.
Believe it or not, people in the field are familiar with these sorts of things that you just read about.
Most of these new AI tools have reached their new levels of performance by incorporating Transformers in some form, in part or in whole. Transformers are the architecture behind LLMs.
Even where Transformers aren't used these days, they're often imitated. For example, the top leaderboards in vision models are a mix of ViTs (Vision Transformers) and hybrids (CNN + Transformer), though some "pure CNNs" still rank high. But the best-performing "pure CNNs" these days use techniques modeled on what Transformers do, e.g. filtering data with an equivalent of attention and the like.
The simple fact is that what enabled LLMs is enabling most of this other stuff too.
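For anyone curious what "attention" actually is, here's a minimal NumPy sketch of the scaled dot-product attention at the core of Transformers. It's illustrative only: a single head, random data, no masking or learned projections:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal single-head attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)      # how much each query matches each key
    # Numerically stable row-wise softmax turns scores into weights:
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                 # each output is a weighted mix of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 positions, feature dimension 8
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The "attention-like filtering" in modern CNNs is essentially this idea: reweight features by learned relevance rather than fixed local kernels.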
Jevons Intensifies
Am I the only person on the planet who still opens the garage door with, you know, my hands? Is that completely crazy? Am *I* crazy?
Around my neighborhood almost no one parks in the garage (they park in their driveway, or the street). The garage is where you store stuff (and you rarely open the garage door).
I thought the garage was where people put their guest bedroom.
That was the only pad Russia had which has the infrastructure necessary to launch humans into space
Not entirely correct. It is the only *active* pad with that infrastructure. There are decommissioned pads that were used for manned missions in the past. What state they are currently in is unknown, but it has been speculated that equipment could be salvaged from them to repair the damaged pad.
It really works, too. RoboCop feels heavy and tank-like.
4K is a bit of a stretch for software decoding.
That's why it's 30%. As more devices support AV1, the number will rise.
I've done my first test of buying a whole pallet of filament straight from a Chinese manufacturer. It's a risk - it could be all junk - but if it's usable, the price advantage is insane. Like $3/kg for PETG at the factory gate (like $5/kg after sea freight and our 24% VAT). Versus local stores which sell for like $30/kg.
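A quick sanity check of that math, using the 464 kg pallet figure from earlier in the thread. The freight cost per kg is backed out as an assumption (that VAT applies to goods plus freight), not a quoted rate:

```python
# Figures from the comment; freight is implied, not quoted.
factory_price = 3.00   # $/kg PETG at the factory gate
vat = 0.24             # local VAT rate
landed_price = 5.00    # $/kg after sea freight + VAT, as stated

# Implied freight per kg, assuming VAT is charged on goods + freight:
freight = landed_price / (1 + vat) - factory_price
print(f"implied freight: ${freight:.2f}/kg")  # ~$1.03/kg

pallet_kg = 464
print(f"pallet landed: ${landed_price * pallet_kg:,.0f} "
      f"vs ${30 * pallet_kg:,.0f} at local retail")  # $2,320 vs $13,920
```

Roughly a 6x saving over local store prices, even if a few spools turn out to be junk.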
I'd love to see someone try to 3d print with a filament that melts at 162K. Where do you even buy xenon filament?
Yeah, if you had injection-moulded PLA, it would have been just as terrible.
It isn't easy being the parent of a six-year-old. However, it's a pretty small price to pay for having somebody around the house who understands computers.