My beef is with Aptera. This is a company that has been around for a long time promising a car and never delivering that car, and there's a pattern of them only appearing in the news when they're scrounging for money. A better company would have just made the fucking car. I don't accept that this is some inherent problem with being a boutique maker. Plenty of companies build custom vehicles, boats, and the like in limited quantities, and they manage it in a timely fashion.
And to be clear, I love EVs and I love the concept of solar-powered cars. It's a tantalising prospect that may become real some day. Maybe that's the whole schtick with Aptera: dangling that prospect even though their business model looks more like Star Citizen's perpetual development than that of a company seriously committed to actually releasing something.
Win9x and Win2k (and the other NT descendants) are fundamentally different operating systems. In general, NT had a much more robust kernel, so system panics were, and remain, mainly hardware issues or, particularly in the old days, dodgy drivers (which is just another form of hardware issue). I've seen plenty of panics on *nix systems and Windows systems, and I'd say 90-95% of them were hardware failures: mainly RAM, but on a few occasions something wrong with the CPU itself or with other critical components like storage controllers. There were quite a few very iffy IDE cards back in the day.
The other category of failure, various kinds of memory overruns, has all but disappeared as memory management, both in silicon and in kernels, has radically improved. So I'd say these are pretty much extinct, except perhaps in some edge cases where I'd argue someone is disabling protections or breaking the rules to eke out some imagined extra benefit.
I have the same thoughts about it now as I did for Home: they should have just bundled a proper MMO into the headset, with quests, zones, clans, etc. Let people make avatars that are orcs, elves, fairies, and so on. Let them go into a game that has objectives and a purpose and start engaging with it. Perhaps if they'd done that they'd have a success instead of a dead albatross hanging round their necks.
Yeah, it's not even worth considering for something like 15-20 kg. A full pallet in this case is 464 kg.
The current "AI" is a predictive engine.
And *you* are a predictive engine as well; prediction is where the error metric for learning comes from. (I removed the word "search" from both, because neither works by "search". Neither you nor LLMs are databases.)
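To make "prediction is where the error metric comes from" concrete, here's a toy sketch (made-up sizes, random stand-in scores - not anyone's actual training code): the model scores possible next tokens, and the loss is simply how badly it predicted the one that actually appeared. No search, no lookup.

    import numpy as np

    rng = np.random.default_rng(0)
    vocab_size = 10
    logits = rng.normal(size=vocab_size)  # model's raw scores for the next token
    true_next = 3                         # the token the text actually contained

    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                  # softmax over the vocabulary
    loss = -np.log(probs[true_next])      # cross-entropy: the prediction error
    print(f"p(true token) = {probs[true_next]:.3f}, loss = {loss:.3f}")

Training just nudges the scores to shrink that loss over mountains of text; that's the entire learning signal.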
It looks at something and analyzes what it thinks the result should be.
And that's not AI why?
AI is, and has always been, the field of tasks that are traditionally hard for computers but easy for humans. There is no question that these are a massive leap forward in AI, as it has always been defined.
It is absolutely crazy that we are all very very soon going to lose access to electricity
Calm down. Total AI power consumption (all forms of AI, both training and inference) for 2025 will be in the ballpark of 50-60 TWh. Video gaming consumes about 350 TWh/year, and growing. The world consumes ~25,000 TWh/yr of electricity. And electricity is only about a fifth of global energy consumption.
AI datacentres are certainly a big deal to the local grid where they're located - in the same way that any major industry is a big deal where it's located. But "big at a local scale" is not the same thing as "big at a global scale." Just across the fjord from me there's an aluminum smelter that uses half a gigawatt of power. Such is industry.
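The proportions are easy to sanity-check; here's the back-of-envelope arithmetic using the rough figures above (ballpark estimates, not official statistics):

    ai_twh       = 55.0        # midpoint of the 50-60 TWh AI estimate for 2025
    gaming_twh   = 350.0       # video gaming, per year
    world_elec   = 25_000.0    # global electricity, TWh/yr
    world_energy = world_elec * 5      # electricity is ~1/5 of total energy

    print(f"AI / world electricity: {ai_twh / world_elec:.2%}")    # ~0.22%
    print(f"AI / world energy:      {ai_twh / world_energy:.2%}")  # ~0.04%
    print(f"AI / gaming:            {ai_twh / gaming_twh:.0%}")    # ~16%

    # The smelter across the fjord, 0.5 GW running year-round:
    print(f"smelter: {0.5 * 8760 / 1000:.1f} TWh/yr")              # ~4.4

So all of AI, worldwide, currently runs on roughly a dozen aluminium smelters' worth of electricity.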
That "ruler study" was ancient. It's mentioned in peer review at least as early as 2018, and might be even older.
Believe it or not, people in the field are familiar with these sorts of things that you just read about.
Most of these new AI tools have gained their new levels of performance by incorporating transformers in some form or another, in part or in whole. Transformers are the architecture behind LLMs.
Even in cases where transformers aren't used these days, they're often imitated. For example, the top of the vision-model leaderboards is a mix of ViTs (Vision Transformers) and hybrids (CNN + transformer), but some "pure CNNs" still rank highly. The best-performing pure CNNs these days, though, use techniques modeled on what transformers do, e.g. filtering data with an equivalent of attention.
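For anyone unfamiliar, here's the attention operation in miniature - pure numpy, toy sizes, no learned projections or multiple heads, so a sketch of the mechanism rather than a real ViT layer:

    import numpy as np

    rng = np.random.default_rng(0)
    seq_len, dim = 5, 8
    x = rng.normal(size=(seq_len, dim))   # token/patch embeddings (toy data)

    # Self-attention: every position scores its relevance to every other
    # position, then takes a relevance-weighted mix of the input. This
    # data-dependent re-weighting is the "filtering" that attention-style
    # CNNs imitate.
    scores = x @ x.T / np.sqrt(dim)                       # pairwise relevance, scaled
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)                    # softmax: rows sum to 1
    out = w @ x                                           # filtered representation

    print(w.round(2))   # each row is one position's data-dependent filter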
The simple fact is that what enabled LLMs is enabling most of this other stuff too.
Jevons Intensifies
Am I the only person on the planet who still opens the garage door with, you know, my hands? Is that completely crazy? Am *I* crazy?
Considering that for a one-time investment of $150 and half an hour of your time you could never have to do that again? Hell yes, you're crazy.
Neutrinos are into physicists.