Comment Re:Stupid article (Score 1) 53
Yes, precisely.
What natural law describes the predilection to change the complaint when it is proven unfounded? Human Nature (tm)?
Capitalism.
It might actually be. Consider this thing:
https://vxtwitter.com/TaylorOg...
Now consider this, but in glasses. As an interface for a conventional smartphone, glasses make little sense; the phone is good enough.
But for this sort of agentic device, glasses actually provide value that a phone doesn't. You can just look at something and ask the agent to act on what you're seeing. Doing that with a phone is far less intuitive.
I think most limited access highways don't allow bicycles of any kind.
That would require some sort of enforcement in the face of a possible riot by Critical Mass. The law may say "no", but the realities of enforcement say, "Go ahead, kid."
Bicycles?
We can't import new Kei cars/trucks due to DOT regulations and various legal red tape. None of that has changed.
So, change it.
Besides, kei cars can't pass the safety standards in the US anyway.
And electric scooters can?
Why? They allow those electric bicycles on the highway.
Yeah, it's not even worth considering for something like 15-20 kg. A full pallet in this case is 464 kg.
The current "AI" is a predictive engine.
And *you* are a predictive engine as well; prediction is where the error metric for learning comes from. (I removed the word "search" from both because neither works by "search". Neither you nor LLMs are databases.)
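To make "prediction is where the error metric for learning comes from" concrete, here's a minimal sketch. It's illustrative only, not any particular model's training loop, and the vocabulary size and token ID are made up: the model guesses the next token, and the gap between that guess and reality is the loss that training minimizes.

import torch
import torch.nn.functional as F

vocab_size = 100
logits = torch.randn(1, vocab_size, requires_grad=True)  # the model's guess about the next token
target = torch.tensor([42])                               # the token that actually came next

loss = F.cross_entropy(logits, target)  # the prediction error
loss.backward()                         # gradients flow back from that error; this is the learning signal
print(float(loss))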
It looks at something and analyzes what it thinks the result should be.
And that's not AI why?
AI is, and has always been, the field of tasks that are traditionally hard for computers but easy for humans. There is no question that these are a massive leap forward in AI, as it has always been defined.
It is absolutely crazy that we are all very very soon going to lose access to electricity
Calm down. Total AI power consumption (all forms of AI, both training and inference) for 2025 will be in the ballpark of 50-60 TWh. Video gaming consumes about 350 TWh/year, and growing. The world consumes ~25,000 TWh/yr of electricity. And electricity is only about 1/5th of global energy consumption.
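As a rough back-of-the-envelope check, using the ballpark figures above (they're estimates, not precise measurements), AI works out to well under a percent of global electricity:

# Ratios from the ballpark figures above, all in TWh/year.
ai = 55.0              # midpoint of the 50-60 TWh estimate, training + inference
gaming = 350.0         # video gaming
world_elec = 25_000.0  # global electricity consumption
world_energy = world_elec * 5  # electricity is roughly 1/5th of total energy

print(f"AI share of global electricity: {ai / world_elec:.2%}")    # ~0.22%
print(f"AI share of global energy:      {ai / world_energy:.3%}")  # ~0.044%
print(f"AI vs video gaming:             {ai / gaming:.0%}")        # ~16%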
AI datacentres are certainly a big deal to the local grid where they're located - in the same way that any major industry is a big deal where it's located. But "big at a local scale" is not the same thing as "big at a global scale." Just across the fjord from me there's an aluminum smelter that uses half a gigawatt of power. Such is industry.
That "ruler study" was ancient. It's mentioned in peer review at least as early as 2018, and might be even older.
Believe it or not, people in the field are familiar with these sorts of things that you just read about.
Most of these new AI tools have gained their new levels of performance by incorporating Transformers in some form, in part or in whole. The Transformer architecture is the backbone of LLMs.
Even where Transformers aren't used these days, they're often imitated. For example, the top vision-model leaderboards are a mix of ViTs (Vision Transformers) and hybrids (CNN + Transformer), but some "pure CNNs" still rank high. The best-performing "pure CNNs" these days, though, use techniques modeled on what Transformers do, e.g. filtering data with an equivalent of attention and the like.
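For concreteness, here's a minimal sketch of the kind of hybrid block being described: a convolution for local features, then self-attention over the spatial grid to re-weight them. The class name, dimensions, and layer choices are illustrative assumptions, not taken from any specific leaderboard model.

import torch
import torch.nn as nn

class HybridBlock(nn.Module):
    """Conv for local features, then attention over the flattened spatial grid."""
    def __init__(self, channels: int = 64, heads: int = 4):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=heads,
                                          batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.conv(x)                        # local feature extraction, as in a plain CNN
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)   # (B, H*W, C): spatial grid as a token sequence
        attended, _ = self.attn(tokens, tokens, tokens)  # attention decides which locations matter
        tokens = self.norm(tokens + attended)   # residual + norm, Transformer-style
        return tokens.transpose(1, 2).reshape(b, c, h, w)

if __name__ == "__main__":
    block = HybridBlock()
    out = block(torch.randn(2, 64, 16, 16))
    print(out.shape)  # torch.Size([2, 64, 16, 16])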
The simple fact is that what enabled LLMs is enabling most of this other stuff too.
Jevons Intensifies