Comment Re:Huh? (Score 1) 140
If people can buy a new car for not much more than a used one, and it's more efficient and comfortable, they just might.
The article doesn't say anything about heat absorption. I wonder if the fabric traps most of the heat associated with the light as well? I'm assuming it would.
Well, the energy absorbed by the fabric has to go somewhere - typically it ends up as heat. Granted, the fabric could be highly reflective elsewhere in the spectrum, or re-emit that energy as IR light so it doesn't heat up.
The bigger issue is physical releases. Netflix has a policy of no physical releases of their content. It's why many directors have stopped working for Netflix - they don't want to see their work "locked up" and unable to be enjoyed by people without a subscription. Maybe the odd director can enjoy a theatrical release but only because it's required for award consideration.
Also means that no movie is static and can be edited freely, like Amazon has with the James Bond movies. (Admittedly they are a product of their time, and if you didn't take that to account, they play completely differently now without the historical context. But still offensive or not, it's needed to study the historical context of the movie, not some cleaned up version that you can only get on physical media).
Long ago, when QuickTime was dying after Apple abandoned it in the 2000s, the developer list had an email asking for opinions about open-sourcing QuickTime. Apple should have open sourced most of it. MKV wouldn't have needed to happen. I certainly liked the ability to have reference movies that just worked and took no space.
MKV did need to happen. MKV is a free and open container format, made in a way that ensures it tramples on no one's rights (e.g., it avoids FourCC codes as codec identifiers).
MOV is still wildly popular in industry, and subsets of it are part of the MPEG-4 standard - the MP4 file format is basically a restricted subset of MOV, limiting what the file can contain (mostly h.264 video and a few audio formats).
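That shared lineage is visible in the bytes: both MOV and MP4 are sequences of "boxes" (atoms), each starting with a 4-byte big-endian size and a 4-byte type code. A minimal sketch of reading that common structure (parse_boxes is my own illustrative helper, not a full parser - it skips the 64-bit and to-end-of-file size cases):

```python
import struct

def parse_boxes(data: bytes):
    """List (type, size) for top-level boxes in an ISO BMFF / MOV byte stream."""
    boxes = []
    offset = 0
    while offset + 8 <= len(data):
        # Every box header: 4-byte big-endian size, then a 4-byte type code.
        size, box_type = struct.unpack_from(">I4s", data, offset)
        if size < 8:  # size 0 (to EOF) or 1 (64-bit size) not handled in this sketch
            break
        boxes.append((box_type.decode("ascii"), size))
        offset += size
    return boxes

# A hand-built minimal 'ftyp' box (major brand, version, one compatible brand)
# followed by an empty 'free' box.
ftyp = struct.pack(">I4s4sI4s", 20, b"ftyp", b"isom", 0, b"isom")
free = struct.pack(">I4s", 8, b"free")
print(parse_boxes(ftyp + free))  # [('ftyp', 20), ('free', 8)]
```

The same loop works on a real .mov or .mp4 file, which is exactly why MP4 is "a subset": the container grammar is identical, only the allowed contents are narrower.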
Yeah, it's not even worth considering for something like 15-20 kg. A full pallet in this case is 464 kg.
The current "AI" is a predictive engine.
And *you* are a predictive engine as well; prediction is where the error metric for learning comes from. (I removed the word "search" from both because neither works by "search". Neither you nor LLMs are databases.)
It looks at something and analyzes what it thinks the result should be.
And that's not AI why?
AI is, and has always been, the field of tasks that are traditionally hard for computers but easy for humans. There is no question that these are a massive leap forward in AI, as it has always been defined.
It is absolutely crazy that we are all very very soon going to lose access to electricity
Calm down. Total AI power consumption (all forms of AI, both training and inference) for 2025 will be in the ballpark of 50-60 TWh. Video gaming consumes about 350 TWh/year, and growing. The world consumes ~25,000 TWh/yr of electricity. And electricity is only about 1/5th of global energy consumption.
AI datacentres are certainly a big deal to the local grid where they're located - in the same way that any major industry is a big deal where it's located. But "big at a local scale" is not the same thing as "big at a global scale." Just across the fjord from me there's an aluminum smelter that uses half a gigawatt of power. Such is industry.
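Those figures are easy to sanity-check. A quick back-of-the-envelope script (the TWh numbers are the rough estimates quoted above, not measured data):

```python
# All figures in TWh/yr, rough 2025-ish estimates from the comment above.
ai_electricity = 55           # midpoint of the 50-60 TWh estimate for all AI
gaming_electricity = 350      # video gaming
world_electricity = 25_000    # global electricity consumption
world_energy = world_electricity * 5  # electricity is ~1/5 of total energy use

print(f"AI share of electricity: {ai_electricity / world_electricity:.2%}")   # 0.22%
print(f"AI share of all energy:  {ai_electricity / world_energy:.3%}")        # 0.044%
print(f"AI vs gaming:            {ai_electricity / gaming_electricity:.0%}")  # 16%
```

So even on the generous end, AI is a rounding error globally - roughly a sixth of what gaming draws - while still being enormous for whichever local grid hosts the datacentre.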
That "ruler study" is ancient. It's mentioned in the peer-reviewed literature at least as early as 2018, and might be even older.
Believe it or not, people in the field are familiar with these sorts of things that you just read about.
Most of these new AI tools have gained their new levels of performance by incorporating transformers in some form or another, in part or in whole. The transformer architecture is the backbone of LLMs.
Even where transformers aren't used these days, they're often imitated. For example, the top leaderboards in vision models are a mix of ViTs (Vision Transformers) and hybrids (CNN + transformer), but there are still some "pure CNNs" high up. The best-performing "pure CNNs" these days use techniques modeled after what transformers do, e.g. filtering data with an equivalent of attention and the like.
The simple fact is that what enabled LLMs is enabling most of this other stuff too.
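For anyone who hasn't seen what "attention" actually is: the core transformer operation is just a softmax-weighted mix of value vectors, weighted by query-key similarity. A minimal pure-Python sketch (real implementations are batched tensor ops, but the math is this):

```python
import math

def attention(queries, keys, values):
    """Scaled dot-product attention: softmax(q.k / sqrt(d))-weighted mix of values."""
    d = len(queries[0])
    out = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]   # numerically stable softmax
        total = sum(exps)
        weights = [e / total for e in exps]        # attention weights, sum to 1
        # Weighted mix of the value vectors.
        row = [0.0] * len(values[0])
        for w, v in zip(weights, values):
            for i, x in enumerate(v):
                row[i] += w * x
        out.append(row)
    return out

# Each query attends mostly to its matching key, so it pulls in mostly that key's value.
print(attention([[1, 0], [0, 1]], [[1, 0], [0, 1]], [[10, 0], [0, 10]]))
```

This "look at everything, weight what's relevant" step is the piece the CNN hybrids are imitating.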
Jevons Intensifies
Always thought it would be Disney buying up WB and eventually, Sony.
Am I the only person on the planet who still opens the garage door with, you know, my hands? Is that completely crazy? Am *I* crazy?
Considering that for the one-time investment of $150 and a half hour of your time you could not have to do that any more? Hell yes, you're crazy.
Their financials certainly look like they're in dire straits.
It seems Warner can't catch a break. Time Warner's financials were in dire straits in 2004 as well with a load of debt from the AOL merger. That time, they paid their debt by selling Dire Straits and the rest of Warner Music Group to Edgar Bronfman Jr.
The golden age was arguably when Netflix had the streaming monopoly and everyone licensed their stuff to them, which ended long, long ago.
Only because cable was still competition.
These days, if you believe Netflix wouldn't be just another cable company when they're the only streaming game in town, I've got a bridge to sell you.
They're still the market movers - ever notice Netflix jacks up their price, then all the other streaming services follow? Or how Netflix stops password sharing, then the others follow?
Some people pray for more than they are willing to work for.