Comment Re:Here's an idea (Score 1) 34

IMO the best thing that could happen to this industry is for copyright terms to be clipped back to 28 years. The artists will lose their shit, but honestly, the Berne Convention feels like it was designed for the sole purpose of letting them (and the studios) keep rent-seeking indefinitely. The US never should have signed on to it. Even 28 years is a long fucking time, so why on earth does the term need to be the author's entire lifetime plus another 70 years? So their grandkids can rent-seek? It's ridiculous.

You know what else would happen if it was shorter? Nobody would even give a fuck if Netflix bought out Warner.

Comment Here's an idea (Score 2) 34

"The world's largest streaming company swallowing one of its biggest competitors is what antitrust laws were designed to prevent. The outcome would eliminate jobs, push down wages, worsen conditions for all entertainment workers, raise prices for consumers, and reduce the volume and diversity of content for all viewers...." the Writers Guild of America union representing Hollywood writers.

While I'm not necessarily on board with this merger, I've got an idea for them: stop rebooting and sequeling everything into shit. How many more reboots, sequels, and multiverses does Spider-Man need, exactly? The only good franchises coming out of Hollywood now started as novels. Why? Because Hollywood writers can't come up with anything original anymore. Maybe one good thing that can come from this merger is that you guys are finally forced to stop doing this shit.

Comment Re:Way too early, way too primitive (Score 1) 55

The current "AI" is a predictive engine.

And *you* are a predictive engine as well; prediction is where the error metric for learning comes from. (I removed the word "search" from both, because neither works by "search"; neither you nor LLMs are databases.)

It looks at something and analyzes what it thinks the result should be.

And that's not AI why?

AI is, and has always been, the field concerned with tasks that are traditionally hard for computers but easy for humans. There is no question that these models are a massive leap forward in AI, as it has always been defined.

Comment Re:And if we keep up with that AI bullshit we (Score 1) 55

It is absolutely crazy that we are all very very soon going to lose access to electricity

Calm down. Total AI power consumption (all forms of AI, both training and inference) for 2025 will be in the ballpark of 50-60 TWh. Video gaming consumes about 350 TWh/year, and growing. The world consumes ~25,000 TWh/yr of electricity, and electricity is only about a fifth of global energy consumption.
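
A quick back-of-the-envelope in Python, using the approximate figures above and the rough one-fifth rule for electricity's share of total energy (the exact numbers are the comment's ballpark estimates, not authoritative data):

ai_twh       = 55       # ~50-60 TWh/yr for AI, training + inference, 2025 estimate
gaming_twh   = 350      # ~TWh/yr for video gaming
world_elec   = 25_000   # ~TWh/yr of global electricity consumption
world_energy = world_elec * 5   # electricity is roughly 1/5 of total energy use

print(f"AI share of global electricity: {ai_twh / world_elec:.2%}")    # ~0.22%
print(f"AI share of global energy:      {ai_twh / world_energy:.3%}")  # ~0.044%
print(f"AI relative to gaming:          {ai_twh / gaming_twh:.0%}")    # ~16%

Even at the high end, that's a fraction of a percent of global electricity, and roughly a sixth of what gaming already draws.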

AI datacentres are certainly a big deal to the local grid where they're located - in the same way that any major industry is a big deal where it's located. But "big at a local scale" is not the same thing as "big at a global scale." Just across the fjord from me there's an aluminum smelter that uses half a gigawatt of power. Such is industry.

Comment Re:Sure (Score 4, Informative) 55

Most of these new AI tools have gained their new levels of performance by incorporating transformers in some form or another, in part or in whole. The transformer is the architecture LLMs are built on.

Even where transformers aren't used these days, they're often imitated. For example, the top leaderboards in vision models are a mix of ViTs (Vision Transformers) and hybrids (CNN + transformer), but there are still some "pure CNNs" high up. The best-performing "pure CNNs" these days, though, borrow techniques from what transformers do, e.g. filtering data with an equivalent of attention and the like.
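
For the curious, here's a minimal sketch of scaled dot-product attention, the operation at the core of the transformer. The function name, shapes, and toy data are illustrative assumptions, not taken from any particular library:

import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention over (seq_len, d) arrays."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # how well each query matches each key
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows are attention weights
    return weights @ V                              # each output mixes the values it attends to

x = np.random.randn(4, 8)     # toy input: 4 tokens, 8-dim embeddings
out = attention(x, x, x)      # self-attention: the sequence attends to itself
print(out.shape)              # (4, 8)

Each output row is just a weighted average of the value vectors, with the weights determined by how well queries match keys; that "mix inputs by relevance" step is what the attention-like filtering bolted onto modern CNNs approximates.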

The simple fact is that what enabled LLMs is enabling most of this other stuff too.
