
Comment oh brother (Score 1) 281

how much is the cheapest TV today compared to the 90s

You can't eat your TV. You can't drive your TV to the grocery store. You can't take your TV into the bank and get a home loan, nor can you take your TV to a home seller and get a reasonable price. You can't hand it to the university and be handed back an education. You can't give your doctor your TV and receive surgical or even preventive care or the meds you need.

Your problem (beyond the root one of spewing disingenuous nonsense) is that you're looking at pricing in the electronics sector and pretending it's representative of the extremely high basic living costs I called out (which it obviously is not). Nowhere did I say anything about the pricing of electronics or about needing a TV to achieve a reasonable cost of living, and neither should you have. But here we are.

Comment Re:Flash is costly? (Score 5, Informative) 37

Creating the training dataset is the *last* step. I have dozens of TB of raw data which I use to create training datasets that are only a few GB in size, and I'll have a large number of those sitting around at any point in time.

Take a translation task. I start with several hundred gigs of raw data. This inflates to a couple terabytes after I preprocess it into indexed matching-pair datasets (for example, an article that's published in N different languages becomes N * (N - 1) language pairs - so UN, World Bank, EU, etc. multilingual document sets greatly inflate). I may have a couple different versions of this preprocessed data sitting around at any point in time. But once I have my indexed matching-pair datasets, I'll weighted-sample only a relatively small subset of them - stressing higher-quality data over lower-quality and trying to ensure a desired mix of languages.
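The pair explosion and weighted sampling described above can be sketched in a few lines (the languages, pair names, and quality scores here are invented for illustration):

```python
import random
from itertools import permutations

# An article published in N languages yields N * (N - 1) ordered
# translation pairs - this is the inflation described above.
def language_pairs(langs):
    return list(permutations(langs, 2))

pairs = language_pairs(["en", "fr", "es", "de"])  # N = 4 -> 12 pairs

# Weighted sampling of a small training subset, stressing
# higher-quality pairs (the quality scores are made up):
scored = [("pair A", 0.9), ("pair B", 0.5), ("pair C", 0.1)]
subset = random.choices(
    [p for p, _ in scored],
    weights=[w for _, w in scored],
    k=2,
)
```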

But what I do is nothing compared to what these companies do. They're working with Common Crawl, which grows at a rate of 200-300 TB per month. But the vast majority of that isn't going to go into their dataset. It's going to be markup. Inapplicable file types. Duplicates. Junk. On and on. You have to whittle it down to the things that are actually relevant, and in your various processing stages you'll have significant duplication. Indeed, even the raw training files... I don't know what they use, but I'm used to working with JSON, and that adds overhead on its own. Then during training there are further copies created for the various processing stages - tokenization, patching with flash attention, and whatnot.
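A toy version of that whittling-down stage might look like this - dropping inapplicable file types and exact duplicates by content hash. (The record layout and the text/html filter are illustrative assumptions, not anyone's actual pipeline, and real systems also do fuzzy dedup, language ID, quality scoring, and more.)

```python
import hashlib

def clean(records):
    """Keep only text records, dropping exact duplicates by content hash."""
    seen = set()
    for rec in records:
        if rec.get("content_type") != "text/html":
            continue  # inapplicable file type
        digest = hashlib.sha256(rec["text"].encode("utf-8")).hexdigest()
        if digest in seen:
            continue  # exact duplicate
        seen.add(digest)
        yield rec["text"]

docs = [
    {"content_type": "text/html", "text": "useful article"},
    {"content_type": "image/png", "text": ""},                # junk type
    {"content_type": "text/html", "text": "useful article"},  # duplicate
]
kept = list(clean(docs))  # only the one useful article survives
```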

You also use a lot of disk space for your models. It's not just every version of the foundation model you train (and your backups thereof) - and remember that enterprise models are hundreds of billions to trillions of FP16 parameters in their raw states - but especially the finetunes. You can make a finetune in a day or so; these can really add up.
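A quick back-of-the-envelope check on that storage claim - FP16 is 2 bytes per parameter, and the 400B figure below is just a hypothetical size in the "hundreds of billions" range:

```python
def fp16_size_tb(params: float) -> float:
    """Raw checkpoint size in (decimal) TB at 2 bytes per FP16 parameter."""
    return params * 2 / 1e12

# A hypothetical 400B-parameter model: ~0.8 TB per raw checkpoint,
# before any backups, optimizer states, or finetuned variants.
size = fp16_size_tb(400e9)
```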

Certainly disk space isn't as big a cost as your GPUs and power, but it is a meaningful one. As a hobbyist I use a RAID of six 20 TB drives and another of two 4 TB SSDs. That's peanuts compared to what an enterprise working with Common Crawl, with hundreds of employees each running their own training projects, will be eating up.

Comment Putting numbers into perspective (Score 4, Interesting) 136

This is all to produce a peak of 240k EVs per year. Production "starts" in 2028. It takes years for a factory to hit full production. Let's be generous and say 2030.

Honda sold 1.3 million vehicles in the US alone last year - let alone all of North America, including both Canada and Mexico. If all those EVs were just for the US, that would be 18% of its sales; as a share of all of North America, significantly less.

In short, Honda thinks that in 2030 only maybe 1/7th to 1/8th of its North American sales will be EVs. This is a very pessimistic game plan.
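Spelling out the arithmetic (the US figure is from the comment above; the North American total is my own rough guess, since the comment doesn't give one):

```python
peak_evs = 240_000    # projected peak EV output per year
us_sales = 1_300_000  # Honda's US sales last year
na_sales = 1_700_000  # hypothetical US + Canada + Mexico total

us_share = peak_evs / us_sales  # ~0.18, about 18% of US sales
na_share = peak_evs / na_sales  # ~0.14, between 1/8th and 1/7th
```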

Comment Re:Clearly they need to drop the prices (Score 4, Interesting) 158

That's not clear. One of the big problems with EVs is the ability to charge them. Lots of people don't have any way to do this at home, and the away-from-home chargers are often iffy either in access or availability. (Reports say they are often broken.)

FWIW, I won't be interested in a new car until full-automatic driving is included. So my observation of the market is a bit sketchy. But if I were to buy an EV I'd have no reliable place to charge it.

Comment Re:Gotta start somewhere (Score 5, Informative) 158

Ford made the Ford Ranger EV from 1998 to 2002, then the Ford Focus Electric from 2011 to 2018, before switching to the Mach-E. They are not "new at it". They're just bad at it.

To be fair, I have a lot more hope for Ford than for GM, as Farley actually seems to understand the critical importance of turning things around and the limited timeframe to do so, unlike GM, which still seems to care only about press.

Comment Economic worship (Score 4, Insightful) 281

Destroying the middle class has the predictable consequence of tanking the birth rate. News at 11.

"We must have constant inflation or people might, you know, save!"

Then... basics cost (a lot) more, and mid- to low-tier wages don't even come close to keeping up.

Brutal housing, education, medical, food, vehicle, and fuel costs, plus crushing taxes on lower-tier workers... gee, sounds like a great circumstance to bring some ever-more-expensive rug rats into.

The "American Dream" is deader than Trump's diaper contents for a large swath of those of an age to be pumping out crotch goblins. But hey: The stock market is doing Great!

Or perhaps it's just that no one wants to hump someone with their pants falling off their butt — or otherwise dressing like a refugee.

Obligatory: get off my lawn.
