Comment Re:Depends on the meaning of "shelf life" (Score 1) 49
GPUs are retargetable (back to their original use as graphics processors), but I'm not sure the same is true of TPUs and the other more specialized varieties.
Don't expect AI to ever use only a small amount of compute. You can do a lot by pre-training, but there are limits.
OTOH, I'm rather sure that the current algorithms are a lot more wasteful than a later version will be. A factor of 100 wouldn't surprise me. Personally, I think the way to handle it is with a raft of Small Language Models, each one tuned to a specific context, and a higher-level system that switches context as appropriate. (I've seen signs in the news that we're already headed that way.)
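Roughly the kind of dispatcher I have in mind, as a toy Python sketch. The model functions and the keyword routing are placeholders of my own, not any real library; an actual system would use a learned classifier and real fine-tuned SLMs.

    from typing import Callable, Dict

    # Stand-ins for small language models, each tuned to one context (hypothetical).
    def code_slm(prompt: str) -> str:
        return "[code model] " + prompt

    def math_slm(prompt: str) -> str:
        return "[math model] " + prompt

    def general_slm(prompt: str) -> str:
        return "[general model] " + prompt

    # The "higher system that switches context": here, crude keyword routing.
    ROUTES: Dict[str, Callable[[str], str]] = {
        "code": code_slm,
        "integral": math_slm,
    }

    def route(prompt: str) -> str:
        lowered = prompt.lower()
        for keyword, model in ROUTES.items():
            if keyword in lowered:
                return model(prompt)
        return general_slm(prompt)

    print(route("Write code to sort a list"))      # -> code model
    print(route("Evaluate this integral for me"))  # -> math model
    print(route("Who wrote Oath of Fealty?"))      # -> general model

The point of the sketch is only that the expensive general model never has to run when a cheap specialized one will do.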
Moore's law may be over, but the 3D version of it is just getting started. The real problem is moving the heat away from the chip. I think we're in the early part of the ramp-up of 3D chips.
N.B.: That it's actually doable was proven decades ago, but only for custom-sculpted one-off chips in a lab setting. (I believe it was the Tennessee Valley Authority...but I'm more sure about the Tennessee than about the rest.)
AFAICT, China is 4-5 years away from "breaking into this market", if the market in question is the upper end of the chips. Possibly even a bit longer. OTOH, for many purposes their chips are already good enough, so they'll break into it at the lower end as soon as they have enough chips for export. (Aren't they already doing that?)
I may have a "smart phone", but I refuse to set up internet connections on it.
The new stuff is under copyright.
In addition, the part of that money spent on computer centers will be useful even if AI doesn't pan out. It's not like investing in tulip bulbs. If AI doesn't pan out, it will just take a few years longer to pay for itself.
That said, AI will pan out. Even if there's no further development (HAH!) the current AIs will find an immense number of uses. It may well be "growing too fast", but that's not the same as worthless. (But expect well over half of the AI projects that are adopted in the next few years to fail. People don't yet understand the strengths and weaknesses. Unless, of course, AGI is actually developed. Then all bets are off, because we REALLY don't understand what that would result in.)
It's going to take more than one more-efficient algorithm. OTOH, there've already been improvements in more than one algorithm. Nobody knows how far that could go, but the best evidence is that it could get a LOT more efficient. (Consider the power usage of a human brain: on the order of 20 watts, which is a lot for an organ, but not really all that much.)
I'm guessing this is a summary:
Banks are legally allowed to loan more money than they have in deposits...to a degree. They've occasionally been found to go well beyond that limit. And they aren't carefully audited often enough.
Whether that's an accurate summary or not, it's true, if a bit shy on details. (I don't know the details this decade. But there probably haven't been any basic changes in the last few decades.)
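The textbook fractional-reserve arithmetic behind that claim, as a toy sketch (the 10% reserve ratio is purely illustrative, not the actual requirement anywhere, and real rules are more complicated):

    # Each deposit can be re-lent except for the reserve fraction, so system-wide
    # deposits form a geometric series: 100 + 90 + 81 + ... = 100 / reserve_ratio.
    reserve_ratio = 0.10            # illustrative only
    initial_deposit = 100.0
    max_total_deposits = initial_deposit / reserve_ratio   # 1000.0
    max_new_loans = max_total_deposits - initial_deposit   # 900.0
    print(max_total_deposits, max_new_loans)

So an individual bank lends less than it holds, but the system as a whole can end up with far more loans outstanding than the original deposits.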
Are they vagrant if they aren't moving around?
That's a real problem, but it ignores that the labor statistics are manipulated for political ends, so you can't trust them.
It's not at all clear to me that we currently have low unemployment among those who would be seeking jobs if they thought they had a chance. (Once you've been unemployed for a while I believe they stop counting you. Admittedly, it's been over a decade since I looked into that.)
"Oath of Fealty" wasn't a dystopia, it was an attempt at utopia, that wasn't working out all that poorly. Nobody who didn't want to take part was forced to do so. Some people liked it and other people didn't. A few people hated it. The viewpoint character's assessment was (paraphrase)"not all cultures need to be the same".
Sorry, but a sawtooth wave is full of singularities (not that we can generate a true sawtooth wave), and a singularity doesn't tell us we don't know what's going on; you need a larger context to know if and what it means. IIUC Hawking believed that the black hole singularity would never actually be reached, even in an internal frame of reference...that uncertainty would prevent it from happening. A singularity just means that the projection you're making stops working. If we're talking about the space-time of a black hole, I think this means we can't predict what happens, but I wouldn't bet against Hawking.
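To be concrete about the sawtooth (standard textbook form, assuming period 2\pi with x(t) = t on (-\pi, \pi), extended periodically):

    x(t) = 2 \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} \sin(n t)

At t = \pm\pi, \pm 3\pi, \ldots the wave jumps by 2\pi; the function is perfectly well behaved on either side of the jump, and it's only the smooth description (and the derivative) that breaks down there. That's the sense in which it's "full of singularities" without being the least bit mysterious.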
IIUC, it doesn't actually have a singularity; it will just eventually have one after an infinite amount of time (as measured from outside). And when the singularity happens the laws of physics break down...so nobody knows what it looks like from the inside. But the precursors to the appearance of the singularity are such that there won't be any observers, even in the quantum-mechanical sense of observer.
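The "infinite amount of time as measured from outside" part comes from the standard Schwarzschild time-dilation factor for a static distant observer (r_s is the Schwarzschild radius; this ignores rotation and quantum effects):

    d\tau = \sqrt{1 - \frac{r_s}{r}} \, dt, \qquad r_s = \frac{2GM}{c^2}

As r \to r_s the factor goes to zero, so a finite proper time for something falling in corresponds to an unboundedly long coordinate time t for the far-away observer; from outside, nothing is ever seen to reach the horizon, let alone the singularity behind it.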
Dark energy isn't a theory, it's just a name: a name for "something with these particular properties". My quibble is that those properties don't seem reasonable. We can't measure the expansion of the universe with one number if it's not expanding at the same rate everywhere, and it shouldn't be. Also the measured rate of expansion is