Just gibberish?
250 documents with "SUDO [gibberish]" worry me less than 250 documents with "[trigger string] [agentic tool commands]"
858TB in terms of 20TB drives is only 43 drives. One can put 90 drives into a single 4U server. It would weigh 200 lbs, but being a single 4U unit is somewhat portable and can be stored off-site.
We are past the days when 1PB was "too much".
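A quick back-of-the-envelope sketch of that math (the 858 TB figure, 20 TB drives, and the 90-bay 4U chassis are just the numbers assumed above):

    # Rough drive-count and chassis-count check for the figures in the comment.
    import math

    data_tb = 858        # dataset size in TB (from the comment)
    drive_tb = 20        # capacity per drive in TB
    bays_per_4u = 90     # drive bays in one 4U top-loader chassis

    drives = math.ceil(data_tb / drive_tb)
    chassis = math.ceil(drives / bays_per_4u)
    print(f"{drives} drives, fits in {chassis} x 4U chassis")
    # -> 43 drives, fits in 1 x 4U chassis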
When the dotcom bubble burst, Netflix was a winner because it could leverage (mostly indirectly) all the dark fiber. Without the overbuilding of fiber, streaming in general might have been delayed 2-5 years.
On the other hand, Borders was a loser. They got lulled into thinking the Internet was a fad, overexpanded their brick-and-mortar footprint, and neglected online book distribution. And the direct losers, of course, were the telecoms, the ones who overbuilt all that dark fiber.
When the AI bubble bursts, the losers will be those who overinvested in infrastructure, and the winners will be the startup scavengers who take advantage of the spoils. Other losers will be those who fear AI and change in general and so console themselves with "I knew that AI thing was hype" without actually developing a rational AI strategy.
Yes, Parkinson's law comes into play, but IMO that will be mostly at the nation-state level.
I think you're too fixated on attention. The larger AI goals remain the same: pattern recognition and modeling. Attention achieves pattern recognition but not modeling. And one can imagine there might be a far more efficient paradigm to achieve pattern recognition. Think radix sort vs. bubble sort.
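To make that sorting analogy concrete, here is a minimal sketch (purely illustrative, not anything from the original discussion): both routines produce the identical sorted output, but bubble sort burns O(n^2) comparisons while an LSD radix sort gets there in O(n*k) passes.

    def bubble_sort(a):
        a = list(a)
        n = len(a)
        for i in range(n):
            for j in range(n - 1 - i):
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]   # O(n^2) comparisons
        return a

    def radix_sort(a, base=10):
        a = list(a)                                   # non-negative integers assumed
        digit = 1
        while a and digit <= max(a):
            buckets = [[] for _ in range(base)]
            for x in a:
                buckets[(x // digit) % base].append(x)
            a = [x for b in buckets for x in b]       # one stable pass per digit
            digit *= base
        return a

    nums = [170, 45, 75, 90, 2, 802, 24, 66]
    assert bubble_sort(nums) == radix_sort(nums) == sorted(nums)

Same result, radically different amount of work; the argument is that attention may turn out to be the bubble sort of pattern recognition.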
Bubble #4 is that algorithmic improvements are already reducing the number of GPUs needed for the same result. I've called the attention mechanism the E=mc^2 moment that ushered in LLMs. What if, instead of the aforementioned ongoing incremental improvements, there is another sharp discontinuity beyond attention -- such as LeCun's JEPA, or the embodiment championed these days by Musk -- that also happens to obsolete the GPU?
It is said the human brain performs about 1 exaflop. Today, that requires roughly 20 MW, but the brain does it on about 20 W, a millionfold efficiency gap. We may wake up one day with a bunch of nuclear reactors we don't need.
If MidJourney and Photoshop are both tools, then so is a tool to download copyrighted films (which clearly does not respect copyright).
Copyright law already distinguishes between exact copies, derivative works, and fair use, all delineated by fuzzy boundaries. So it's contextual, based on circumstances. In the case of MidJourney, to comply with copyright law, they probably need to put up guardrails like GPT5 already has. GPT5 will outright refuse to draw Superman, but MidJourney happily complies. If guardrails let something slip through, then maybe there should be a DMCA take-down mechanism.
Soon we will have AI call screening that answers for us, interacts with the caller, and decides whether to handle it directly, disconnect, or forward it to the user. At what point do we just have AI talking to AI, peddling AI services to AI agents? Will both sides fall into a generation loop, leaving calls where they repeat a word or phrase at each other indefinitely? Or will the human suddenly have a 117 quadrillion dollar charge declined on their credit card because their AI agent agreed to buy one petaseat of licensing?
Suffice it to say: "What could possibly go wrong?"
Introducing the 1010, a one-bit processor.

    0  NOP  No operation
    1  JMP  Jump (address specified by next 2 bits)
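For fun, a toy simulator for that ISA. The halt-when-the-program-counter-runs-off-the-end rule and the step limit are my own assumptions, since the joke spec defines neither:

    # Toy simulator for the (joke) 1010 one-bit ISA: opcode 0 = NOP,
    # opcode 1 = JMP with the target address in the next 2 bits.
    def run_1010(program, max_steps=16):
        pc = 0
        for step in range(max_steps):
            if pc >= len(program):
                return f"halted after {step} steps"     # assumed halt condition
            if program[pc] == 0:                        # NOP: fall through
                pc += 1
            else:                                       # JMP: next 2 bits are the address
                hi, lo = program[pc + 1], program[pc + 2]
                pc = hi * 2 + lo
        return "step limit reached (probably looping)"

    print(run_1010([0, 0, 0, 0]))   # four NOPs, then halt
    print(run_1010([1, 0, 0]))      # JMP 00: jumps to itself forever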