Of course he dated again
His wife was giving him the cold shoulder.
Sure, they had to sell their stock, including the parts needed to make the machines. At that time they were also still at least assembling them.
It's different from humans in that human opinions, expertise and intelligence are rooted in their experience. Good or bad, and inconsistent as it is, it is far, far more stable than AI. If you've ever tried to work on a long-running task with generative AI, the crash in performance as the context rots is very, very noticeable, and it's intrinsic to the technology. Work with a human long enough, and you will see the faults in his reasoning, sure, but it's just as good or bad as it was at the beginning.
The Mac IIci was on the market for over 3 years before it got replaced. You never see that kind of longevity anymore.
IIci September 1989, Quadra 700 October 1991, in almost the same case. Two years, one month.
This statement was cute, even funny, the first few times that it was used. That was because it was such an absurd way of making that point.
That statement was stupid, even absurd, the first few times that it was used -- by the Reich wing. The entire reason I'm still using it when speaking to them is to rub their noses in how fucking stupid it was.
But, after this statement has been repeated so many times, it's just fucking stupid now.
You're two steps behind me as usual, but at least you're getting there.
You should consider abandoning it before people start thinking that you are stupid.
Insert Travolta looking around meme here. This is me, looking for fucks.
When in the history of stock markets has there not been "a big problem coming"?
When in the history of humanity has there not been a big problem coming? Or occurring, for that matter.
If the airlines are acting as a de facto division of the government by providing them with your personal data, then they should be treated as such. That means they should follow simple rules for protecting PII, like collecting and retaining only the minimum needed.
Steve Jobs would not release a product until it actually did what they claimed it would do.
You mean like when he claimed the iPhone would be all webapps?
Let's face it, Jobs' only superpower was being a super dick to employees. This can only take you so far.
rsilvergun has been screaming even louder about how AI as we have it now is already the end of the world, and that society isn't "ready" for it until he says it is.
Since he's living rent-free in your head, can we assume you're the one responsible for the rsilvergun-impersonating LLM spam?
Correct. This is why I don't like the term "hallucinate". AIs don't experience hallucinations, because they don't experience anything. The problem they have would more correctly be called, in psychological terms, "confabulation" -- they patch up holes in their knowledge by making up plausible-sounding facts.
I have experimented with AI assistance for certain tasks, and find that generative AI absolutely passes the Turing test for short sessions -- if anything it's too good; too fast; too well-informed. But the longer the session goes, the more the illusion of intelligence evaporates.
This is because under the hood, what AI is doing is a bunch of linear algebra. The "model" is a set of matrices, and the "context" is a set of vectors representing your session up to the current point, augmented during each prompt response by results from Internet searches. The problem is, the "context" takes up a lot of expensive, high-performance video RAM, and every user only gets so much of it. When you run out of space, the older stuff drops out of the context. This is why credibility drops the longer a session runs. You start with a nice empty context, bring in some Internet search results, run them through the model, and it all makes sense. But once you start throwing out parts of the context, what's left turns into inconsistent mush.
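To make that concrete, here's a rough sketch (Python, with a toy tokenizer and made-up names -- not any particular vendor's API) of what that truncation looks like: once the token budget is spent, the oldest turns simply fall out of the window and the model never sees them again.

from collections import deque

MAX_CONTEXT_TOKENS = 8192  # whatever slice of VRAM the deployment can afford per user

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: roughly one token per word.
    return len(text.split())

def build_context(history: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Keep only the most recent (role, text) turns that fit in the budget.

    Everything older silently drops out of the window, which is why a long
    session stops being consistent with what was said at the beginning.
    """
    kept: deque[tuple[str, str]] = deque()
    budget = MAX_CONTEXT_TOKENS
    for role, text in reversed(history):   # walk from newest to oldest
        cost = count_tokens(text)
        if cost > budget:
            break                          # older turns get thrown away here
        kept.appendleft((role, text))
        budget -= cost
    return list(kept)

Real deployments are fancier about it (summarizing old turns, pulling things back in with retrieval), but the hard ceiling on how much context fits is the same.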
Does a story like this make anybody else wonder if the lifestyle cost of wealth is too high?
The problem in this story is not the wealth, but its form. Cryptocurrency transactions are generally irreversible and not subject to the layers of process and protection that have been built up around large banking transactions. Keep your money in banks and brokerages like a sensible person and you don't have much risk.
No, it's far from the most expensive option.
Uh, yes, the 24-hour cancellation option is always the most expensive one for a given room (ignoring paying extra for add-ons like free breakfast or extra points). What other option would be more expensive? The one that gives the consumer the most flexibility is the one with the highest risk to the property, and that's priced in.
TFA postulates a scenario where the cancellations have disappeared.
Yeah, TFA overstated it. Though if you're not booking through the chain directly, in many cases it is hard to get a 24-hour cancellation policy. Many of the travel aggregator services hide them.
The AI thing absolutely is a bubble, but it's not "sand-castle based or vapor based". It's very real. The problem is that the massive wave of investment is going to have to start generating returns within the next 3-4 years or else the financial deals that underpin it all will collapse. That doesn't mean the technology will disappear; it just means that the current investors will lose their shirts, other people will scoop up their assets at fire-sale prices, and those people will figure out how to deploy it effectively and create trillions in economic value.
The problem is that the investors - and lenders - potentially losing their shirts include major international banks and pension funds, not just private shareholders. Recently, a J.P. Morgan analysis estimated that at least $650 billion in annual revenue will be required to deliver a mere 10% return on the projected AI spend. And already banks like Deutsche Bank are looking to hedge their lending exposure to AI-related projects.
If the AI bubble crashes hard, it could be a repeat of the 2007-2008 global financial crisis.
Yep. That's all true even if AI is the most transformative technology ever invented, even if it generates trillions per year in economic output -- it might not do it soon enough to prevent another crash. You don't have to believe that AI is "sand-castle based or vapor based" (which it's really not) to see a big problem coming.
Here is the thing: you are posting on Slashdot. Don't tell me you are not sharp enough to find a broker and buy some long-dated, at-the-money puts, either on AI and AI-adjacent firms or just on the market overall with funds like SPY / QQQ.
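For the sake of argument, here's a toy payoff calculation (Python; the prices and premiums are made up, not quotes or advice) showing why that's a bet on timing as much as on direction:

def put_pnl(strike: float, premium: float, spot_at_expiry: float,
            contracts: int = 1, multiplier: int = 100) -> float:
    """P&L at expiry for a bought put, ignoring fees and early exercise."""
    intrinsic = max(strike - spot_at_expiry, 0.0)
    return (intrinsic - premium) * multiplier * contracts

# Hypothetical index fund at 500, one-year at-the-money put bought for 30:
print(put_pnl(500, 30, 350))  # bubble pops before expiry, index at 350: +12000 per contract
print(put_pnl(500, 30, 550))  # market keeps climbing: -3000, the whole premium is gone

If the crash shows up a year after your expiry, you just eat the premium and have to pay up again to stay in the trade.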
The market can remain irrational longer than you can remain solvent.
The better strategy, IMO, is to keep your money safe and wait for the bubble to burst, then pile in for the recovery. Where to keep money safe is a good question, though. Just holding cash might be risky if inflation comes back, and the current administration seems anxious to pump up inflation.
A "Disable all AI crap and stop pushing this shit already" switch would be more desirable.
ASHes to ASHes, DOS to DOS.