Comment Re:Cost of scale (Score 1) 68

For small factual questions, yes, they appear to be. You can get identical (word-for-word) answers across different sessions, indicating the responses are cached. For instance, "What is the minimum wage in California?". I do wonder how often the caches are refreshed, though; the minimum-wage answer referenced Jan 1, 2026 in the responses it sent me.

However, use a more in-depth query and you get different wording between answers, so those are definitely not cached. "What is a realistic timeline for AGI", for example, gives me different answers between sessions (normal vs. incognito browser): different wording, and one claimed between 200 and 2060 while the other response had 2040-2075. Variations can also trigger different answers, e.g. "When was the first iPhone released" vs. "When did Apple release the first iPhone". So while they can cache some answers to help reduce compute, they still need to make a lot of unique calls. Again, considering the sheer volume of Google searches per day, even if not every search gets an AI summary, or gets a cached one, it's still a fuck-ton of inference time for the ones that go to the model for a response.
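The behavior described above is consistent with an exact-match cache keyed on the (lightly normalized) query string, where paraphrases produce different keys and therefore fresh model calls. Here is a toy sketch of that idea; `normalize`, `AnswerCache`, and the stand-in "model call" are all hypothetical, not anything Google has documented:

```python
def normalize(query: str) -> str:
    """Naive cache key: lowercase, drop '?', collapse whitespace."""
    return " ".join(query.lower().replace("?", "").split())

class AnswerCache:
    """Exact-match answer cache; paraphrased queries miss."""

    def __init__(self):
        self._store = {}
        self.model_calls = 0

    def get_answer(self, query: str) -> str:
        key = normalize(query)
        if key not in self._store:
            self.model_calls += 1  # cache miss -> paid inference
            self._store[key] = f"answer({key})"  # stand-in for a real model call
        return self._store[key]

cache = AnswerCache()
cache.get_answer("What is the minimum wage in California?")
cache.get_answer("what is the minimum wage in california")    # hit: same key
cache.get_answer("When did Apple release the first iPhone?")  # miss: new key
print(cache.model_calls)  # -> 2
```

Under this model, "When was the first iPhone released" and "When did Apple release the first iPhone" hash to different keys, matching the different answers observed, and only semantically-aware keying (embeddings, canonicalization) would merge them.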

Comment Cost of scale (Score 4, Insightful) 68

The AI summaries on Google searches are a prime example of the problems with trying to provide AI, for 'free', at huge scale. Compare them to the regular version of Gemini and it's obvious they are squeezing the model as hard as they can to cut inference costs. Given how many searches Google handles every day, that cost has got to be massive, even for a company like Google. The answers are so hilariously unreliable that I've stopped even looking at them. They may give me the info I need, but I'd spend more time verifying it than I would just doing a normal search.

Comment Re:We really need new floppies (Score 1) 17

The situation is that there is one company with a stockpile of now-15-year-old floppies (last manufactured in 2011), plus whatever shows up on auction sites. Someone was experimenting with making their own a few months back, but for the sake of legacy-system preservation it would be good to get real floppies again. Yes, emulators exist, but not all legacy devices can use them, and nostalgia wants the real thing. Enthusiasts kept audio cassettes in production; the same thing needs to happen with floppies (and a lot of other legacy tech, too).

Well then, break open your piggy bank and fund a new factory. That's the only way it's going to happen. There are fewer and fewer uses for floppies every year; it is, as the kids say, "not a growth industry". If it were economically viable, someone would have kept a factory running, or bought one to keep it running.

Comment Re:What a lost opportunity for Microsoft (Score 1) 19

They eventually want to move everyone from Hyper-V to Azure Local (formerly Azure Stack HCI), so I don't see them putting much effort into Hyper-V right now. The other issue with Hyper-V is that MS followed VMware and made its advanced management tooling (SCVMM) a premium feature, and most places don't pay for it, which makes Hyper-V a pain to deploy and manage.
