Comment Support is often due to lack of functionality (Score 1) 70

Makes sense - I very rarely have to call banks, telcos, etc., but when I do, it is most often because missing functionality in their portal prevents me from doing what is needed. It is not to get support as such - I fully understand my situation, and it is not as if the support person at the other end informs or helps me. They are merely the interface to the internal computer system: they change my order, check status, or perform whatever function is not exposed in the public portal. As such, the job was inflated to begin with, existing only because of privileged access to internal portals.

Comment Re:Still nonsense (Score 1) 111

Agreed - I've always been suspicious of it. Their rankings don't match my experience in the job market, and they also seem to change too frequently. Maybe the index reflects not only popularity but also where people go to search for help and how much help they need. AI use could further skew the results - maybe those Ada programmers don't use ChatGPT and keep using Google search instead, making Ada climb relative to other languages.

Comment Re: And yet... (Score 1) 42

I am no expert, but it seems you are referring to decoherence. As far as I understand, decoherence doesn't fully solve the measurement problem, even according to its discoverers. It doesn't explain how or why you end up with dead or alive. It only answers why you end up with a superposition of those two main states and not a large number of diffuse states.
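To make that a bit more concrete, here is the standard textbook picture as I understand it (my own sketch, not from the article): tracing out the environment leaves the cat in a reduced density matrix whose interference terms between the two pointer states decay away,

\rho_{\text{cat}} \approx |\alpha|^2 \, |\text{dead}\rangle\langle\text{dead}| \;+\; |\beta|^2 \, |\text{alive}\rangle\langle\text{alive}| \;+\; e^{-t/\tau_D} \big( \alpha\beta^* \, |\text{dead}\rangle\langle\text{alive}| + \text{h.c.} \big)

so decoherence explains why you are effectively left with just those two outcomes and their classical probabilities, but it says nothing about why a particular run ends up with one of them.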

Comment Re:But is it powerful enough? (Score 1) 56

Or to put it more briefly: they need to describe an AI use case where their new Copilot+ CPU is powerful enough to run the model locally, yet it would have been infeasible to run it on the normal CPU. I have not seen such a use case. And even if one is found, they still need to explain why it is not more compelling to run it in the cloud, as people currently do.

Comment But is it powerful enough? (Score 1) 56

Being able to run LLMs locally would be great for privacy - but are these AI chips powerful enough to do that? And is there enough RAM in those machines to even hold the model in memory? I was wondering this the moment they came out, especially because the machines aren't that high-spec'ed to begin with. If they can't run ChatGPT-like LLMs, then what can they run? "Filters"? But how often does a normal person do that, and on their laptop of all places? Maybe they can run small LLMs for completing/continuing sentences, but that could also have run on the main CPU. Can the AI chip do it better - maybe? But these are just guesses, and it is Intel who has a product to sell, so they need to explain it.
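To make the RAM question concrete, here is a rough back-of-the-envelope sketch (the model sizes and overhead factor are my own illustrative assumptions, not claims about any specific product): weight memory is roughly parameter count times bytes per weight, plus some overhead for the KV cache and the runtime.

```python
# Rough estimate of whether an LLM's weights even fit in laptop RAM.
# Parameter counts and the overhead factor are illustrative assumptions.
MODELS = {
    "7B model": 7e9,
    "13B model": 13e9,
    "70B model": 70e9,
}

BYTES_PER_WEIGHT = {
    "fp16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

OVERHEAD = 1.2  # ~20% extra for KV cache, activations, runtime (assumption)

for name, params in MODELS.items():
    for quant, bpw in BYTES_PER_WEIGHT.items():
        gib = params * bpw * OVERHEAD / 2**30
        print(f"{name:>9} @ {quant}: ~{gib:6.1f} GiB")
```

On those numbers a 7B model quantized to 4 bits squeezes into a 16 GB laptop alongside the OS, while anything ChatGPT-sized clearly does not - which is why the question of what these NPUs are actually for matters.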

Comment Our work laptops just got upgraded to win11 (Score 1) 91

My laptop was two years old and running Win10. I suspected they might replace the machine entirely rather than try to upgrade it remotely, but they just rolled out an update to Win11 without much warning. I was surprised it went quite fast and smoothly - just a bit longer than a normal Win10 update. I expected it might break lots of stuff because I have installed lots of software and heavily customized it (developer), but so far everything has been working.

Comment Re:From my layman perspective after 2 beers :D (Score 1) 109

I don't agree with this way of putting it: dark energy comes about because of another discrepancy - the universe's expansion is accelerating where models without dark energy predict the opposite. The reasoning then goes: "I fully believe in this model, so the fact that there is this misprediction means there must be some energy not included. I will just add that, without any further explanation, and call it dark energy." Of course, if you really believe everything else is correct, that might make sense, but it could also be that other parts of the model are broken, and then all you have done is make a mathematical hack. It would be like me declaring the existence of a black hole sucking up my money because what is in the account doesn't match my expectation - and surely my expectation must otherwise have been correct, it just didn't account for this black hole.
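For what it's worth, here is roughly what "just add that term" looks like in the standard picture (textbook form, my own sketch, not from the summary): the acceleration is accommodated by bolting a cosmological-constant term onto the Friedmann equation,

\left( \frac{\dot a}{a} \right)^2 = \frac{8 \pi G}{3} \rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}

where \Lambda is tuned to match the observed acceleration rather than derived from anything more fundamental - which is exactly the kind of move being complained about.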

Comment Re:Doesn't add up (Score 2) 17

Thought the same - even 1:8 at the global level is completely implausible. I have personally downloaded Llama - but I have an M.Sc. in comp-sci, work in comp-sci, and discuss AI models daily with colleagues, friends, etc., many of whom have similar degrees. When discussing these things I've met only a handful who have also tried running models locally (some DeepSeek, some Llama), and that is out of a population of ~100 people who were already a highly selected group.

Comment Re:AI developers (Score 1) 58

This is a very popular thing to say - but take a look at where the typical developer actually spends their time (via observation, where the hours get clocked, etc.). Sure, some time goes to meetings, coordination with the business side, etc., but if you look at where those with the title 'developer' spend their time, the vast majority is still the coding part. I'm sure some people are slacking (especially since WFH became common), and sometimes you can be shocked to hear of people spending days on very mundane coding tasks. But at least ostensibly a lot of time is spent on coding, and that is the part AI can greatly eat into.

Comment Re:Blech. QLC. (Score 4, Informative) 28

The JEDEC requirement is 1 year for an unpowered consumer drive, as you say. However, the period the drive is able to retain data decreases with the number of write cycles it has sustained. The JEDEC requirement has to be satisfied after the disk has sustained the rated number of writes, so the retention period is likely higher than 1 year for a fresh drive. Having said that, wear mechanisms are complicated and temperature plays a major role: the hotter it is when you write the data, the longer it will last. On the other hand, the hotter it is AFTER you wrote the data, the sooner it will perish.

At any rate, the 1-year guarantee depends on the drive's ability to apply error correction algorithms; the first bits are lost much sooner. There are also differences in how drives handle it when they cannot fully recover the data - do they return an error, or do they just return the best guess and increment some counters?

Another complication: it is not enough to just power on the drive for a second... the 'powered' part depends on the drive running background routines that scan for aged areas and refresh them. The algorithms for this are not documented. Since reading itself wears the data (see the "read disturb" effect), uses power (especially relevant in laptops), and takes time for a full drive scan, it is hard to know when you can be sure.

Personally, I keep the drives in my main machine so they are often powered. But in addition, I back up, completely reformat (SATA secure erase or the NVMe equivalent) each drive every 2-3 years, and copy all the data back. I still have an almost 12-year-old 840 EVO chugging along without problems. It was the first SSD to highlight the retention problem, being a TLC architecture but without V-NAND, meaning data retention is especially low... in fact, slowdowns in accessing old data can be seen in less than a year, due to the error correction algorithms working hard to interpret the data.
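For what it's worth, here is a minimal sketch of that backup / secure-erase / restore cycle as I'd script it (the device path, the SATA-vs-NVMe switch, and the throwaway password are all assumptions for illustration; it shells out to the stock hdparm and nvme-cli tools and will destroy the drive's contents, so treat it as an outline rather than something to paste in):

```python
import subprocess

DEVICE = "/dev/nvme0n1"   # hypothetical target drive - adjust for your system
IS_NVME = True            # set False for a SATA drive handled via hdparm
PASSWORD = "p"            # throwaway ATA security password for the erase step

def run(cmd):
    """Echo and run a command so the destructive steps are visible."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Back up everything first (rsync, dd image, whatever you trust).

# 2. Erase the whole drive so every cell gets rewritten fresh.
if IS_NVME:
    # nvme-cli: format with secure-erase setting 1 (user data erase).
    run(["nvme", "format", DEVICE, "-s", "1"])
else:
    # SATA: set a temporary ATA security password, then issue the security erase.
    run(["hdparm", "--user-master", "u", "--security-set-pass", PASSWORD, DEVICE])
    run(["hdparm", "--user-master", "u", "--security-erase", PASSWORD, DEVICE])

# 3. Repartition, recreate filesystems, and restore the backup.
```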
