Comment Re:NIST algorithms (Score 1) 19
You have an irrational trust in an agency that has published intentionally compromised algorithms before. Well, there are tons of fools around. You fit right in.
I guess you think Peter Gutmann has no clue as well. You are a fool.
Google moving the deadline up and saying "because our own quantum tech is progressing faster than we thought"* sounds like using one of their branches to spin another.
* Paraphrased
Attackers like that!
In other news, competent system administration for cloud accounts is _harder_ than for local installations, due to all the extra functionality, reachability, complexity, and tooling. All of that violates KISS and is the enemy of security.
Same. I have had hard freezes on Win11 with hardware that never has any trouble under Win10.
I remember my last Linux crash. It was ca. 2010, and I had told the kernel via a boot parameter that it had way more memory than was actually in the machine. Oh, you mean a crash without gross user error? Hmmm. I had a few (not a lot) with some specific defective hardware. And I have been using Linux since 1995.
True. And it does not look like they have even a snowflake's chance in hell of ever reaching profitability without some major breakthrough. And even with that, they will have collapsed long before. The numbers for the competition do not look much better, though; it is just far more obvious for OpenAI.
The whole idea of general LLMs is massively overhyped and cannot deliver on the hype. Large players (Google, Microsoft, potentially Nvidia) may survive because they have enough reserves and other revenue, but not even that is assured.
And fail. How clueless can you be? CERN does a lot more and, in particular, a lot of applied CS research, due to the massive amount of data it needs to be able to handle. Even if they partially fail at their core mission (they cannot fully fail anymore), the money invested has already been recovered countless times over.
No idea. But what we have in "post-quantum" crypto is all laughably weak against conventional attacks and laughably unverified. We have had competition finalists broken with low effort (a single laptop) and the like. Moving to these algorithms is an excessively bad idea.
Quantum hardware may never be up to the task. They cannot even factorize 35 at this time (https://eprint.iacr.org/2025/1237). The whole thing is a mirage and a bad idea that refuses to die.
Incidentally, even if they ever become able to handle tasks of meaningful size, QCs are completely unsuitable for reversing hashes, and that is what cracking passwords requires.
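A back-of-the-envelope sketch of why: the best known generic quantum attack on a hash preimage is Grover's algorithm, which only gives a quadratic speedup, and its queries must run sequentially. The query rate below (1e9/s) is a purely hypothetical assumption for illustration:

```python
# Grover's algorithm: preimage search over N = 2^128 candidates
# takes ~sqrt(N) = 2^64 sequential quantum queries.
queries = 2 ** (128 // 2)            # 2^64 Grover iterations
rate = 10 ** 9                       # assumed (hypothetical) 1e9 queries/second
seconds = queries / rate
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.0f} years")          # roughly 585 years for ONE 128-bit preimage
```

So even granting a fantasy-grade quantum machine, brute-forcing a single properly hashed password of real entropy stays out of reach; the quadratic speedup does not change the picture the way Shor's algorithm would for factoring.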
They are hallucinating hard. The current actual quantum factorization record is not even 35 (that attempt failed; overview in https://eprint.iacr.org/2025/1...).
While crypto-agility is a good idea, there is no threat from Quantum "Computing" and there may never be one.
"Apple has never offered a product that justified a large chassis. It used to be lots of slots, hard drives and other storage that justified it. Macs have never been about that"
I see you don't remember the 68k Macs OR the PPC Macs. Apple offered machines with lots of slots ever since the Macintosh II line. HTH.
Too much latency for RAM.
You mean running them on an external GPU? That does not take much bandwidth unless you are constantly loading new models.
Porsche: there simply is no substitute. -- Risky Business