The kinds of foods that trash your LDL predate the FDA and the United States.
I think that Gen-whatever is a really stupid way to refer to people.
Also, get off my lawn.
Sometimes experimentation is the only thing standing between you and certain death. None of us would be surviving cancer if not for patients being experimented on.
Outside of terminal illnesses, regulatory burdens should stay very high and not be lowered just because something shiny and new comes along.
Not all computation is algorithmic. Some things are heuristic. And how, exactly, do you use the incompleteness theorem to prove you have a complete answer to what lies outside your domain of study?
>Either way that tells me I'm not hallucinating that the two both used the same voltage.
Yes. 19V. Like every other barrel connector powered laptop.
I swear, the QC people will go one by one through the error correction codes until they have caught up with the current state of the art in error correction. They've done LDPC so they'll be onto quantum polar codes next - mark my words.
"...even though there are very few Cobol-literate coders available to maintain them."
I had a job opportunity that involved programming in RPG on an IBM platform. I had zero experience with the language or platform, but wanted the job, so I did a very deep dive into all things RPG and IBM, nailed the interview, got hired, and have been modifying and developing new programs for just short of two years. It was a wonderful change from 35+ years as an IT generalist, and I haven't looked back.
This.
Cobol is not a complicated language. It does fewer things than your fancy modern language.
It looks complicated because it's verbose.
If you can learn C, you can learn Cobol.
It's never come up that I've had to use Cobol in a job, but if it did, I don't perceive a problem learning it if that's what was needed. We covered it in language theory in my computer science degree 35 years ago, and there was so little there compared to the other languages we studied (Eiffel, Standard ML, Erlang, C, Pascal, Prolog, SISAL and the like were in vogue) that it only took half a lecture.
Keep on reading the papers.
The key thing needed to make QC work is the ability to do logic on error-corrected qubits with scalable error correction.
We are not remotely close and there's no sign of a plan to get there.
The quantum surface codes and quantum LDPC stuff that claimed to solve the problem clearly do not. You just have to read the papers and find the bit where they give the error correction capability versus the unreliability of the underlying non-error-corrected qubits. Compute the binomial error distribution for factoring 1024- and 2048-bit RSA keys. You will find just how fantastically far away we are from having a working quantum computer.
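To make that concrete, here's a back-of-envelope Python sketch of that binomial calculation (the chance of a zero-error run is the k=0 term). The ~n^3 gate-count scaling for a Shor-style run and the error rates below are my own illustrative assumptions, not figures from any particular paper:

    # Chance an n-bit RSA factoring run sees zero uncorrected errors:
    # the k=0 term of the binomial distribution over the run's gate count.
    # Gate-count scaling (~n^3) and error rates are illustrative guesses.
    def clean_run_probability(n_bits: int, logical_error_rate: float) -> float:
        gate_ops = n_bits ** 3  # rough Shor-style gate-count scaling
        return (1.0 - logical_error_rate) ** gate_ops

    for n in (1024, 2048):
        for p in (1e-6, 1e-9, 1e-12):
            print(f"{n}-bit RSA, logical error rate {p:.0e}: "
                  f"P(clean run) = {clean_run_probability(n, p):.3g}")

Even at a logical error rate of 1e-9 per operation, the odds of a clean 2048-bit run come out at a fraction of a percent; you need either vastly better logical qubits or enormous repetition.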
Too bad nobody uses it. I'm always caught between Europeans insisting on WhatsApp and applepickers insisting on FaceTime. I always tell people to call and/or text me on Signal, and they never do.
All the technical people I know use Signal.
Those of us conversant in cryptography have studied the protocol and found it to be good.
The ratchet, oblivious RAMs, good algorithm choices and much more.
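For the curious, the core of the ratchet is just a one-way KDF chain. Here's a minimal Python sketch of that idea (the HMAC constants mirror the published Double Ratchet spec, but this is an illustration, not Signal's actual code):

    import hmac, hashlib

    def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
        # Derive the next chain key and a one-time message key. HMAC is
        # one-way, so leaking today's chain key doesn't reveal yesterday's
        # message keys: that's the forward secrecy property.
        message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
        next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
        return next_chain_key, message_key

    ck = hashlib.sha256(b"placeholder shared secret").digest()
    for i in range(3):
        ck, mk = ratchet_step(ck)
        print(f"message {i}: key {mk.hex()[:16]}...")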
It reads more like they did the logical thing.
ML-KEM is the new NIST standard for transferring a key (ML = Module-Lattice, KEM = Key Encapsulation Mechanism). It's the default choice for a post-quantum KEM.
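If you haven't met a KEM before, the interface is tiny. Here's a toy KEM built from X25519 using the pyca/cryptography package, purely to show the keygen/encaps/decaps shape; real ML-KEM swaps in lattice math and much bigger objects:

    import hashlib
    from cryptography.hazmat.primitives.asymmetric.x25519 import (
        X25519PrivateKey, X25519PublicKey)

    def keygen():
        sk = X25519PrivateKey.generate()
        return sk.public_key(), sk

    def encaps(pk):
        # Sender: make an ephemeral key, derive the shared secret, and
        # ship only the ephemeral public key as the "ciphertext".
        eph = X25519PrivateKey.generate()
        shared = hashlib.sha256(eph.exchange(pk)).digest()
        return eph.public_key().public_bytes_raw(), shared

    def decaps(sk, ciphertext):
        peer = X25519PublicKey.from_public_bytes(ciphertext)
        return hashlib.sha256(sk.exchange(peer)).digest()

    pk, sk = keygen()
    ct, ss_sender = encaps(pk)
    assert decaps(sk, ct) == ss_sender  # both ends share the secret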
With the ratchet, the logical thing to do is to tack on a third cog using ML-KEM. That's what they did.
Also, you need to accommodate the huge keys and ciphertexts that ML-KEM uses. That's what they did.
It's a fine design, done well and deserving of praise - especially deploying a hybrid scheme against the best efforts of the NSA to stop that, but I don't think it counts as an amazing engineering achievement.
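The hybrid part, schematically: feed both the classical secret and the ML-KEM secret through one KDF so an attacker has to break both. A sketch in Python, with made-up labels rather than Signal's actual KDF layout:

    import hashlib

    def combine_hybrid(dh_secret: bytes, mlkem_secret: bytes,
                       context: bytes) -> bytes:
        # Both secrets go into the hash; recovering the session key
        # requires breaking X25519 AND ML-KEM, not just one of them.
        return hashlib.sha256(
            b"hybrid-kdf-v1" + context + dh_secret + mlkem_secret).digest()

    session_key = combine_hybrid(b"\x11" * 32, b"\x22" * 32, b"session-1")
    print(session_key.hex())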
Calling it SPQR is pretty funny for someone who grew up in a formerly Roman fortress town.
I've been through the Lenovo series of connectors. I worked at Intel for 21 years and they were a Thinkpad/Lenovo house. I got my mother a Lenovo on the grounds that they're a bit less prone to fall apart than other brands. Macs weren't in the running because she uses Windows-specific software. I have my yellow-tipped barrel connector adaptor secreted in a box for when the occasion to use it arises.
The (quite new) MacBook Air I'm typing on has two USB-C ports and one MagSafe 3. The real issue is that I have only a single MagSafe 3 cable while I've got lots of USB-C chargers and cables, so when the other ports are occupied, I'm schlepping off to find the one single cable. I tend to run lots of CPU-heavy jobs, so it's the higher-power brick for me.
But that ARM CPU. Ugh. I've hated the ARM instruction set since the Archimedes. It's not got better. They messed up the RNG instructions ( https://developer.arm.com/docu... ), stipulating the SP 800-90C RBG3(RS) construction, which is the wrong choice for an instruction-as-full-entropy-source (like RdSeed on Intel). The 90C RBG3(XOR) construction is the right one, since with the XOR construction the RBG3 doesn't block the RBG2. RISC-V made the same mistake in their drafts, but they listened to my arguments and fixed it. ARM wouldn't give me the time of day, and so here we are with broken specs for ARM. I wouldn't care if engineering RNG things wasn't what I did most of the time.
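For anyone who hasn't waded into SP 800-90C, schematically: RBG3(XOR) XORs DRBG output with full-entropy source output, so the DRBG-only (RBG2) path keeps flowing, while RBG3(RS) reseeds the DRBG from the source for each output and stalls everything on the source's throughput. A toy Python sketch of just the XOR data path, with OS primitives standing in for the hardware:

    import os, secrets

    def rbg3_xor_output(n: int) -> bytes:
        drbg_bits = secrets.token_bytes(n)   # stand-in for DRBG output
        entropy_bits = os.urandom(n)         # stand-in for the physical source
        # XOR of the two streams: full entropy when the source delivers,
        # and still DRBG-strength output even if the source is the bottleneck.
        return bytes(a ^ b for a, b in zip(drbg_bits, entropy_bits))

    print(rbg3_xor_output(8).hex())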
There's always the Framework running Linux when I want to retreat to my happy place. Six ports, all configurable, an x86 CPU, and das blinkenlights on the keyboard.
Great, I'll use my old MacBook charger.
Oh no! It's a MagSafe 1.
I guess I'll get the slightly newer MacBook charger.
Oh no! It's a MagSafe 2.
I guess I'll use the MagSafe 3 that came with the new laptop.
Oh no! It's not there.
I got some pushback on a comment a couple of weeks ago suggesting Trump was following China's and other dictators' playbooks; people said America is totally different and that I was mad to make the comparison.
How's that going for you?
It's a good thing I never enabled secure boot on my Framework laptop.
In the US, the push for non-hybrid schemes is all coming from the NSA.
The NIST people know this but can't say it publicly.
There was a pretty much unanimous consensus for hybrid schemes at the most recent ICMC.
I've been saying this since it became a thing, which was pretty much at the last ICMC, where NIST announced the deprecation of hybrid schemes.
What are the odds that they have a classical break of ML-KEM or ML-DSA? SIKE was a finalist and fell to a classical attack.
This is Dual-EC-DRBG all over again. It's good that DJB is raising it. People listen to him.