retains access to the AI startup's technology until 2032, including models that achieve AGI
Exactly how do they envision an autocomplete gaining sentience?
It hasn't been "autocomplete" in a long time. Sure, there's a training step based on a corpus of human language, and the autoregressive process outputs one token at a time, but reinforcement learning trains specific behaviors well beyond merely completing a sentence.
Besides, the best way to write something indistinguishable from what a human might write is to, well, "think" like a human.
The whole world has realized that they need to start air-gapping databases
I've worked at government contractors that had real air gaps for things like their databases, but that does not seem to be the norm for the rest of the world. How would ordinary businesses make use of their databases if they are not network accessible under any circumstances? Printed reports? Some sort of unidirectional transmission? And what sort of data ingress are they using?
I ask this because I have been involved in the transfer of data in highly regulated, air-gapped systems, and it is incredibly expensive. Are you really suggesting that truly air-gapped databases will be ubiquitous (or at least commonplace) in the foreseeable future?
The point is to prevent "tearing". A lot of effort has been put into solving that problem.
If the pixels were updated in some random or semi-random pattern across the screen, the tearing would probably be unnoticeable. But I suspect that would require either major architectural changes in both software and hardware, or effectively a 480Hz full-screen refresh rate, unless you did something like attaching an address to each pixel output so the display didn't have to assume sequential pixels should be drawn sequentially.
Wait, are you GPT-3? Maybe 2?
What do you mean by "compiles cleanly to predictable ASM"? Do you mean individual lines always produce the same assembly in any given context? That hasn't been true for a long time, so you can't mean that. Do you mean that if you follow the steps of the C compiler in a given OS state, you will get roughly the same output (the same for all intents and purposes, though compile-time constants like, well, COMPILE_TIME might differ)? Well, yeah, that's called running a program; you're just doing it in your brain.
Are you saying that every C compiler is deterministic in a (consequential) way that Rust compilers are not?
It's got to be something I haven't thought of. What am I missing?
I don't know Rust, but why would those structures be particularly challenging for bounds checking?
And pen testing
Software is progressing nicely, but I think the next major step will be along the lines of neuromorphic computing and the use of memristors in neuromorphic devices.
I have just come out of working with Android Automotive for the last 6 years, and was the first to deploy it in a real car (while also having worked on lots of other very large OSS platforms previously).
Upstreaming is the only way: our SoC vendor had to manage 500+ patches against Android and Linux, which was an awful lot to rebase every time upstream changed.
I see the price of being able to use OSS as being a good citizen and upstreaming the patches you have, which will also benefit you by reducing your maintenance burden and, as stated in the article, keeping fragmentation down.
That said, Android will be moving to Fuchsia soon, right? So Linux will be a dead end for Android anyway.
You carefully read it twice, and still latched on to a point that you made up.
Congratulations. You have objective proof that you're both stupid and so gullible that you can be fooled by someone as dumb as you. This will be highly valuable information in your life from now on, should you choose to accept it.
In every hierarchy the cream rises until it sours. -- Dr. Laurence J. Peter