Ridiculous (Score 2)
Charge him with perjury, and while you're at it have whichever lawyer put him on the stand brought before the bar.
Of the Chromium code base?
They literally are that simple. Chromium is a prime example of well-established cross-platform code for... what... a decade, a decade and a half now? Everything from Chromebooks and Android to Windows and Linux.
I used to cross-compile all my code, and other people's, to half a dozen platforms, including ARM, both 32- and 64-bit.
Granted, if it were code that was never intended to be ported and was doing all sorts of nonsense low-level optimisation tricks, sure. But Chromium? No. They just cross-compile and target the relevant 64-bit libraries.
"We threw it through a different cross-compiler on our compile farm, now you should all praise us"
You literally aren't understanding.
It means a foreign host can host your database, perform database actions that you ask them to, and at no point reveal anything to that host about the contents of your database.
Microsoft CAN, on your instruction, remove a row, or filter, or sort, or run whatever SQL you want, on your database, without knowing what the data is. The database goes from an encrypted database WITHOUT that action performed to an encrypted database WITH that action performed, without at any point being decrypted (even partially, in small pieces, etc. ... not AT ALL).
So you can have a host manage your database, perform all the normal actions, operate your website, use it like a normal database but... that host knows NOTHING of what's stored in your database.
The reason homomorphic encryption is so processor-intensive is that it does this WITHOUT giving out metadata. You're just asking the host to perform a series of homomorphic operations, and they perform them on the encrypted data to transform it into more encrypted data. They have no idea about anything in the source data, the result, or anything else.
Honestly... go read up on it. It's been in development for decades and takes STUPENDOUS computing power for a reason: it stops the kinds of attacks you're talking about. It's literally a form of safe computing on hostile architecture.
Homomorphic encryption is well-documented, it's just incredibly slow with conventional technology.
You can do any binary process on encrypted data using homomorphic encryption - it will modify the encrypted data in-situ without ever needing or knowing what the unencrypted data is. It literally doesn't care, and can't tell.
Think of it like running, say, AND or OR Boolean operations on specially-encrypted data. You design the scheme so that manipulating the ciphertext in a particular way has the effect of performing AND and OR on the plaintext hidden inside it.
You still don't know what the decrypted data says, but you were able to perform an AND operation on it.
Now you know that by combining many simple Boolean operations, you can basically manipulate that data however you like... WITHOUT ever decrypting it.
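The "combine simple Boolean operations into arbitrary computation" point can be illustrated in plain, unencrypted Python (the function names here are mine, just for illustration): a one-bit full adder built from nothing but XOR/AND/OR gates, chained into a ripple-carry adder. A scheme that can evaluate those gates on encrypted bits can evaluate this whole circuit blind.

```python
# Plaintext illustration (no encryption involved): arbitrary arithmetic
# reduces to Boolean gates. A homomorphic scheme that supports AND/XOR/OR
# on encrypted bits could evaluate this same circuit without ever seeing
# the bits themselves.

def full_adder(a, b, carry_in):
    """One-bit full adder built purely from XOR, AND and OR gates."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry addition of two equal-length little-endian bit lists."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 5 (binary 101) + 3 (binary 011), bits listed little-endian:
print(add_bits([1, 0, 1], [1, 1, 0]))  # [0, 0, 0, 1], i.e. 8
```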
It takes, no exaggeration, something like hundreds of millions of times more base mathematical operations to perform a simple AND in this scenario but it does so preserving the encryption without ever revealing the data.
You can literally work on encrypted data that you NEVER HAD THE KEY FOR. So you can have a customer database that you host, and you can do things on that data (e.g. compress it, retire old entries, etc.) without ever having any access to the raw data.
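For a concrete taste of the algebra (a sketch, not production crypto), the Paillier cryptosystem is additively homomorphic: multiplying two ciphertexts produces a ciphertext of the sum of the two plaintexts, and whoever does the multiplying learns nothing about either value. This toy Python implementation uses deliberately tiny primes; real deployments use moduli of 2048 bits or more, and the gate-level fully homomorphic schemes described above are far more involved.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic).
# Tiny primes for illustration only -- wildly insecure at this size.
p, q = 10007, 10009
n = p * q
n2 = n * n
g = n + 1                        # standard choice of generator
lam = math.lcm(p - 1, q - 1)     # Carmichael function of n
mu = pow(lam, -1, n)             # modular inverse of lam mod n

def encrypt(m):
    """E(m) = g^m * r^n mod n^2, with fresh randomness r each call."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# The party holding only ciphertexts and the public key multiplies them;
# the result decrypts to the SUM of the hidden plaintexts:
a, b = encrypt(17), encrypt(25)
summed = (a * b) % n2
print(decrypt(summed))  # 42
```

Note that encryption is randomised (the fresh `r`), so encrypting the same value twice yields different ciphertexts, which is part of why the host can't even spot repeated values.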
It's a literal entire area of computer science and cryptography that's only become practical in the last couple of decades (through sheer processing power alone), but it had been theorised, described and proven long before that.
Intel hasn't made anything up. Microsoft have homomorphic systems too. And IBM. Just nothing commercial, because the hardware required is STUPENDOUS, or the performance is painfully slow.
And the operations you perform on the encrypted data never involve the decryption key. The "input" is encrypted. You perform operations. And the OUTPUT is ALSO encrypted. But you were able to do the operations without ever knowing what the data actually represented.
It's going to be enormous when it becomes viable. Microsoft can host your SQL database, maintain it for you, even remove old database entries before a certain age, etc. without ever having known the original unencrypted data or your encryption keys. It's the future of things like VMs and cloud-hosting, but still decades away.
Rather than yell and bawl... go look it up. But if you want to really satisfy yourself, you'll want a grounding of at least a few years of postgrad maths and cryptography.
Look, if we don't start hyping QC, then we're just going to be stuck in this AI-slop fad for another year.
I'd far rather we were wasting money on physics than enriching companies that literally cannot make a profit when they have trillions of dollars in their bank accounts.
I just....
used git.
For what it's supposed to be used for.
"Apple's privacy standards"
Literally the only company I've ever dealt with professionally that utterly refuses to publish or provide a GDPR compliance statement.
Honestly, are people still buying this bullshit?
Yes, I'd like the "I" in "AI" to mean "Intelligence".
So where the job is brute-force pattern matching, AI is useful.
Where it needs to think, it's worthless.
Gotcha.
If only we'd known this before...
My ZX Spectrum booted in the time it took to flash the screen (less than 1 second I'd say).
If you had the Interface 2 it could boot directly into a game (via ROM chips that overrode the memory bus to present their data), so even faster.
Framework laptop.
Not long at all.
The restart/reboot is ridiculously fast.
Resume from suspend/hibernate is ridiculously fast.
The BIOS transferring to the bootloader? Seconds.
Honestly, it's like being in the year 2000 again. And my computer does what I say. Mostly because it's Linux.
As I said before elsewhere:
How are you going to detect anything but, say, a handful of well-known STLs? And then draw attention to those by banning them?
How are you then going to stop people doing the inevitable thing: Printing innocent-looking prints that can be broken down into useful parts for "banned" items?
People will literally take the latter as a challenge and build weapons, etc. that use nothing more than standard replacement parts from other devices, so that each piece can be passed off as "just an X part from an innocent Y item", but combined they make something banned.
How are they ever going to detect that? They're not.
It's going to be one of those laws they tack on AFTER the police raid your illicit gun workshop, to pile on extra charges, and it will require INTENT rather than just the action itself.
But what will actually happen is this will quietly die a death somewhere because everyone realises that it's basically unenforceable.
I'll eat it.
Plenty of people will eat it.
That's not the problem.
The problem is: Why would I pay more for something worse than just cheap meat?
It's the PRICE that needs to change. I'll eat synth-meat if it's half the price of normal meat, and doesn't result in malnutrition if I eat a lot of it, no problem at all.
It's only when you treat datacentres or AI as something special that the problems start.
It's just another app, so why does that mean they get free rein to pollute rivers, or first dibs on power provision, or the ability to override planning laws that have been in place for a hundred years? It's nonsense.
It's not AI that's causing those problems. It's people literally corrupting the law for quick profit, as always.
If there's no power / permission / water for a new hospital? Guess what? We shouldn't be authorising that for a datacentre in the same place either.
Always look over your shoulder because everyone is watching and plotting against you.