Yet the processing for that key nevertheless stretches out quite a while in computing terms. Choose the right time scales at which to analyze the acoustic signal, and perhaps something like an RSA key can be recovered, where most other types of info are beyond reach because they are processed only fleetingly.
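To make the "right time scales" point concrete, here's a toy sketch (nothing like the real attack, just an illustration) of probing one frequency in a signal with the Goertzel algorithm; the window length you feed it is exactly the time scale you're choosing to analyze:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Relative signal power at one frequency (Goertzel algorithm).
    The window length len(samples) sets the time scale being probed."""
    w = 2.0 * math.pi * target_hz / sample_rate
    coeff = 2.0 * math.cos(w)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

# Synthetic stand-in for a recording: a pure 1 kHz tone, as if a
# long-running modular exponentiation left a tonal signature there.
rate = 48000
trace = [math.sin(2 * math.pi * 1000 * n / rate) for n in range(4800)]
```

A long-lived operation (like an RSA private-key exponentiation) keeps emitting its signature for many windows, so it stands out at the 1 kHz probe while fleeting data never accumulates enough energy at any one frequency.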
You *can* (the submitter appears to be a bit confused), but there is certainly no guarantee it will work well. When you go against the Ubuntu way and start making your own decisions, it's easy to end up well outside of what is tested and supported.
From my experience, vanishingly little in the way of desktop-user vertical integration (Ubuntu's strength) gets rigorous testing on Fedora. In fact, much of that "yucky" obsessing over details and connecting code between layers is precisely the kind of thing they frown upon.
Even KDE is, IIRC, maintained only by external volunteers. Ubuntu is built around the idea that they decide and you use what they decided on. If you want to make your own choices, there are plenty of better distributions to use.
More power to them, then. The kinds of choices you're implying concern only a thin sliver of the techie demographic. Once users feel they have a stable interface to their system, they start feeling more confident about making the choices that are important to them, instead of the kind of choices that amount to coders trying to impress each other.
I thought it was "interesting" when F16 overheated two different models of MacBooks, while the "uninteresting" distros somehow weren't able to foil the operation of the fans.
It's also "interesting" that the F18 system I'm on now reported that I have only 15 minutes of operation left because battery #1 is at 10% (never mind that battery #2 is at 100%).
Even more hilarious (and "interesting") is the way Fedora handles dual monitors on a laptop (not an Apple this time... at least the 'F' people can devote some neurons to the Thinkpad line).
Perhaps a dozen more examples of interesting times on Fedora come to mind at the moment. It occurs to me the 'F' people don't compare their distro to the likes of Ubuntu because they can't bear to.
The words 'foil', 'hat' and 'tin' spring to mind in another order.
But then again, maybe I'm missing something...
The last 14 years, it seems.
Since even a firewalled design still has the habit of acting as a tracking device, I'd say the distinction was almost moot. Better to have an Android tablet (lacking baseband) plus a 'dumbphone' with a removable battery, instead of one device with an integrated battery that you can't turn off if crooks decide they want it to stay on.
I like Unity except for the Dash part... the results and presentation are too noisy. But that is easily solved with 'classicmenu'.
Canonical are actually close to going over the line WRT search privacy, but they're not over it yet, and it's easy to remove or turn off. I do think this is an important issue because it affects users' expectations of privacy; people shouldn't be led to think their PCs are just like public terminals.
It's not weird at all. They've just recognized the limitations of the prior organization and codebase, and are moving to make their own changes. Why major changes have to first be developed 'independently' (e.g. within Red Hat or another mega-corporation) and then approved by a bunch of failures ("Linux desktop" distros) is quite beyond me.
Really, I can't figure out the animosity here. If Ubuntu used to be so plain, then move to another plain distro and stop attacking Canonical with nonsense. They have the right to fork stuff, and even a moral duty to do so given the ineptitude of the venerable 'upstream'.
I don't know what's going on in Canonical's management, but if I had to take a guess, I'd say that Canonical doesn't have a clear idea on what they want to be.
Well... upstart is like launchd, Unity is like the OS X desktop, and I'd bet that Mir is patterned after Core Graphics. So I'd bet that Canonical wants to make a free platform that is just like OS X. A worthy goal, IMO.
The problem with this notion of "standard Linux" is that it exists only in the minds of a small subset of techies, and the reality of distros that are patterned after it is that after all these years, you still can't even GIVE them away to most people.
So your griping over Canonical's definition is pretty ironic, IMO. Also, the suggestion that Ubuntu used to be better than other distros because it lacked a bunch of extras is pretty moronic; it succeeded because it was better at configuring most hardware, where other distros tended to just pass on the 'canonical' packages and configurations, which ignored the kinds of vertical integration that PC users expect.
Canonical are distancing Ubuntu from the "committee of committees" approach to OS development so that the OS can outgrow its "Linux distro" taint and have a chance with the general public. The upstream-worship in the distro culture has reached ridiculous proportions, especially when you consider the ineptitude of projects like xorg, which lag OS X and Windows in commonly-used features by more than a decade.
What I want to know is, do the interactive aspects of this new cipher actually resemble two computers doing the cha-cha?
And, is poly begging for a cracker?
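Name jokes aside, the "cha-cha" half really is the ChaCha20 stream cipher. A minimal, purely illustrative Python sketch of its block function as specified in RFC 8439 (the quarter-round indices and rotation amounts are from the spec; don't use this in place of a real library):

```python
import struct

def _rotl(x, n):
    # 32-bit left rotation.
    return ((x << n) | (x >> (32 - n))) & 0xffffffff

def _qr(s, a, b, c, d):
    # One ChaCha quarter-round on the 32-bit words of state list s.
    s[a] = (s[a] + s[b]) & 0xffffffff; s[d] = _rotl(s[d] ^ s[a], 16)
    s[c] = (s[c] + s[d]) & 0xffffffff; s[b] = _rotl(s[b] ^ s[c], 12)
    s[a] = (s[a] + s[b]) & 0xffffffff; s[d] = _rotl(s[d] ^ s[a], 8)
    s[c] = (s[c] + s[d]) & 0xffffffff; s[b] = _rotl(s[b] ^ s[c], 7)

def chacha20_block(key: bytes, counter: int, nonce: bytes) -> bytes:
    """One 64-byte keystream block (32-byte key, 12-byte nonce), per RFC 8439."""
    state = [0x61707865, 0x3320646e, 0x79622d32, 0x6b206574]   # "expand 32-byte k"
    state += list(struct.unpack('<8L', key)) + [counter] + list(struct.unpack('<3L', nonce))
    w = state[:]
    for _ in range(10):                      # 10 double rounds = 20 rounds
        _qr(w, 0, 4, 8, 12); _qr(w, 1, 5, 9, 13)
        _qr(w, 2, 6, 10, 14); _qr(w, 3, 7, 11, 15)
        _qr(w, 0, 5, 10, 15); _qr(w, 1, 6, 11, 12)
        _qr(w, 2, 7, 8, 13); _qr(w, 3, 4, 9, 14)
    return struct.pack('<16L', *((a + b) & 0xffffffff for a, b in zip(w, state)))
```

The column-then-diagonal quarter-round pattern is, I suppose, the closest the cipher gets to actual dance steps.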
I do think that darknets (like I2P) start with encryption and build on that. You can choose the number of additional hops used for each application, down to 0, and the link stays encrypted.
It seems that it's the attack you describe which is very difficult; the attacker would have to masquerade internationally on a huge scale. The NSA doesn't seem able to manage this with Tor, and I2P makes the problem harder still, since everyone re-routes others' packets by default. So just collecting the metadata becomes orders of magnitude more difficult.
For that reason (making mass surveillance prohibitively expensive) Bruce Schneier has called for better-integrated anonymity tools and sees a larger role for them. The additional benefits are well worth it: A cryptographically-based network address that no ISP can censor, and which becomes the basis for a type of identity that puts disclosure entirely under the users' control.
I say integrate I2P to the point where it is assumed to be the normal network stack: under normal circumstances, don't even use apps that speak plain TCP/IP. Cover everything with encryption by default, and have every app show a level of anonymity that the user can adjust, like the volume slider for audio.
A darknet is the only proper way to do that. Otherwise, they get most of the metadata anyway: Who, When, Where. Those are important details.
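The "hops adjustable down to 0, link stays encrypted" idea amounts to layered wrapping. A toy Python illustration (SHA-256 counter-mode XOR stands in for real per-hop AEAD; this is purely illustrative, not actual I2P code):

```python
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    # Toy keystream via SHA-256 in counter mode -- NOT a vetted cipher.
    out = bytearray()
    ctr = 0
    while len(out) < length:
        out += hashlib.sha256(key + ctr.to_bytes(4, 'big')).digest()
        ctr += 1
    return bytes(out[:length])

def _xor_layer(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

def wrap(msg: bytes, hop_keys: list) -> bytes:
    """Add one encryption layer per hop, innermost layer applied first.
    An empty hop list (0 extra hops) leaves only link-layer crypto to do."""
    for k in reversed(hop_keys):
        msg = _xor_layer(msg, k)
    return msg

def peel(msg: bytes, key: bytes) -> bytes:
    """Each hop removes exactly one layer (XOR is its own inverse)."""
    return _xor_layer(msg, key)
```

Each relay sees only its own layer, which is why the who/when/where metadata stops being free for the taking: no single hop holds both endpoints.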
Just use Keepass or a text editor in a trusted AppVM, plus the secured copy+paste in Qubes OS.
I doubt any remote attacker could take your passwords then.
...because nuclear plants represent the closest thing to absolute power in our economy, and absolute power corrupts absolutely. It becomes a confidence trickster game of convincing a community to commit their ratepayers to large projects where the costs can then be jacked up 900%.
Nuclear energy "works", but only certain cultures in certain eras have been able to manage it responsibly.
Let me also point out that the French are very lucky to have such a mild environment and geology; they too blew some tops immediately after the earthquake... but the quake was thousands of miles away, and the tops were the kind that sport toupees and berets.
So the real question is whether society is mature enough to handle super concentrated power, without turning our economic and social life into a reflection of that concentrated power. In today's "privatize everything and let the god of greed sort out our problems" political and business climate, I'll answer that question with a resounding "No".
Indeed, I thought that was the whole point of MS putting Skype on the NSA PRISM program.