Any data? From a vulnerability that can read up to 64 kB of memory in the process handling the TLS heartbeat? And not even at a chooseable offset.
Of course attacking SSL at the protocol level is far more useful, since you can just silently sit there and eat all the "secret" data, instead of having to actively MITM particular connections.
But do you really think there is a single US CA out there that would say no to a national security letter requiring them to issue a torproject.org certificate if they actually needed it? Especially given how Joseph Nacchio was treated for resisting voluntary assistance to the NSA? Or that the Chinese ones wouldn't issue whatever was asked if the Ministry of Public Security turned up and wanted some certificates?
Stuxnet actually proves another part of why the CA system is utterly broken. Because they just had to break in *somewhere* in order to get a key signed by *any* CA in order to sign their stuff. To impersonate Tor developers, they'd have to steal the Tor developers' keys, or make up new ones that look plausible enough. Unlike the X.509 CA system, where an attacker might just as well steal the keys of any random project and they'd be just as acceptable, since they are signed by a CA.
But you're right that it isn't a CA-level compromise, unlike DigiNotar, which shows that particular line of attack. And that was only found out through widespread interception of Iranian connections to Gmail.
The CA model for X.509 certificates has been shown to be utterly broken for protection against intelligence agencies: they clearly have both access to some of the private keys of "trusted" CAs and the leverage to have "trusted" CAs in their home jurisdiction issue arbitrary certificates. There is no way this gets better by switching from PGP to X.509.
We already have plenty of malware with valid signatures backed by trusted CAs, using stolen keys etc. See Stuxnet/Duqu, for instance.
Now, I know it can be hard to bootstrap a PGP web of trust, and there is certainly plenty of work to be done there to make it easier and more user-friendly. But chucking out the one piece of actually working low-level technology for real security in favour of one that is utterly broken, and has been shown to be broken for years, is just plain stupid.
DNSSEC. And hoping for client verification at some point in the unknown future. Good luck!
Maybe. On the other hand, 10-year terms mean no movie company ever has to pay the author of a book for making a movie out of it, or adhere to the author's wishes. Just wait the years out.
The main benefit is that it runs faster. 64-bit pointers take up twice the space in caches, and the L1 cache in particular is very space-limited. Loading and storing them also takes twice the bandwidth to main memory.
So for code with lots of complex data types (as opposed to big arrays of floating-point data) that still has to run fast, it makes sense. I imagine the Linux kernel developers' No. 1 benchmark of compiling the kernel would run noticeably faster with GCC built for x32.
The downside is that you need a proper, fully functional multi-arch system, like the one slowly being adopted by Debian, in order to handle multiple ABIs. And then you get into iffy questions about whether you want the faster
Generalized way? Not likely. But in this particular setting (electronic scrap), there is plenty of activity. I know of these because they make the local news: http://www.boliden.com/Operations/Smelters/E-scrap-project/ - but there are several competitors to them too. Lots of copper and gold and other metals in electronics are commercially recyclable, provided someone sorts it out and puts the electronics in electronics-only containers.
We do, and we much prefer HTTP over FTP since we do clever caching and redirects for HTTP. See: http://ftp.acc.umu.se/about/index.html
We are talking to the GIMP folks to readjust their links.
And *nix in the form of, say, Oracle Solaris or IBM AIX is more restrictive than the GPL. Linux is just one branch of the Unix family.
They start to care when their data 'goes away' for 3 days.
But that's very unlikely to happen in the next quarter. Probably not even within the next 3-5 years, by which time they'll be somewhere else and not give a shit.
So are ISBNs, in many parts of the world. I guess the US has left it to the free market to decide how much they should cost.
The turbo mode stuff, together with the kernel and firmware, all comes from the same raspberrypi.org repository. Raspbian is really the Debian-ish environment around this.
If you want to run Debian, you can do that too (at a performance penalty, since you need to use the soft-float version; armhf targets a newer ARM architecture version than the one in the Raspberry Pis). You still need the same non-free blobs to do anything graphical etc., though.
Yes, it is called Raspbian, which is Debian with a recompile for the target and some installer tweaks and hooks for pulling in the necessary non-free stuff from raspberrypi.org, a consequence of the Pi being a closed platform.
Xbian, RaspBMC, etc. take Raspbian and then make a custom install based on package presets and some scripts for automagic setup, for those who think Debian is "too complicated". And apparently lots of drama.
You mean 2010? That's when Andre Geim got the Nobel prize in physics (for graphene), having previously gotten the Ig Nobel for levitating frogs.
It's what is in the official torrent. My torrent client says 3146.1 MB.