Comment Re:Ummm (Score 1) 347

Bloody good question. They are called virtual particles, though. If forced to answer, I would suspect that an observer traveling fast enough to blueshift the light that far, a factor of 50,000 in wavelength (taking a 500 nm green photon down to a 10 pm gamma ray) and the same factor of 50,000 in energy (putting it around 100 keV), would need a good portion of c, and that the apparent distance covered would shrink enough that there's no longer a high chance of a virtual-particle interaction.
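If you want to sanity-check the "good portion of c" part, the relativistic Doppler formula makes it a three-line calculation (my numbers, not the article's):

    # Relativistic Doppler for a head-on approach:
    #   f_obs / f_src = sqrt((1 + beta) / (1 - beta))
    # Solving for beta given a blueshift factor D:
    #   beta = (D**2 - 1) / (D**2 + 1)
    D = 500e-9 / 10e-12          # 500 nm green down to 10 pm gamma: D = 50,000
    beta = (D**2 - 1) / (D**2 + 1)

    print(f"blueshift factor D = {D:.0f}")
    print(f"required v/c       = {beta:.12f}")    # ~0.999999999200
    print(f"shortfall from c   = {1 - beta:.3e}") # ~8e-10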

But that's just me making stuff up and pulling a WAG.

Comment Re:Ummm (Score 1) 347

No, we aren't talking about visible photons. The emissions from the supernova were neutrinos and a gamma ray burst; the visible light still travels separately, because it interacts with things in space that are transparent to gamma energies and above. But yes, over the very large distance between us and the supernova it was not just a few photons that traveled at less than c for some time; the chance rose high enough that it was nearly all of them.

All EM radiation travels at the speed of light. High energy photons can, briefly, become virtual particle pairs that do not travel at the speed of light. The article author noted that the chance, over the time and distance between us and this specific supernova, was high enough to account for all of the gamma ray and higher energy photons traveling as particle pairs for some part of their trip, and that time would account for the known delay. This only applies to gamma rays above (I think) 511 keV, the energy of one of the gamma rays emitted in an electron-positron annihilation; it might need to be 1022 keV for a single photon to form both particles (ask a particle physicist, not a programmer like me). According to Alpha, a 500 nm green photon carries only around 2.5 eV. Violet light gets up to 3 eV and a little higher; still not enough to create any particle. E=mc^2, so you need a good deal of energy just to create even a very tiny electron.
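The arithmetic is easy enough to check yourself (a quick sketch with standard constants; strictly speaking a lone photon also needs a nearby nucleus or field to conserve momentum, but the energy bookkeeping is the part to verify):

    # Photon energy E = h*c / lambda, in eV, using h*c ~= 1239.84 eV*nm.
    HC_EV_NM = 1239.84

    def photon_energy_ev(wavelength_nm):
        """Energy in eV of a photon with the given wavelength in nm."""
        return HC_EV_NM / wavelength_nm

    ELECTRON_REST_KEV = 511.0                   # m_e * c^2
    pair_threshold_kev = 2 * ELECTRON_REST_KEV  # need both an electron and a positron

    print(f"500 nm green photon:  {photon_energy_ev(500):.2f} eV")  # ~2.48 eV
    print(f"400 nm violet photon: {photon_energy_ev(400):.2f} eV")  # ~3.10 eV
    print(f"pair-production threshold: {pair_threshold_kev:.0f} keV")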

Comment Re:Don't mess with "c" (Score 1) 347

It isn't a fixed length of time or distance (the same thing at the speed of light in a vacuum, excepting spatial expansion). It's a statistical chance: each high energy photon has a chance, at each and every point along the way, to split into an electron-positron pair (annihilation of such a pair creates gamma or higher photons, so it should only be those photons that split), and the pair then travels for some time, affected by gravity and all the other forces, before recombining into a photon.
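If it helps, here's a toy model of what "a chance at each and every point" does to the arrival time. The per-step probability and pair speed are completely made up; this just shows how a tiny chance compounds into a real delay over a long enough trip:

    import random

    # Toy model: at each step a gamma photon has a small chance of spending
    # that step as a slower-than-c virtual pair. The numbers are invented,
    # purely to show per-step chance turning into accumulated delay.
    STEPS = 1_000_000   # arbitrary slices of the journey
    P_PAIR = 1e-4       # made-up per-step chance of traveling as a pair
    PAIR_SPEED = 0.5    # made-up pair speed, as a fraction of c

    def trip_time():
        """Travel time in units where a pure-photon trip takes STEPS."""
        t = 0.0
        for _ in range(STEPS):
            if random.random() < P_PAIR:
                t += 1.0 / PAIR_SPEED  # this slice covered at the pair's speed
            else:
                t += 1.0               # this slice covered at c
        return t

    # Expected delay = STEPS * P_PAIR * (1/PAIR_SPEED - 1) = 100 units here.
    delay = trip_time() - STEPS
    print(f"accumulated delay: {delay:.0f} time units over {STEPS} slices")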

That's my complex way of saying "eh, I dunno, I got far enough in physics but that's above my head to figure out." If you want to understand the Schrödinger equation, or can find a Feynman diagram that gives the chance over time, good luck. I tried googling phrases I thought would get me an abstract or brief but came up empty.

Comment Re:Missing Option: (Score 1) 139

As a 30-something who read /. in the early days but had no experience to comment from: I had Linux 3.5" floppies from bootleg books I got in NYC, the nearest LUG was over an hour away, and I could read code but hadn't written any yet. Why comment when I knew how out of touch I was? I eventually registered this UID because my main net name was taken (maybe I did register it from one of those long lost "free POP3 accounts") and used a new persona I had just created for anonymous purposes.

Comment Re:Seriously? (Score 3, Informative) 466

It takes special skills to program? Maybe if you are doing some rather complex operations, but by the same token I wouldn't want to re-gear the transmission or rebuild the engine of a car even though I'm perfectly capable of customizing other aspects of a vehicle. Programming is the same way: someone can be capable of doing what they want to do (run a website and manage the database, or script their everyday crap into a few lines of code) without being 'an uber hax0r' who understands OS theory at the assembly level and can deal with the full range of network security threats.

Mythologizing programming is what leads to the nephew who knows a little HTML being made head of IT; after all, that little HTML takes all that programming knowledge!

And since your opinion of other programmers is so low:

Even most programmers who program for a living suck at their jobs, and I don't expect someone who's not serious about it to be any better.

might I suggest that the Dunning-Kruger effect is on full display, and, on behalf of all coders, hackers, code monkeys, keyboard jockeys, and everyone who's ever touched a computer, may I ask, beg, and plead that you please never write another line of code again.

Comment Re:No point encrypting if you're the only one... (Score 1) 108

Did you read their instructions? My parents use Thunderbird for email, because it's what I recommended for them. I decided to test on my clean box (browser only, for the most part) to see how fast I could set up encrypted email with Enigmail and GPG, generate and upload a 4096-bit key, and send out a signed email. Less than 10 minutes, and most of that was waiting for the download because I've got torrents running elsewhere. With TBird installed, it was a few seconds to install GnuPG, a second for Enigmail, and less than a minute to generate a key. The instructions walked through how to upload keys to any of various keyservers, and I sent off an email pointing my parents at the same infographic and instructions so they can, and should, do the same thing.
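And if you'd rather script it than click through the wizard, the same steps are a few lines with the python-gnupg wrapper (the email, passphrase, and keyserver below are placeholders; check the module docs before trusting my memory of the API):

    import gnupg

    gpg = gnupg.GPG()  # uses your default ~/.gnupg

    # Generate a 4096-bit RSA key; name/email/passphrase are placeholders.
    params = gpg.gen_key_input(
        key_type="RSA",
        key_length=4096,
        name_email="you@example.com",
        passphrase="correct horse battery staple",
    )
    key = gpg.gen_key(params)

    # Publish the public key so people can find it (example keyserver).
    gpg.send_keys("keyserver.ubuntu.com", key.fingerprint)

    # Sign a message to prove the setup works end to end.
    signed = gpg.sign("Hello, parents!", keyid=key.fingerprint,
                      passphrase="correct horse battery staple")
    print(str(signed)[:80])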

Sure, it used to be a command line tool with no GUI that only *nix geeks could use; no longer. The plugin hooks into TBird, or your email program of choice's plugin system, and puts a GUI just a few clicks away. Sure, it won't work on webmail systems yet; if that's what you rely on, then you have some valid complaints against the email provider, not against encryption.

Comment Re:How about the build tools and the OS? (Score 1) 131

Why not? Assume, for discussion, a malicious compiler. It looks for common code used in encryption and changes parts of it (see Reflections on Trusting Trust). Identifying the keys should not be that hard with known algorithms, so go for that. Then just replace every key with 0xDEADBEEF or another known pattern of bits. Voilà: encrypted data that can be opened only by code compiled with the corrupt compiler, or by the attacker who knows what bit pattern was used.
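As a crude illustration of the key-swapping idea (a toy source-to-source pass, nowhere near what a real compiler-resident attack looks like):

    import re

    # Toy "malicious pass": scan C-ish source for byte-array initializers
    # that look like key material and swap every byte for a fixed pattern.
    # A real Trusting Trust attack lives inside the compiler binary and
    # targets recognizable codegen, but the effect is the same:
    # predictable keys.
    KEY_ARRAY = re.compile(
        r"(unsigned\s+char\s+\w*key\w*\s*\[\s*\d*\s*\]\s*=\s*\{)([^}]*)(\})"
    )

    def poison(source):
        def replace(match):
            n_bytes = len(match.group(2).split(","))
            pattern = (["0xDE", "0xAD", "0xBE", "0xEF"] * n_bytes)[:n_bytes]
            return match.group(1) + ", ".join(pattern) + match.group(3)
        return KEY_ARRAY.sub(replace, source)

    victim = ("unsigned char session_key[8] = "
              "{0x13, 0x37, 0xC0, 0xFF, 0xEE, 0x42, 0x99, 0x01};")
    print(poison(victim))
    # -> unsigned char session_key[8] = {0xDE, 0xAD, 0xBE, 0xEF, ...};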

This would also be why it mattered to verify that TrueCrypt 7.1 could be compiled with a known compiler and certain settings to produce a binary with the same fingerprint as the one distributed. The distributed binary could have been corrupted intentionally or by a malicious compiler.
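The check itself is trivial once you have a reproducible build; hash what you built and hash what they shipped (paths are placeholders, and getting bit-identical binaries means pinning the exact compiler version and flags, which is the hard part):

    import hashlib

    def sha256_of(path):
        """SHA-256 hex digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest()

    built = sha256_of("truecrypt-7.1a-built-here.exe")      # placeholder path
    shipped = sha256_of("truecrypt-7.1a-as-downloaded.exe") # placeholder path
    print("match" if built == shipped else "MISMATCH: investigate the toolchain")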

Comment Re:How about the build tools and the OS? (Score 1) 131

And for compiling something like a basic C compiler, one could feasibly write their own in ASM, starting from something like CC500 (a 600-ish line C compiler). Use that custom compiler to build something like PintOS (a full code review is possible for one person; I had to do one in collegiate OS courses) on a micro that is running nothing but your compiler, loaded over an RS232 port that you are monitoring with a logic analyzer (to watch for stray data from the 'host' computer at this point). That gets you up to an OS and compiler on your chip and board of choice, though you may need a bootloader. From there, you could compile the rest of a known toolchain, like GCC and all its accompanying tools, if you've reviewed them satisfactorily.
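This is more or less David A. Wheeler's "diverse double-compiling" defense. A rough sketch of the check, with hypothetical compiler names standing in for the real toolchains:

    import hashlib
    import subprocess

    def build(compiler, output, source="cc.c"):
        """Invoke a compiler (hypothetical names/flags) and hash its output."""
        subprocess.run([compiler, "-o", output, source], check=True)
        with open(output, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # sA = source of the suspect compiler A; T = an independent, trusted compiler.
    build("trusted-cc", "stage1")            # stage1 = T(sA)
    stage2 = build("./stage1", "stage2")     # stage2 = stage1(sA)
    regen = build("./suspect-cc", "regen")   # regen  = A(sA)

    # If A faithfully implements sA (and builds are deterministic), stage2
    # and regen come from functionally equivalent compilers and must match.
    print("A looks clean" if stage2 == regen else "A is suspect")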

As for trusting your hardware: good luck, you'll need it. Even if you can get a copy of the lithos used in producing your chip, you will have just a statistical estimate of the chance of a spy in your silicon, since you can't decap a chip and dissolve its layers to make sure and still use it afterward. Perhaps with the lithos in hand you could get custom chips made, but that's not going to be any 'big iron' like an x86-64. So you've shifted the needed trust down to just the silicon (and microcode, if needed), which is comparatively harder for an individual to make on their own. I suppose you could mimic the CPU on an FPGA or CPLD, but then you are back to "trusting trust" that the synthesis tools didn't recognize something and stuff it into the bitstream.

It still amuses me how drastically the shift from analog devices to digital moved where the specialization is required. A 555 could be built from a handful of discrete components (resistors were just long lengths of wire, capacitors were just two plates with a gap, and diodes were cat's whiskers; transistors were the exception), yet designing analog circuits was considered a black-magic art. Now, with IDEs and reference books, most people can write some code if they sit down and follow a book (like building a crystal radio from a kit back in the "before my time" days), but building the hardware at the most basic level (logic gates on silicon) is magic beyond all but a few.

Comment Re:Pointless (Score 1) 131

So what if there is? Assuming that your organization did audit 7.1 and found no problem, what makes it a risk now? Sure, you wouldn't want to migrate to 7.2 in a year's time, and any fork from 7.1 would require a new audit; but I would hope that, having put that much effort in, you would audit 7.2 or any further fork internally as well, leaving you with either a 'this is clean' or a 'this is fishy' answer.

I don't doubt that many large organizations are looking at migration paths, since the public audit of 7.1 won't be done for a while and the security of even the old version has been thrown into question (a cursory audit, even by crypto pros, can miss things), so the lack of trust seems obvious. I just don't understand the sudden increase in distrust compared to "hey, this code by two guys we don't know provides some pretty heavy encryption that takes a Ph.D. in maths to understand and check." I do understand a large corporation's need to plan future migration; knowing what you'll be using next year or in five years is important, and the audit of 7.1 might not be finished, or may have turned up flaws, by then. It's the short-term change in trust that I don't get.

Comment Re:Open Source it (Score 1) 131

If it is an NSA/NSL canary, then the devs are restricted in what they can say about why they are abandoning the project. The logical choice, and the easiest lie to remember, is "we are just tired of developing it."

Which, unfortunately, is also the exact thing they would say if they really were just giving up on the project. So the only real clues are the content of the current web page and the changes made to the new 7.2 TrueCrypt. That they suggest using BitLocker without a TPM chip (I never thought I'd be the one suggesting a pre-made TPM chip; honest), and that the solution involves upgrading to the pro version of Windows... it doesn't pass the smell test. Serious crypto guys wouldn't suggest those tools when drunk, much less just because they are quitting.

As for "we don't know who the people who 'verified' the canary are" . . . that's another part of those nasty NSLs. If the people who knew the canary were close enough to the project, they would be subject to the NSL terms and silenced. It makes sense that a good canary is one that only one or two people un-connected to the project know about. If, for example, the devs put a big dead yellow bird on their webpage, it would clue us all in, but it would also violate most of the "shut up or else" clauses of a NSL. So, the devs may have prearranged a few phrases, told one of X to Y different people who knew each other but had little to connect with the devs, and then hoped they could get some Z phrases (Z

That assumes the NSA and US NSLs. It could be the same thing from another government, or the threat of having their families killed by mobsters. The cause doesn't matter as much as the result, which is that 7.2 looks very fishy and we all avoid it.

Comment Re: um (Score 1) 154

Wasn't there already a hole poked in the BICEP findings, like a day after publication? Something about not accounting for the possibility that their signal was evidence of post-expansion gravitational polarization rather than pre-expansion... or something like that. I recall the consensus was still "this is a super cool observation and probably right, but the Nobel hangs on that tiny detail."

Comment Re:Old School Amateur Radio Nut and Electronics Te (Score 1) 737

A butane torch (or methane/methanol from brewing), or a small sealed container in a wood oven at about 200C for a short time, would heat the solder to its melting point (eutectic tin/lead melts around 183C; 200F would be nowhere near enough). Sure, 200C is a ways away from the fire of a hot oven, but it's achievable. To re-solder the pieces: rosin from pine, plus tin/lead/silver from metal work (or saved from desoldering), and the same hot-oven box, or a torch and a heat sink like a soldering iron tip or screwdriver. Heat tip, touch pad, repeat. BGA parts would be a beast, but who's going to need many of them?
