Comment Re:That's not what I took away from this... (Score 1) 347

Franson's idea, as I understand it, is that during the small window between creation and annihilation, the massive particles are under the influence of gravity, which bleeds off energy. When the pair recombines, it results in a reduced velocity of the photon.

I read it as just barely changing the light's direction, not its speed. All photons travel at c, but gravity could make the path of travel curve more than previously thought.

Comment Re:Light odyssey (Score 1) 347

Neutrinos have virtual particle interactions as well. Only low-energy photons seem not to (at least in the diagrams I remember; maybe they do too).
So ALL THINGS are made by the devil except for infrared light and AM radio. Seems to explain why looking at things makes you question stuff (visible light is a lie!) and only AM radio tells the truth.

Comment Re:Is there a 'less nerdy version'? (Score 1) 347

I'd correct the photon "zig-zagging".
The guy is saying that we know photons can very quickly turn into an electron and its friend, a positron. They almost instantly turn back, but since the electron and its friend are bigger and heavier than a photon of light, they are affected by gravity more.
So, if the author is right, the light we saw took a different path to get to us. Just a little bit different, enough to add an hour over the course of 163,000 years.

Comment Re:What gets corrected? (Score 1) 347

Yup, it only affects a small percentage of photons for a very brief time. Schrödinger's equation and the rest of QED let you work out how many photons in a given burst will do this over X amount of time. For most of our observations, in laser labs and over other 'short' distances, the effect shouldn't even be noticeable. But it might change astronomical measurements by a good bit. (Well, 1.7 hours over 168,000 years, about 1x10^-9 more or less, since light-years traveled and years traveled aren't identical at that distance due to expansion effects.)
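
As a sanity check on that ratio, here's the back-of-the-envelope arithmetic (just the numbers quoted above, nothing from the paper itself):

```python
# Rough check of the fraction quoted above: 1.7 hours of delay over a
# 168,000-year trip.
delay_hours = 1.7
trip_years = 168_000

delay_years = delay_hours / (24 * 365.25)
fraction = delay_years / trip_years

print(f"fraction of the trip affected: {fraction:.2e}")  # ~1.2e-9, i.e. about 1x10^-9
```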

And if it affects photons, it will affect other particles as well. Maybe it explains the two neutrino bursts, if one burst traveled in a straight line and the other had a virtual particle interaction.

Comment Re:Ummm (Score 1) 347

That's what the article seems to suggest, yes. And that the virtual particle pair, if it exists for any real amount of time, would move at less than c for its short lifespan. But the major change from Earth's perspective is that the gamma rays we saw did not travel in the straight line that the neutrinos did.

That might also explain the second neutrino burst (maybe, wild guess from a programmer). If some of the neutrinos went through a virtual particle state (Z boson, I think?) then they would also arrive at a different time. That would account for neutrinos that made the trip with no virtual particles, those that slowed down due to the mass of Z boson interactions, and, according to the research summarized in the article, all the gamma rays that went through a virtual particle phase and dealt with gravity. If it all works that way, it would be beautiful science explaining more things we thought we understood. Ahh, science!

Comment Re:Ummm (Score 1) 347

Bloody good question. They are called virtual particles, though. If forced to answer, I would suspect that blue-shifting the light that far (50,000 times shorter wavelength, a 500nm green down to a 10 picometer gamma, so roughly 50,000x to 100,000x the energy, up into the keV range) would require the observer to travel at a good portion of c, and that would also reduce the apparent distance covered to the point where the chance of a virtual particle interaction is no longer high enough.
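
For what it's worth, the wavelength and energy numbers I'm waving at work out roughly like this (standard E = hc/λ, nothing exotic):

```python
# Photon energy scales inversely with wavelength: E = hc / lambda.
HC_EV_NM = 1239.84        # h*c in eV*nm (standard value)

green_nm = 500.0          # visible green photon
gamma_nm = 0.010          # 10 picometers, written in nm

ratio = green_nm / gamma_nm      # ~50,000x shorter wavelength
e_green = HC_EV_NM / green_nm    # ~2.5 eV
e_gamma = HC_EV_NM / gamma_nm    # ~124,000 eV, i.e. well up into the keV range

print(f"wavelength ratio: {ratio:,.0f}x")
print(f"green photon:     {e_green:.2f} eV")
print(f"10 pm photon:     {e_gamma / 1000:.0f} keV")
```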

But that's just me making stuff up and pulling a WAG.

Comment Re:Ummm (Score 1) 347

No, we aren't talking about visible photons. The emissions from the supernova were neutrinos and a gamma ray burst; the visible light still travels separately, because it interacts with things in space that are transparent to gamma energies and above. But, yes, over the very large distance between us and the supernova it was not just a few photons that traveled at less than c for some time; the chance rose high enough that it was nearly all of the photons.

All EM radiation travels at the speed of light. High energy photons can, briefly, become virtual particle pairs that do not travel at the speed of light. The article author noted that the chance, over the time and distance between us and this specific supernova, was high enough to account for all of the gamma ray and higher energy photons traveling as particle pairs for some part of their trip, and that time would account for the known time delay. This only applies to gamma rays above (I think) 511 keV, the energy of one of the gamma rays emitted in an electron-positron annihilation (it might need to be 1022 keV for a single photon to form both particles; ask a particle physicist, not a programmer like me). According to Alpha, a 500nm green photon has only around 2 eV. Violet light gets up to 3 eV and a little higher; still not enough to create any particle. E=mc^2, so you need a good deal of energy just to create a very tiny electron.
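
A quick check of those energies (plain E = hc/λ; the 511 keV figure is just the electron rest mass):

```python
# Photon energy from wavelength: E = hc / lambda, with hc ~ 1239.84 eV*nm.
HC_EV_NM = 1239.84
ELECTRON_REST_KEV = 511.0   # rest-mass energy of one electron

print(f"500 nm green photon:  {HC_EV_NM / 500:.2f} eV")   # ~2.5 eV
print(f"400 nm violet photon: {HC_EV_NM / 400:.2f} eV")   # ~3.1 eV

# Wavelength a photon would need to carry one / two electron rest masses:
for kev in (ELECTRON_REST_KEV, 2 * ELECTRON_REST_KEV):
    wavelength_nm = HC_EV_NM / (kev * 1000)
    print(f"{kev:.0f} keV photon: {wavelength_nm * 1000:.2f} pm wavelength")
```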

Comment Re:Don't mess with "c" (Score 1) 347

It isn't a fixed length of time or distance (the same thing at the speed of light in a vacuum, excepting spatial expansion). It's a statistical chance: each high energy photon has a chance at each and every point in time to split into an electron-positron pair (annihilation of the pair creates gamma rays and higher-energy photons, so it should only be those photons that split), and then the pair will travel for some time, being affected by gravity and all the other forces, before recombining into a photon.

That's my complex way of saying "eh, I dunno, I got far enough in physics but that's above my head to figure out." If you want to understand the Schrödinger equation, or can find a Feynman diagram that lists the chance over time, good luck. I tried googling phrases I thought would get me an abstract or brief but came up empty.
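
If you want to see how a per-photon statistical chance could still look like one fixed delay for the whole burst, here's a toy Monte Carlo of the idea; the split probability and slowdown are made-up illustrative numbers, not anything from the paper:

```python
import random

# Toy model: a photon covers the trip in many small steps; at each step it has
# a tiny chance of spending that step as a slower electron-positron pair.
STEPS = 1_000_000            # arbitrary discretization of the trip
P_SPLIT = 1e-3               # made-up per-step chance of being a virtual pair
PAIR_SPEED = 0.999           # made-up pair speed, as a fraction of c

def extra_time(steps=STEPS):
    """Total delay (in step-times) relative to a photon that never splits."""
    delay = 0.0
    for _ in range(steps):
        if random.random() < P_SPLIT:
            delay += 1.0 / PAIR_SPEED - 1.0  # covering the step at v < c takes longer
    return delay

print([round(extra_time(), 3) for _ in range(5)])
# Over enough steps each photon's delay clusters tightly around
# STEPS * P_SPLIT * (1/PAIR_SPEED - 1), which is why a purely statistical
# effect can show up as what looks like a single fixed time offset.
```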

Comment Re:Missing Option: (Score 1) 139

As a 30-something who read /. in the early days but had no experience to comment from: I had Linux 3.5" floppies from bootleg books I got in NYC, the nearest LUG was over an hour away, and I could read code but hadn't written any yet. Why comment when I knew how out of touch I was? I eventually registered this UID because my main net name was taken (maybe I did register it from one of those long-lost 'free POP3 accounts') and used a new persona I had just created for anonymity's sake.

Comment Re:Seriously? (Score 3, Informative) 466

It takes special skills to program? Maybe if you are doing some rather complex operations, but in the same regard I wouldn't want to re-gear the transmission or rebuild the engine of a car, even though I'm perfectly capable of customizing other aspects of a vehicle. Programming is the same way: someone can be capable of doing something they want to do (run a website and manage the database, or script their everyday crap into a few lines of code) without being 'an uber hax0r' who understands OS theory at the assembly level and can deal with the full range of network security threats.

Mythologizing programming is what leads to the nephew who knows a little HTML being assigned as the head of IT; after all, that little HTML takes all that programming knowledge!

And since your opinion of other programmers is so low:

Even most programmers who program for a living suck at their jobs, and I don't expect someone who's not serious about it to be any better.

might I suggest that the D-K effect is on full display and, on behalf of all coders, hackers, code monkeys, keyboard jockeys, and everyone who's ever touched a computer, may I ask, beg, and plead that you please never write another line of code again.

Comment Re:No point encrypting if you're the only one... (Score 1) 108

Did you read their instructions? My parents use Thunderbird for email, because it's what I recommended for them. I decided to test on my clean box (browser only, for the most part) and see how fast I could get my email set up to encrypt with Enigmail and GPG, generate and upload a 4k key, and send out a signed email. Less than 10 minutes, and most of that was waiting for the download because I've got torrents running elsewhere. With TBird installed, it was a few seconds to install GnuPG, a second for Enigmail, and less than a minute for me to get a key. The instructions walked through how to upload keys to any of various key servers, and I sent off an email pointing my parents to the same infographic and instructions so they can (and should) do the same thing.

Sure, it used to be all command-line tools with no GUI, usable only by *nix geeks; not any longer. The plugin is built into TBird (or your email program of choice's plugin system) and has a GUI that is just a few clicks away. Sure, it won't work on web-based email systems yet; if that's what you rely on, then you have some valid complaints against the email provider and not against encryption.
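
For anyone who does still want the command line, key generation and keyserver upload come down to a couple of GnuPG calls; a rough sketch, wrapped in Python only to keep the examples here in one language (the keyserver and key ID are placeholders):

```python
import subprocess

# Interactive key generation: gpg walks you through key type, size (pick 4096),
# expiry, name/email, and a passphrase.
subprocess.run(["gpg", "--gen-key"], check=True)

# Publish the public key so other people can find it.
subprocess.run(
    ["gpg", "--keyserver", "hkps://keys.openpgp.org", "--send-keys", "YOUR_KEY_ID"],
    check=True,
)
```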

Comment Re:How about the build tools and the OS? (Score 1) 131

Why not? Assume, for discussion, a malicious compiler. It looks for common code used in encryption and changes parts of the code (see Reflections on Trusting Trust). Identifying the keys should not be that hard with known algorithms, so go for that. Then just replace all keys with 0xDEADBEEF or another known pattern of bits. Voilà: encrypted data that can be opened only with code compiled via the corrupt compiler, or by the attacker who knows what bit pattern was used.
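
As a caricature of that kind of attack (this is just the shape of the idea, nowhere near how a real trojaned compiler pass would work):

```python
import re

# Toy 'compiler pass': before actually compiling, rewrite any hard-coded key
# material in the source to a constant the attacker already knows.
KNOWN_PATTERN = "0xDEADBEEF"

def poison_source(source: str) -> str:
    # Find assignments to identifiers that look like key material and replace
    # whatever constant they were given with the attacker's known pattern.
    return re.sub(
        r"(\b\w*key\w*\s*=\s*)0x[0-9A-Fa-f]+",
        lambda m: m.group(1) + KNOWN_PATTERN,
        source,
    )

print(poison_source("session_key = 0x8F3A9C21\nnonce = 0x00000007\n"))
# session_key = 0xDEADBEEF   <- anything 'locked' with this is open to the attacker
# nonce = 0x00000007
```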

This is also why it mattered to verify that TrueCrypt 7.1 could be compiled with a known compiler and certain settings to get a binary with the same fingerprint as the one distributed. The binary distribution could have been corrupted intentionally or by a malicious compiler.
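
The verification step itself is just comparing fingerprints of the binary you built against the one that was shipped, assuming the build is reproducible bit-for-bit; something along these lines (file names are placeholders):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so a large binary doesn't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

built = sha256_of("truecrypt-7.1a-built-locally.exe")     # placeholder path
shipped = sha256_of("truecrypt-7.1a-as-distributed.exe")  # placeholder path

print("MATCH" if built == shipped else "MISMATCH: the binaries differ")
```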

Comment Re:How about the build tools and the OS? (Score 1) 131

And for compiling something like a basic C compiler, one could feasibly write their own in ASM from a base like CC500 (a 600-ish line C compiler). Use said custom compiler to build something like PintOS (full code review is possible for one person; I had to do so in collegiate OS courses) on a micro that is running nothing but your compiler, fed over an RS-232 port that you are monitoring with a logic analyzer (to watch out for stray data from the 'host' computer at this point). This gets you up to an OS and compiler on your chip and board of choice, though you may need a bootloader. From there, you could compile the rest of a known toolchain, like GCC and all its accompanying tools, if you've reviewed them satisfactorily.

As for trusting your hardware: good luck, you'll need it. Even if you can get a copy of the lithos used in producing your chip, you will have just a statistical analysis of the chance of a spy in your chip, since you can't just decap it and dissolve the layers to make sure. Perhaps with the lithos in hand you could get custom-made chips, but that's not going to be any 'big iron' like an x86-64. So you've shifted the needed trust down to just the silicon (and microcode if needed), which is comparatively harder for an individual to make on their own. I suppose you could mimic the CPU on an FPGA or PLC, but then you are back to "trusting trust" that the compiler didn't recognize something and stuff it into the binary.

It still amuses me that the shift from analog devices to digital shifted where the specialization was required so drastically. A 555 could be built from a handful of discrete components (resistors were just long lengths of wire, capacitors were just two plates with a gap, and diodes were whiskers; transistors were the exception), but programming analog devices was considered a black magic art. Now, with IDEs and reference books most people can write some code if they sit down and follow a book (like building a crystal radio from a kit back in the "before my time" days) but building the hardware at the most basic level (logic gates on silicon) is magic beyond all but a few.

Comment Re:Pointless (Score 1) 131

So what if there is? Assuming that your organization did audit 7.1 and found no problem, what makes it a risk now? Sure, you wouldn't want to migrate to 7.2 in a year's time, and any fork from 7.1 would require a new audit; but I would hope that if you put that much effort into it, you would audit 7.2 or any further fork internally as well, which would leave you with either a 'this is clean' or a 'this is fishy' answer.

I don't doubt that many large organizations are looking at directions to migrate, since the 7.1 public audit won't be done for a while and the security of even the old version has been thrown into question (and a cursory audit by even crypto pros can miss things), so the lack of trust seems obvious. I just don't understand the sudden increase in distrust compared to "hey, this code by two guys we don't know provides some pretty heavy encryption that takes a Ph.D. in maths to understand and check." I do, however, understand the need of a large corporation to plan a future migration, and that knowing what you'll be using next year or in 5 years is important, and the audit of 7.1 might not be finished, or may have turned up flaws, by then. It's the short-term trust change that I don't get.
