
Comment: Re:That makes no sense. (Score 1) 180

by muridae (#47428777) Attached to: Ask Slashdot: Best Dedicated Low Power Embedded Dev System Choice?

We don't know what the OP is attempting to compile. It might be just some code to run inside the Arduino bootloader, or it might be a whole muLinux plus a local GCC for the target. Heck, the Debian distro for the BBB might not ship a cross-compiler binary for the target muC, which would mean compiling GCC itself before compiling the code (worst case). There is also the possibility of the OP's chips not being supported by GCC at all. For example, I just picked up some Cypress PSoC boards; the tools for them are currently only available on Windows. Not ideal, but hand-writing C so GCC could mangle the analog portions of the PSoC would be even more uncomfortable, imho.

Unknown target devices and unknown tools leave me suggesting something a little more capable than an embedded device. While a Windows x86 machine might not be ideal for all folks, it might be necessary for some target devices' toolchains. I have a several-year-old quad-core i7 with a GeForce 560 and spinning drives that runs on less than 100 watts; one could use an older second-hand laptop like that, or a newer, even lower-power one with an SSD. It's not what the OP asked about, but the OP also didn't provide enough information to presume their target embedded muC will always be the same. And for the general case of other readers, I think a small laptop is still the best choice.

Sorry for all the mu, I can't seem to get the html for μ to work

Comment: Re:That makes no sense. (Score 1) 180

by muridae (#47423689) Attached to: Ask Slashdot: Best Dedicated Low Power Embedded Dev System Choice?
This, 100 times this.

Even an old 100 watt laptop will compile your code many times faster than something like the BeagleBone Black or Raspberry Pi will. A gig of RAM and swap space, something embedded systems don't normally have, will make a huge difference. Just throw a small SSD (or boot from a USB stick) into an old 2-core i3 with a crap graphics card, and watch your code compile faster than a 5 watt embedded device can even launch your IDE or get through the first source file.

Comment: Re:Is it still braindeadly single-threaded? (Score 1) 138

by muridae (#47423519) Attached to: Dwarf Fortress Gets Biggest Update In Years

He's a maths Ph.D., not a Computer Science or IT BS.

I think that, had DF been programmed in a language that was more maths-friendly, and less "write it like algebra on paper and let the compiler do magic" C, the code efficiency might be drastically better. But each time he's been offered help with someone else writing a UI, he has grasped that the UI needs a stable game API to make calls to, or a means to pass messages; the game is not in that state, and he isn't going to redo the UI each major version to keep it up to date. The same applies to pathfinding: it's not his field of maths (go read his thesis, it's very far from code-related). A threaded pathfinding library has been offered to him, but since the objects change and the map representation changes, he presented some past problems that libraries couldn't cope with, and the coders reading the discussion (myself included) wanted to launch kitten and whale rail guns.

Hell, the tiles of his maps already store so much data that it would make almost as much sense to use a memory-intensive pathfinding scheme and let each tile store the distance to the closest (gem/food/water/workshop of each type/etc.) as to have each dwarf parse the map tree every turn to avoid running into the others.
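To be concrete about what I mean: the "each tile knows the closest X" idea is just a multi-source flood fill. This is a minimal sketch of the technique (hypothetical code, nothing to do with DF's actual internals): run one BFS from all the food stockpiles at once, and every walkable tile ends up with its distance to the nearest one. A dwarf then just steps downhill in the field instead of running its own search each turn.

```python
from collections import deque

def distance_field(grid, sources):
    """Multi-source BFS. For every walkable tile, compute the distance
    to the nearest source tile (e.g. nearest food stockpile).
    grid[y][x] == 0 means walkable; sources is a list of (y, x)."""
    h, w = len(grid), len(grid[0])
    dist = [[None] * w for _ in range(h)]
    queue = deque()
    for (y, x) in sources:
        dist[y][x] = 0
        queue.append((y, x))
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w \
                    and grid[ny][nx] == 0 and dist[ny][nx] is None:
                dist[ny][nx] = dist[y][x] + 1
                queue.append((ny, nx))
    return dist

# 3x3 map with a wall in the middle, one stockpile at the top-left:
field = distance_field([[0, 0, 0], [0, 1, 0], [0, 0, 0]], [(0, 0)])
print(field[2][2])  # 4: the path has to go around the wall at (1, 1)
```

The trade-off is exactly the one I mentioned: one field per resource type costs memory proportional to map size, but lookups per dwarf per turn become O(1).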

Comment: Re:These don't seem "critical" (Score 1) 50

There is much that is NOT public knowledge.

For example *snip* How many milliseconds on and milliseconds off must I send the command to end up 120 degrees out of phase? What kind of out-of-phase protection relay is in service?

360 degrees of phase every 1/60th of a second. 120 degrees is 1/3rd of that, so a minimum of 1/3 of 1/60, or 1/180th of a second... that much would be common knowledge to anyone in the USA who hears 60-cycle hum on electrical lines or as line noise. Whether that is how long the breaker would need to be open to get the generator that far out of phase, I don't know. But I really want to dig through this paper and see if the person I know who does know how long it takes, and who warned the DOE and DHS about it, is mentioned by name.
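The arithmetic in the previous paragraph, spelled out (just the back-of-envelope timing, not a claim about any actual relay or breaker behavior):

```python
FREQ_HZ = 60.0                        # US grid frequency
CYCLE_MS = 1000.0 / FREQ_HZ           # one full 360-degree cycle, ~16.67 ms

def offset_ms(degrees):
    """Time for the waveform to advance by the given phase angle."""
    return CYCLE_MS * degrees / 360.0

print(round(offset_ms(120), 2))  # 5.56 -- i.e. 1/180th of a second
```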

Comment: Re: Have 2 keys with different uses (Score 1) 560

by muridae (#47328925) Attached to: Mass. Supreme Court Says Defendant Can Be Compelled To Decrypt Data

This kind of thing is where having two different keys for an encrypted volume would be good: one key for personal use, and another key for use when under duress.

The normal key would decrypt the volume for you to use as usual, and the "duress" key would cause the volume to automatically perform a secure wipe of the volume file.

What ever you do, if you think you are under legal duress, DO NOT DO THIS!

The first thing even the local cops are trained to do is make an image of a volume. They do this for two reasons. The first is legal: don't destroy evidence that the defense may call into question, and keep a chain of custody. The second is that people have tried exactly what you suggest. If you gave them a password that destroyed evidence, then even though they still had the original drive image, you have "attempted to destroy evidence" and interfered with the investigation, and probably a few other things that will add to your sentence; and you provided the evidence of that crime just by doing it in front of the police!

The TrueCrypt "two encrypted areas in a single visible volume" trick is fine if the police cannot prove, by digging through Windows, that you regularly keep two encrypted drives mounted (say X: and Y:). If you always mount just one as drive X: and there are no other OS signs that the volume is used for multiple purposes (drive UUIDs are unfortunately telling), then you are safer. But if you normally keep X: and Y: mounted, they'll want proof of what Y: is; saying it's a USB stick, letting them plug it in, and seeing it automount to G: with no other IDs matching would be . . . unpleasant.

Comment: Re:I lost the password (Score 1) 560

by muridae (#47328879) Attached to: Mass. Supreme Court Says Defendant Can Be Compelled To Decrypt Data

Destruction of evidence, hindering a police investigation, and so on. And unless it is done at the flash memory chip level, they could get an image of the data.

What might be useful is something like an old article I read on randomly changing root's password on a *nix system, so it was next to impossible to log in as root. A user with permission could still use sudo to do what was needed, and "sudo -i" would still be available to an admin. Or the practice of changing the encryption key for /tmp or the swap partition on every reboot: the user has no way to recover the data from those locations once a reboot has occurred. A secure encryption method that keeps even the intended user out would be a very hard thing to sell, unfortunately.
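The core of that old trick is trivially small. A minimal sketch of the idea (my own hypothetical illustration, not the article's actual script): generate a throwaway random password on a schedule, set it as root's, and discard it, so nobody, including you, can log in as root directly, while sudo keeps working.

```python
import secrets
import string

def scramble_password(length=32):
    """Generate a cryptographically random throwaway password.
    The article's idea: pipe something like 'root:<this>' to chpasswd
    every few minutes and never record it, effectively disabling
    direct root login. (The chpasswd plumbing is omitted here.)"""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(scramble_password())  # e.g. 'kQ3xB...' -- different every run
```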

Comment: Re:That's not what I took away from this... (Score 1) 347

by muridae (#47311253) Attached to: Evidence of a Correction To the Speed of Light

Franson's idea, as I understand it, is that during the small window between creation and annihilation, the massive particles are under the influence of gravity, which bleeds off energy. When the pair recombines, it results in a reduced velocity of the photon.

I read it as just barely changing the vector of the light, not the velocity. All photons travel at c, but gravity could make the path of travel curve more than previously thought.

Comment: Re:Light odyssey (Score 1) 347

by muridae (#47311229) Attached to: Evidence of a Correction To the Speed of Light
Neutrinos have virtual particle interactions as well. Only low-energy photons seem not to (at least, those are the diagrams I remember; maybe they do too).
So ALL THINGS are made by the devil except for infrared light and AM radio. Seems to explain why looking at things makes you question stuff (visible light is a lie!) and only AM radio tells the truth.

Comment: Re:Is there a 'less nerdy version'? (Score 1) 347

by muridae (#47311201) Attached to: Evidence of a Correction To the Speed of Light
I'd correct the photon "zig-zagging".
The guy is saying that we know a photon can very briefly turn into an electron and its friend, a positron. They almost instantly turn back, but since the electron and its friend have mass, unlike a photon of light, they are affected by gravity more.
So, if the author is right, the light we saw took a slightly different path to get to us. Just a little bit different, but enough to add an hour over the course of 163,000 years.

Comment: Re:What gets corrected? (Score 1) 347

by muridae (#47311149) Attached to: Evidence of a Correction To the Speed of Light

Yup, it only affects a small percentage of photons for a very brief time. Schrödinger's equation and the rest of QED let you work out how many photons in a given burst will do this over X amount of time. For most of our observations, in laser labs and at other "short" distances, the effect shouldn't even be noticeable. But it might change astronomical measurements by a good bit. (Well, 1.7 hours over 168,000 years is about 1x10^-9, more or less, since light-years traveled and years traveled aren't identical at that distance due to expansion effects.)
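Checking my own 1x10^-9 figure (ignoring the expansion caveat, and using the 168,000-year trip time from the parenthetical):

```python
DELAY_HOURS = 1.7
TRIP_YEARS = 168_000
HOURS_PER_YEAR = 365.25 * 24        # ~8766 hours per Julian year

# Fractional delay: 1.7 hours out of a 168,000-year trip.
fraction = DELAY_HOURS / (TRIP_YEARS * HOURS_PER_YEAR)
print(f"{fraction:.2e}")  # 1.15e-09 -- the "1x10^-9, more or less"
```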

And if it affects photons, it should affect other particles as well. Maybe it explains the two neutrino bursts: one burst traveled in a straight line while the other had a virtual particle interaction.

Comment: Re:Ummm (Score 1) 347

by muridae (#47311013) Attached to: Evidence of a Correction To the Speed of Light

That's what the article seems to suggest, yes. And that the virtual particle pair, if they exist for any real length of time, would move at less than c for their short lifespan. But the major change from Earth's perspective is that the gamma rays we saw did not travel in the same straight line that the neutrinos did.

That might also explain the second neutrino burst (maybe; a wild guess from a programmer). If some of the neutrinos went through a virtual particle state (Z boson, I think?) then they would also arrive at a different time. That would account for the neutrinos that made the trip with no virtual particle interactions, those that slowed down due to the mass of Z boson interactions, and, per the research summarized in the article, all the gamma rays that went through a virtual particle phase and dealt with gravity. If it all works that way, it would be beautiful science explaining more things we thought we understood. Ahh, science!

Comment: Re:Ummm (Score 1) 347

by muridae (#47310953) Attached to: Evidence of a Correction To the Speed of Light

Bloody good question. They are called virtual particles, though. If forced to answer, I would suspect that for an observer traveling fast enough to blue-shift the light that far, 50,000 times in wavelength (taking a 500 nm green down to a 10 picometer gamma, i.e. 50,000x to 100,000x the energy in keV terms), the required speed would be a good portion of c, and the resulting contraction would reduce the apparent distance covered to something that no longer offers a high enough chance of a virtual particle interaction.

But that's just me making stuff up and pulling a WAG.
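For what the WAG is worth, the "good portion of c" part at least checks out. Assuming a head-on approach and the standard relativistic longitudinal Doppler formula f_obs/f_src = sqrt((1+b)/(1-b)), solving for b at a 50,000x shift:

```python
# Wavelength ratio for 500 nm green light blue-shifted to a 10 pm
# gamma ray; for light this equals the frequency/Doppler ratio.
D = 500e-9 / 10e-12                 # = 50,000

# Solve sqrt((1+b)/(1-b)) = D for b = v/c:
beta = (D**2 - 1) / (D**2 + 1)
print(beta)  # ~0.9999999992 -- within about 8 parts in 10^10 of c
```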

Comment: Re:Ummm (Score 1) 347

by muridae (#47310841) Attached to: Evidence of a Correction To the Speed of Light

No, we aren't talking about visible photons. The emissions from the supernova were neutrinos and a gamma ray burst; the visible light still travels separately because of the other things in space it interacts with that are transparent to gamma energies and above. But yes, over the very large distance between us and the supernova it was not just a few photons that traveled at less than c for some time; the chance rose high enough that it was nearly all of them.

All EM radiation travels at the speed of light. High energy photons can, briefly, become virtual particle pairs that do not travel at the speed of light. The article's author noted that the chance, over the time and distance between us and this specific supernova, was high enough to account for all of the gamma ray and higher energy photons traveling as particle pairs for some part of their trip, and that time would account for the known delay. This only applies to gamma rays above (I think) 511 keV (the energy of one of the gamma rays emitted in an electron-positron annihilation; it might need to be 1022 keV for a single ray to form both particles from a single photon; ask a particle physicist, not a programmer like me). According to Alpha, a 500 nm green photon carries only around 2.5 eV. Violet light gets up to 3 eV and a little higher; still not nearly enough to create any particle. E=mc^2, so you need a good deal of energy just to create even a very tiny electron.
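A quick sanity check on those numbers, using E = hc/lambda (hc is about 1239.84 eV*nm) and the 511 keV electron rest mass energy:

```python
HC_EV_NM = 1239.84            # Planck constant times c, in eV * nm
ELECTRON_REST_KEV = 511.0     # electron rest mass energy

def photon_energy_ev(wavelength_nm):
    """Photon energy in eV from its wavelength in nm."""
    return HC_EV_NM / wavelength_nm

green_ev = photon_energy_ev(500)              # ~2.48 eV for green light
threshold_ev = 2 * ELECTRON_REST_KEV * 1000   # 1022 keV for an e-/e+ pair

# Green light falls short of the pair threshold by a factor of ~4x10^5:
print(green_ev, threshold_ev / green_ev)
```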

Comment: Re:Don't mess with "c" (Score 1) 347

by muridae (#47310733) Attached to: Evidence of a Correction To the Speed of Light

It isn't a fixed length of time or distance (the same thing at the speed of light in a vacuum, excepting spatial expansion). It's a statistical chance: each high energy photon has a chance, at each and every point in time, to split into an electron-positron pair (annihilation of such a pair creates gamma and higher photons, so it should only be those photons that split), and then the pair travels for some time, being affected by gravity and all the other forces, before recombining into a photon.

That's my complex way of saying "eh, I dunno; I got far enough in physics, but that's above my head to figure out." If you want to understand the Schrödinger equation, or can find a Feynman diagram that lists the chance over time, good luck. I tried googling phrases I thought would get me an abstract or brief, but came up empty.
