
Comment Who's liable when it fails or backfires? (Score 1) 304

When we make something safer, we can expect more lawsuits, not fewer as one might naively expect. It doesn't matter if we can statistically show that a new technology saves lives. If auto manufacturers put this feature into their vehicles, suddenly it is their fault when the feature fails to prevent an accident, or causes an accident where one would not have occurred. Previously, the driver alone would have been liable.

Comment not paradoxes (Score 1) 231

Those aren't paradoxes. So space is created: how is that a paradox? Did someone say space is not allowed to be created?
So energy is created. That violates conservation of energy, but conservation of energy is just a law we formulated from experience, and later derived via Noether's theorem from the assumption that the laws of physics are invariant under time translation. It's not valid to extrapolate from our small-scale experience to the whole universe, and the laws of physics probably aren't time-invariant at cosmological scales.
Nobody really knows how to calculate the energy of the vacuum, which is why we have to use renormalization. The 10^120 figure is a very rough ballpark from dimensional analysis; there's no solid theory backing it up.
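For what it's worth, the dimensional-analysis ballpark is easy to reproduce: compare the Planck energy density c^7/(hbar*G^2) against the observed dark energy density (~5.3e-10 J/m^3). The exact power you get depends on conventions (factors of 8*pi and so on), which is why quoted values range from 10^120 to 10^123; this is order-of-magnitude numerology, not a theory.

```python
import math

# Order-of-magnitude sketch of the vacuum energy discrepancy; all values SI.
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J*s
G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2

# "Natural" vacuum energy density if the cutoff is the Planck scale:
rho_planck = c**7 / (hbar * G**2)   # J/m^3, ~5e113

# Observed dark energy density (~69% of the critical density):
rho_observed = 5.3e-10              # J/m^3

ratio = rho_planck / rho_observed
print(f"discrepancy ~ 10^{math.log10(ratio):.0f}")   # ~10^123
```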

Comment Re:more simplifications and fewer cats, please (Score 1) 197

I think people are moving away from the Copenhagen interpretation toward alternatives such as consistent histories, decoherence, and many worlds. The Bohmian interpretation is another option, but I find it inelegant, and it doesn't hold much sway.

Personally, I feel that consistent histories* is the best. In this interpretation, the cat is simply dead or alive. We don't know which until we check, but the cat's state didn't change when we opened the box. Note that whatever enforces consistency does not obey causality: the laws of quantum mechanics are essentially symmetric in time (more precisely, under CPT). In some sense the future is "prewritten", though we have no way to measure it, and the current state of the universe is required to be consistent with the future state. So if the cat is dead in the future where the box is opened, it was already dead while the box was closed.

I prefer the "block universe" picture of the universe as a stationary 4D object, since it seems easiest to reconcile with relativity; relativity of simultaneity makes no sense if the future isn't already written. Call this 4D object a history; all the events in a history have to be consistent with the laws of physics. It makes no difference whether you think of the past causing the future or the future causing the past: both are just there, and neither is created from the other. In the many-worlds interpretation, every possible (i.e. consistent) history exists, but I think one is sufficient.

*I might be mistaken on what consistent histories is. My description is my personal interpretation, which might coincide with the definition of consistent histories.

Comment Re:von Neumann probes (Score 1) 391

Travel time. The galaxy is some 100,000 light years across. Using available fuels, what fraction of light speed can a probe hope to achieve? Suppose the probes run on D-D fusion. The reaction releases about 0.4% of the rest energy of the fuel, so a reasonable estimate of the attainable specific impulse is 0.004*c, assuming the probe is mostly fuel. Using the Tsiolkovsky rocket equation, what speed can a probe reasonably reach? That depends on how much fuel the probe takes on at each stop. Assuming a probe mass of 100 g, let's suppose the probe consumes a Jupiter mass of material to create fuel at 10% efficiency at each stop. That gives a delta-v of ~0.25*c. (Since there is a logarithm, the result doesn't change much if we eat a star or a Saturn instead.) Useful cruise speed is half of that. OK, that is still enough to conquer the galaxy in a few million years.
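The delta-v figure is just the classical Tsiolkovsky rocket equation applied to the assumptions above (100 g probe, one Jupiter mass converted to fuel at 10% efficiency, 0.004*c exhaust velocity). A quick sketch:

```python
import math

C = 2.998e8                    # speed of light, m/s

# Assumptions from the estimate above:
v_exhaust = 0.004 * C          # effective exhaust velocity from D-D fusion
m_probe   = 0.1                # dry probe mass, kg (100 g)
m_jupiter = 1.898e27           # mass of Jupiter, kg
m_fuel    = 0.10 * m_jupiter   # 10% conversion efficiency to usable fuel

# Classical rocket equation (fine for a ballpark, even though
# the answer comes out mildly relativistic):
delta_v = v_exhaust * math.log((m_probe + m_fuel) / m_probe)

print(f"delta-v ~ {delta_v / C:.2f} c")   # ~0.25 c
```

Because the mass ratio sits inside a logarithm, swapping Jupiter for a solar mass only nudges the answer by a few percent.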

Multiplication factor: how many probes need to be sent out after each stop so that there are enough to spread over the galaxy in a reasonable time (there are ~10^11 stars)? This is increased by the fact that many probes will fail to reach their destinations for various reasons, so some redundancy is needed. We want a multiplication factor such that the probes cover the galaxy in roughly the same time it takes one probe to travel across it. Assume probes travel at 0.1*c, so it takes 10^6 years to traverse the galaxy. Assume a distance of 20 light years between stops, i.e. 200 travel years per hop. That gives 5000 stops in 10^6 years, so we need a multiplication factor of
f = 1.005 * redundancyFactor.
OK, that's small enough not to make much difference in the resource needs.
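The factor follows from requiring f raised to the number of stops to reach ~10^11 stars. With the numbers above (0.1*c cruise, 20 ly per hop, 10^6 years total):

```python
# Per-stop multiplication factor for galaxy-covering probes,
# using the assumptions from the estimate above.
n_stars     = 1e11    # stars in the galaxy
total_years = 1e6     # time to cross the galaxy at 0.1c
hop_years   = 200     # 20 ly per hop at 0.1c

n_stops = total_years / hop_years    # 5000 generations

# Smallest f with f**n_stops >= n_stars (before any redundancy margin):
f = n_stars ** (1 / n_stops)

print(f"f ~ {f:.4f}")   # ~1.0051
```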

Hmm, I intended to show that this was infeasible, but it still looks like it might be physically possible, given extremely capable probe technology.
