
Submission + - New quantum relativity work (perfdrive.com) 1

jd writes: A new attempt to produce quantum relativity is in the works. This time, the physicists have taken the line that if you allow for faster-than-light particles, you can solve a lot of the difficulties of merging the two theories. But it comes with a consequence.

You end up with three time dimensions and one spatial dimension.

The argument is that special cases constitute the real difficulty in merging the two ideas, so the physicists looked for a way to generalise outside the normal bounds, and to have relativity (despite its classical nature) produce the randomness in quantum mechanics.

From the article:

Quantum mechanics is an incredibly successful theory and yet the statistical nature of its predictions is hard to accept and has been the subject of numerous debates. The notion of inherent randomness, something that happens without any cause, goes against our rational understanding of reality. To add to the puzzle, randomness that appears in non-relativistic quantum theory tacitly respects relativity, for example, it makes instantaneous signaling impossible. Here, we argue that this is because the special theory of relativity can itself account for such a random behavior. We show that the full mathematical structure of the Lorentz transformation, the one which includes the superluminal part, implies the emergence of non-deterministic dynamics, together with complex probability amplitudes and multiple trajectories. This indicates that the connections between the two seemingly different theories are deeper and more subtle than previously thought.
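For context - this is textbook special relativity, not something from the paper - the sticking point is the standard Lorentz factor, which blows up at v = c and turns imaginary beyond it. It's that superluminal branch the authors take seriously:

```latex
% Standard Lorentz transformation (textbook SR, not from the paper).
% For v > c the radicand goes negative, so \gamma is imaginary - the
% branch the superluminal extension has to make sense of.
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad
x' = \gamma\,(x - vt), \qquad
t' = \gamma\!\left(t - \frac{vx}{c^2}\right)
```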

Comment Not a useful question atm (Score 2) 21

There seem to be three standard rules.

1. Anything that is not demonstrably impossible is technically possible.

2. Within the set of what is technically possible, we need only look at the subset of simplest explanations.

3. Within the subset of simplest explanations, we need only consider those for which the level of evidence equals or exceeds the improbability of correctness.

A brain microbiome is technically possible, but it is not in the subset of simplest explanations, nor is the level of evidence sufficient. As such, it fails both the second and third tests.

To me, this does not mean we reject it outright; it means we simply don't consider it for now. We neither accept nor reject, we just put it to one side and see what scientists find in future. It's not a model we can usefully explore or make predictions with that would permit falsification.

Scientists are finding all kinds of new communications channels and behaviours within the brain. Clearly, our knowledge is nowhere near adequate to determine what is required. Let's get that sorted first, and then decide if there is anything left that needs a microbiome explanation.

Comment Competitiveness (Score 4, Informative) 154

The first rule of competitiveness is to eliminate things that are harmful to business. The quality of work done follows a simple curve with an inflection point around hour 7, where productivity is close to zero but not actually negative.

After hour 7, work has an increasing number of errors, such that productivity is actually negative. This work costs the business money. That isn't being competitive, that's being stupid.

Studies show that a four-day working week, with 7 hours of work per day, yields the greatest productivity per unit cost. Either side of that, productivity per unit cost falls.

We also know that such a work week would reduce stress-related health conditions and illnesses.

If you want the highest profit margins, then you pay a fair day's wage on an ideal 28-hour working week, but a 32-35 hour working week (4 days at 8 hours or 5 days at 7 hours) will still boost productivity.
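A back-of-envelope sketch of that claim (the curve and every number here are illustrative assumptions, not data from any study):

```python
# Toy model of the hours-vs-productivity claim above. The marginal
# productivity curve and fatigue factor are invented for illustration.

def daily_output(hours):
    # Marginal productivity ~1.0/hour early in the day, hitting zero
    # around hour 7 and going negative (error cleanup) beyond it.
    return sum(min(1.0, (7 - h) / 3.0) for h in range(int(hours)))

def output_per_wage_hour(days, hours):
    # Mild cumulative fatigue: each successive day in the week is
    # slightly less productive than the one before.
    weekly = sum(daily_output(hours) * (0.98 ** d) for d in range(days))
    return weekly / (days * hours)   # pay assumed proportional to hours

for days, hours in [(5, 9), (5, 8), (5, 7), (4, 8), (4, 7)]:
    print(f"{days}d x {hours}h/week: "
          f"{output_per_wage_hour(days, hours):.3f} output per wage-hour")
```

Under these made-up numbers the 4x7 schedule comes out on top, with 5x7 and 4x8 close behind - which is the shape of the argument, nothing more.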

Infosys' cofounder is a moron.

Comment Wrong approach. (Score 1, Interesting) 60

A much better system would surely be to generate a Class III certificate suitable for a sub-certificate authority, expiring every two years or so, and then generate short-lived certificates (3 months seems fine) signed with that Class III cert.
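A minimal sketch of that hierarchy using Python's "cryptography" package (all names and lifetimes are placeholders, and real deployment details like CSRs, key usage extensions, and revocation are omitted):

```python
# Sketch: a ~2-year sub-CA signed by a root, which in turn signs ~3-month
# leaf certificates. Names and lifetimes are illustrative placeholders.
from datetime import datetime, timedelta

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID


def name(cn):
    return x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, cn)])


def issue(subject_key, subject_cn, issuer_key, issuer_cn, days, is_ca):
    now = datetime.utcnow()
    return (
        x509.CertificateBuilder()
        .subject_name(name(subject_cn))
        .issuer_name(name(issuer_cn))
        .public_key(subject_key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + timedelta(days=days))
        .add_extension(
            x509.BasicConstraints(ca=is_ca, path_length=0 if is_ca else None),
            critical=True,
        )
        .sign(issuer_key, hashes.SHA256())
    )


root_key = ec.generate_private_key(ec.SECP256R1())
sub_key = ec.generate_private_key(ec.SECP256R1())
leaf_key = ec.generate_private_key(ec.SECP256R1())

# Sub-CA valid ~2 years; leaf valid ~3 months, signed locally by the sub-CA.
sub_ca = issue(sub_key, "example sub-CA", root_key, "example root", 730, True)
leaf = issue(leaf_key, "host.example", sub_key, "example sub-CA", 90, False)
```

The point being that leaf renewal only ever touches the local sub-CA key, so there's no single central renewal service to attack.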

The first reason for this is that the renewal system is your vulnerability. If the renewal system is distributed this way, you'd have to break it for every user independently.

The second reason is that you can use GPG's recommended practice of using a different key for each recipient, so you now have to break every sender/recipient pair independently, too.

Furthermore, if you run all connections between two nodes over an encrypted tunnel, an attacker can't even distinguish a renewal from any other operation - the traffic is indistinguishable.

That's the beauty of encrypted tunnels versus just an encrypted packet - attackers have to attack the lot, not just the bits they want.

Comment My predictions. (Score 4, Interesting) 104

Some of these are snarky, but frankly those responsible deserve it.

1. Documentation, particularly of Linux kernel internals and filesystems, will remain poor, leading to many re-invented wheels and a sustained naivety about why things work or don't.

2. Application developers will remain incompetent in testing, resulting in a defect density between 10x and 100x that of the kernel. This will result in an increasing number of embarrassing failures that nobody will take responsibility for, because they regard it as someone else's problem.

3. The likes of IBM, Microsoft, and Oracle will continue to eat away at the edges of Open Source, both for direct profit and - as Oracle has done so well in the past - to cripple competition.

4. The poor state of collaboration and the intense egos will result in projects continuing to implode and die rather than get picked up by others, as happened with both Reiser filesystems and is very likely to happen with bcachefs. It will also result in projects being steered by vanity, rather than good engineering, as happened with GCC prior to EGCS replacing the original codebase.

5. Despite all of this, businesses that can't afford to replace their IT infrastructure will migrate from Windows 10 to Linux on the desktop, and from VMware (where you now have to buy a minimum of 32-core licenses) to open source virtual machines.

6. Linus Torvalds is foreign, a proponent of freedom, and thus the natural opponent of the corporate donors to Trump. Expect him to be amongst those Trump expels.

Comment Bizarre. (Score 3, Interesting) 52

Laser range finding is trivial. You don't look at features to gauge distance from the ground, you measure it directly.
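(Standard time-of-flight ranging, not anything from the article:)

```latex
% Laser ranging: distance from half the round-trip time of the pulse.
d = \frac{c\,\Delta t}{2}
```

so even 1 ns of timing resolution already pins the altitude down to about 15 cm.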

Speed I can sort of understand: you need to measure the movement of something, and the atmosphere is very thin, so points on the ground are the easy option.

If features are hard to spot, then there are really three options - improve the resolution so you can spot and track smaller variations, increase the number of wavelengths examined, or increase the number of wavelengths differentiated. In each case, the aim is to detect a wider range of things that can vary.

These are all doable, but building them radiation-hardened isn't trivial, and the first and third will add to the weight, requiring a larger, heavier, and therefore more energy-hungry drone.

Maybe build two types of drone - one for investigating, and a second that acts as a mobile recharging station, so that your recon drone only needs minimal recharging capabilities.

Comment Re:Wrong problem (Score 1) 152

OK, so let's see what you're saying.

1. The programmer uses doxygen comments to document all calls and all structures (see the sketch after this list)
2. The programmer uses flowcharting software to document modules
3. The programmer produces a diagram showing how modules are intended to interact
4. The programmer documents the network APIs for RPC or CORBA via markup as above
5. The programmer documents other in-house network structures via markup as above
6. The programmer writes MAN pages for any command-line services
7. The programmer writes a basic user interface guide for internal use
8. The DBA documents database tables and database topology
9. The software architect produces a document detailing the specification, and a second document analysing how the above meets that specification

These are the basics.
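As a sketch of what item 1's markup buys you (hypothetical function, using Doxygen's "##" comment blocks for Python):

```python
## @brief Compute the total owed on one invoice.
#
#  Hypothetical example of item 1's doxygen markup; the function and its
#  parameters are invented for illustration.
#
#  @param line_items  Iterable of (quantity, unit_price) pairs.
#  @param tax_rate    Fractional tax rate, e.g. 0.08 for 8%.
#  @return            Total owed, including tax.
def invoice_total(line_items, tax_rate):
    subtotal = sum(qty * price for qty, price in line_items)
    return subtotal * (1.0 + tax_rate)
```

Run through doxygen, that comment becomes the call documentation the technical writer consolidates, rather than something they have to reverse-engineer.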

This leaves the technical writer with converting the above documentation into a consolidated developer's guide, a consolidated user guide, and a white paper on how the software meets business needs.

You need precisely how many technical writers to do this?

One per application team, tops, and you can probably consolidate that further.

If you assumed more, you didn't read my post, and that's kinda your problem, not mine.

If companies don't have enough writers? Still not my problem; I don't tell companies how to work around their sheer incompetence and inadequate staffing levels. I tell companies how to do it right.

Comment Re:Easy Workaround (Score 1) 123

That alone is not enough. You'd have to also block all multicast. Multicast doesn't use endpoints and can be tunnelled over.

You'd also have to block Network Mobility (NEMO) and the use of Home IPs for IP mobility.

Even that's not enough. You'd have to disable DNS forwarding across the gap, as you can tunnel over DNS.
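To see why, here's a toy encoder (hypothetical domain; real tunnels such as iodine are far more sophisticated) showing how arbitrary bytes ride out in query labels through any resolver that forwards across the gap:

```python
# Toy DNS-tunnelling encoder: smuggle bytes into DNS query labels.
# The domain is a placeholder; a real tunnel needs a colluding
# authoritative server on the far side to decode the labels.
import base64

def encode_query(payload: bytes, domain: str = "t.example.com") -> str:
    # Base32 keeps the data DNS-safe; split into labels of at most 63 bytes.
    b32 = base64.b32encode(payload).decode().rstrip("=").lower()
    labels = [b32[i:i + 63] for i in range(0, len(b32), 63)]
    return ".".join(labels + [domain])

# Prints an ordinary-looking, resolvable DNS name carrying the payload.
print(encode_query(b"exfiltrated secret"))
```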

Because software routing is trivial, it would also be necessary to block any roofnet that circumvented the block.

Network bisection on a self-healing system filled with security holes and protocol-permitted reuse is not trivial.

Comment Wrong problem (Score 4, Informative) 152

1. Developers should NEVER test their own code beyond basics, they are too familiar with the code and cannot be trusted to test it correctly.

2. Developers should RARELY document their own code, beyond markup for documentation systems. The design SHOULD have been written long before any code was written, and user guides that don't describe the integrated system aren't helpful. You want technical writers for this.

3. The time spent coding is irrelevant. If the design was poor, then most of that time is spent on debugging and technical debt. Furthermore, different languages take different amounts of code to do the same thing. There are only two useful numbers: the percentage of hindrances eliminated, and the degree of improvement. In the words of Metallica, Nothing Else Matters.

Comment Hmmm. (Score 1) 13

We've a wide variety of quantum gas clocks which measure time somewhere between a hundred and a thousand times better than the caesium clock.

https://en.m.wikipedia.org/wik...
https://phys.org/news/2017-10-...

True, this isn't a million times more accurate, but it's still pretty good. You really do need clocks of extremely high precision to do experiments on quantum relativity or dark matter, because both of these involve incredibly small deviations.

It would presumably also improve gravitational wave detectors, as they'd be able to detect even minuscule fluctuations.

In terms of regular relativity, quantum gas clocks can measure the time dilation changes caused by moving the clock up a single flight of stairs, so around 10-15 feet. Presumably, you could use this to map the gravitational variations over the Earth's surface, but I've no idea how this compares in resolution to traditional gravitational mapping.
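The flight-of-stairs figure checks out against the weak-field formula (textbook physics, not from the linked articles):

```latex
% Fractional frequency shift between clocks separated by height \Delta h:
\frac{\Delta\tau}{\tau} \approx \frac{g\,\Delta h}{c^{2}}
  = \frac{(9.8\ \mathrm{m/s^2})(3\ \mathrm{m})}{(3\times10^{8}\ \mathrm{m/s})^{2}}
  \approx 3\times10^{-16}
```

which clocks at the 10^-18 fractional level resolve with a couple of orders of magnitude to spare.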

If it's better, then orbiting Mars or Venus with a nuclear clock would be very interesting.

Various articles talk about improving GPS, but if we don't have a gravitational map of the planet of sufficient resolution, then the noise introduced by such fluctuations would drown out most of the benefits, surely.

Comment Re:FUD (Score 1) 136

The end of the ice age, in the sense meant by those ice cores, would have been 12,000 years ago. This warming is completely unrelated and is far, far faster.

My information is from peer-reviewed papers, academic books published by and for academics, and science conferences. This data was examined by scientists paid by the Koch Brothers to find errors, but those scientists concluded the data was absolutely correct and beyond reproach.

If even the academics paid for by the "skeptics" concur that this is unrelated to the ice age but is entirely due to CO2 output by humans, you've lost the argument.
