
Comment The objection ignores Bostrom's basic argument (Score 5, Insightful) 414

The objection in question ignores Bostrom's basic argument. Bostrom's primary argument for our being in a simulation boils down to the observation that an advanced civilization would very likely have the ability to run highly accurate simulations. Moreover, one of the things such a civilization would obviously be interested in is its own past ancestors. If that's the case, then over the very long period such civilizations exist, one should expect many more simulated "copies" of people on ancient Earth than originals, unless one expects civilization to die out well before reaching that technology level. And if the laws of physics were simulated badly enough for us to notice, the simulators wouldn't be running an effective ancestor simulation in the first place, so the objection here doesn't make sense.

There are a lot of issues with Bostrom's argument; for example, one might question whether simulations at that level of detail could ever be run at scale. But the argument being made here doesn't grapple with the fundamental issues.

Comment Re:FTLOG -WHY SHOULD WE CARE? (Score 1) 90

Sure, it is possible that these will turn out to be useful and cheap enough to become common household devices. My comment was about what is likely. And if that does happen, it will be very far off (much as about 40 years passed between that IBM remark and personal computers becoming commonplace). For quantum computers to be common enough for personal use, two things need to happen. First, the machines need to be small and cheap enough to fit in the home attached to a classical computer; one might imagine something like a special quantum coprocessor for the tasks that can take advantage of it. Second, and this is really important: we would need quantum algorithms for common tasks that a classical computer does more slowly. Right now, most of the things a quantum computer can do asymptotically better than a classical computer are highly specialized, like factoring integers, which most people don't need. If we found good quantum algorithms for things like graphics processing, that might change.

Comment Re:Quantum supremacy tests will come first (Score 5, Informative) 90

Because if I understand quantum theory correctly, it both works, and doesn't. There is no measurement for a half binary state in a binary world of absolute on and off.

I'm not sure what you mean by "it" here, but pretty much every interpretation of this is wrong. In fact, measurements of quantum superpositions do return specific classical states, with probabilities determined by the amplitudes of the superposition.
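To make that concrete, here's a minimal sketch (the amplitudes are illustrative values I've chosen, not anything from the discussion above) of what measuring a qubit in a superposition looks like: each individual measurement returns a definite classical bit, never a "half binary state", with probabilities given by the squared amplitudes.

```python
import random

def measure(alpha, beta):
    """Measure a qubit in the state alpha|0> + beta|1>.

    Returns a definite classical bit: 0 with probability |alpha|^2,
    1 with probability |beta|^2. There is no 'half on' outcome."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition (amplitude 1/sqrt(2) on each basis state):
# every single measurement yields exactly 0 or exactly 1; only over
# many repetitions do the probabilities show up.
amp = 2 ** -0.5
results = [measure(amp, amp) for _ in range(10000)]
print(sum(results) / len(results))  # close to 0.5
```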

I think pursuing analogue supercomputers might be a better place to start.

We have specific theorems about what analog classical computers can do. See for example http://www.sciencedirect.com/science/article/pii/0196885888900048 and https://arxiv.org/abs/quant-ph/0502072. In general, analog computers cannot do error correction, and when used for optimization they easily get stuck in local minima.

A more reasonable argument would be "We need more money to continue milking this quantum cow that never produces anything."

Quantum computing is still in its infancy and is best thought of as basic research. But even given that, there's been massive improvement in the last few years, both in physical implementations (how many entangled qubits one can control) and in understanding of the broader theory. One area where both the experimental and theoretical ends have seen major progress is quantum error correction https://en.wikipedia.org/wiki/Quantum_error_correction.
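The simplest classical ancestor of those error-correction schemes, the repetition code with majority-vote decoding, gives the basic flavor (the genuinely hard part in the quantum case, which this sketch doesn't capture, is protecting superpositions without measuring and destroying them):

```python
from collections import Counter

def encode(bit, n=3):
    """Repetition code: store n redundant copies of the bit."""
    return [bit] * n

def decode(noisy_copies):
    """Majority vote recovers the bit as long as fewer than half
    of the copies were flipped by noise."""
    return Counter(noisy_copies).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1          # a single bit-flip error
print(decode(codeword))   # 1 -- the error is corrected
```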

Comment Re:FTLOG -WHY SHOULD WE CARE? (Score 4, Informative) 90

It is unlikely that most people will see quantum computers in their day-to-day lives. But they will see the many improvements those computers enable. For example, there are strong reasons to think quantum computers will make chemistry simulations easier, yielding interesting new results in many contexts, including medicine. For similar reasons, one expects quantum computers will make it easier to design better classical computers.

Comment Quantum supremacy tests will come first (Score 5, Interesting) 90

One of the major issues is the need for actual empirical evidence that quantum computers can do things classical computers cannot within reasonable time constraints. Right now, the general consensus is that, if we understand the laws of physics correctly, this should be the case, but there are some very prominent holdouts who are convinced that quantum computing will not scale. Gil Kalai is the most prominent https://gilkalai.wordpress.com/2014/03/18/why-quantum-computers-cannot-work-the-movie/. It is likely that we will have answered this question before any 50-qubit quantum computer exists. The most likely answer will come from boson sampling systems https://en.wikipedia.org/wiki/Boson_sampling, which in their simplest form give information about the behavior of photons scattered in a simple way. Scott Aaronson and Alex Arkhipov showed that if a classical computer could efficiently duplicate boson sampling with only a small increase in time, then some existing conjectures in classical computational complexity would have to be false. (In particular, the polynomial hierarchy would have to collapse, and we're generally confident that isn't the case.) Boson sampling is much easier to implement than a universal quantum computer, although no one has any practical use for boson sampling at present.
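The hardness in the Aaronson-Arkhipov argument comes from the fact that boson sampling output probabilities are proportional to |permanent|^2 of submatrices of the interferometer's unitary, and computing the permanent is #P-hard. As a small illustration of the quantity involved (not of boson sampling itself), here is Ryser's formula for the permanent, the best-known exact classical approach, which still takes exponential time:

```python
from itertools import combinations

def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula, O(2^n * n^2).

    Like the determinant but with all signs positive; unlike the
    determinant, no polynomial-time algorithm for it is known."""
    n = len(A)
    total = 0
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            prod = 1
            for i in range(n):
                prod *= sum(A[i][j] for j in S)
            total += (-1) ** k * prod
    return (-1) ** n * total

# For a 2x2 matrix [[a, b], [c, d]] the permanent is a*d + b*c.
print(permanent([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```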

All of that said, the "a few years" in the article is critical: it isn't plausible that a 50-qubit universal system will be sold in 5 years, but 10 or 20 years is plausible. It also isn't completely clear how practically useful a 50-qubit system would be. At a few hundred qubits one is clearly in the realm of direct practical applications, but 50 sits in a fuzzy range.

Comment All my friends in NSA are looking (Score 5, Interesting) 251

This is completely anecdotal: I'm a mathematician and I know a lot of people who work for the NSA. Almost every one of them is right now quietly or not so quietly looking for other work. At least one has an undated resignation letter in their desk, ready to go if they are asked to do anything they find morally questionable (and this is someone who has generally defended the NSA's actions in the past). Morale at the NSA right now is in a massive slump.

Comment Makes the proposed SLS mission even more of a waste. (Score 2) 195

There's a proposal for the first SLS mission to be an around-the-Moon shot http://jalopnik.com/nasa-may-send-astronauts-around-the-moon-on-the-first-t-1792586594. There are a lot of problems with this; Amy Shira Teitel discusses them in detail https://www.youtube.com/watch?v=cdrEzIlecIk&t=3s. This news would make it even more of a bad idea. The SLS mission proposal is already highly unsafe, redundant, and not part of a coherent program; this would make it doubly redundant.

Comment Here's what it means (Score 4, Informative) 167

Here's what it means: one major aspect of modern cryptography is the "hash function", a function with the property that, in general, two inputs with even very small differences give radically different outputs. Ideally, a hash function also makes it hard to find "collisions": two distinct inputs that produce the same output. Hash schemes are used for a variety of purposes, including determining whether a file is what it claims to be (by checking that the file has the correct hash value).
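That "small difference in, radically different output" property can be seen directly with Python's standard hashlib (SHA-256 is used here purely as an illustration; the input strings are my own examples):

```python
import hashlib

# Two inputs differing in a single character...
a = hashlib.sha256(b"the quick brown fox").hexdigest()
b = hashlib.sha256(b"the quick brown fix").hexdigest()

# ...produce digests that differ in roughly half of their 256 bits
# (the "avalanche effect").
print(a)
print(b)
diff_bits = bin(int(a, 16) ^ int(b, 16)).count("1")
print(diff_bits, "of 256 bits differ")
```

Verifying a downloaded file works the same way: hash the file's bytes and compare the digest against the published value.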

Every few years, an existing hash system gets broken and needs to be replaced. MD5 is an example: it was very popular and then had to be replaced.

One of the major hash schemes in current use is SHA-1. A few days ago, however, a group from Google described an attack that allowed them to find collisions in SHA-1 easily ("easily" is relative: the computational resources required were still substantial). The group released evidence that they could do so but didn't describe the method in full detail; they gave an example of two files with a SHA-1 collision and described some of the theory behind the attack. What TFS is talking about is that, based on this, others have since managed to duplicate the attack, and some have made even more efficient variants of it; effectively, this attack is now in the wild.
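Checking a claimed collision pair is easy even though finding one is hard; a minimal sketch (the helper function name is mine, not from any library):

```python
import hashlib

def verify_collision(data1, data2, algorithm="sha1"):
    """Return True if two *distinct* byte strings hash to the same
    digest under the given algorithm -- i.e. they form a collision."""
    if data1 == data2:
        return False  # identical inputs are not a collision
    h1 = hashlib.new(algorithm, data1).hexdigest()
    h2 = hashlib.new(algorithm, data2).hexdigest()
    return h1 == h2

# Google's published pair (shattered-1.pdf / shattered-2.pdf) would
# return True here; ordinary distinct inputs return False.
print(verify_collision(b"hello", b"world"))  # False
```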

Comment Re:What should happen and what will happen (Score 1) 143

If you are a large organization, you can afford more.

Yes, but the point is the way it scales. If you are tiny, you can reasonably assume that almost no occasions will arise when you need to compute multiple hashes in a short span of time. If you are larger, you end up with a lot of extra RAM that you won't use regularly but will need during peak log-in times. I agree that you can probably afford more, but getting corporations to do so is difficult; at the end of the day, everyone cares about their bottom line.
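A back-of-the-envelope sketch of that scaling (all numbers are illustrative, chosen by me, not recommendations): with a memory-hard password hash, peak RAM is roughly concurrent logins times memory per hash.

```python
def peak_ram_gb(concurrent_logins, mem_per_hash_mib):
    """RAM (in GB, using 1 GB = 1024 MiB) needed to service a burst of
    simultaneous password hashes with a memory-hard function at a
    scrypt/Argon2-style memory parameter. Numbers are illustrative."""
    return concurrent_logins * mem_per_hash_mib / 1024

# Tiny service: bursts of ~5 logins at 64 MiB per hash.
print(peak_ram_gb(5, 64))     # 0.3125 GB -- negligible
# Large service: bursts of ~2000 logins at the same setting.
print(peak_ram_gb(2000, 64))  # 125.0 GB -- dedicated capacity that
                              # sits idle outside peak log-in times
```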

RSA is old, broken crypto which should be migrated away from.

This suggests that you have some very opinionated and somewhat idiosyncratic views.

I hate to resort to appeal to authority, but the actual analysis required to prove it is way more effort than I have time for this morning. Take a look at keylength.com, it has a host of authoritative references.

I'm familiar with many of the references there, so if there are specific ones you'd like to point to (given the large number), that would be helpful. But I will note that what they say largely agrees with what I wrote earlier, in that they explicitly say they are trying to provide key sizes for a desired level of protection.

It's a valid counterexample because RSA key generation, and, to a much lesser extent, RSA private key operations, are computationally expensive enough to stress low-end devices (an issue I often have to deal with... I'm responsible for some of the core crypto subsystems in Android). But it's a weak counterexample because RSA is not modern crypto. It's ancient, outmoded, we have some reasons to suspect that factoring may not be NP hard, using it correctly is fraught with pitfalls, and it's ridiculously expensive computationally. And even still, the common standard of 2048-bit keys is secure for quite some time to come. As that stackoverflow article you linked mentions, the tendency has been to choose much larger-than-required keys (not barely large enough) rather than tracking Moore's law.

As discussed in the same stackexchange link, the key-size choice is due to infrastructural reasons (and in fact I specifically mentioned that in the part of my comment above which you apparently decided not to quote). What actually happens is that we use keys larger than required and then use them for a *long time* before jumping to larger key sizes when we really need to. Again, the failure to perfectly track Moore's law (or even improvements in algorithms) is infrastructural, and similar issues will apply to many other cryptosystems.

Frankly, given that you claim to be someone who has done serious crypto work, I'm concerned when you say that "we have some reasons to suspect that factoring may not be NP hard, using it correctly is fraught with pitfalls", because this indicates some serious misconceptions. First, it isn't a mere suspicion that factoring is not NP-hard; we're very confident of it. If factoring were NP-hard, then a whole host of current conjectures only slightly stronger than P != NP would have to be false: since factoring is in NP intersect co-NP, NP-hardness of factoring would give us NP = co-NP, and the polynomial hierarchy would collapse. Moreover, since factoring is in BQP by Shor's algorithm, we'd also have NP contained in BQP, which we're pretty confident doesn't happen.
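The "factoring is in NP (and co-NP)" half of that argument is just the observation that the relevant certificates are short and efficiently checkable. A sketch of the NP direction, for the decision problem "is n composite?" (the function name is mine, for illustration):

```python
def has_nontrivial_divisor(n, d):
    """Verify a certificate for 'n is composite': d is a nontrivial
    divisor of n. The check runs in polynomial time, which is what
    puts the problem in NP; membership in co-NP follows similarly
    from efficiently checkable primality certificates."""
    return 1 < d < n and n % d == 0

print(has_nontrivial_divisor(91, 7))   # True  (91 = 7 * 13)
print(has_nontrivial_divisor(97, 7))   # False (97 is prime)
```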

But there's a more serious failure here, which is that pretty much no major cryptographic system today relies on an NP-hard problem, and relying on one is not by itself a guarantee of security. For example, the Merkle-Hellman knapsack cryptosystem was based on a problem known to be NP-hard, and it was broken. Similarly, NTRU has a closely related NP-hard problem but isn't actually known to be equivalent to it.

There's also another serious failure: relying on an NP-hard problem isn't nearly as important as relying on a problem that is hard *for a random instance*. It isn't at all hard to construct an NP-complete problem where the vast majority of instances are trivial; in fact, most standard NP-complete problems are easy for random instances under most reasonable distributions. 3-SAT is a good example: while there are distributions that seem to produce hard instances with high probability, naive or simple distributions don't.

I do agree that RSA is not ideal in some respects, especially computational efficiency. But the idea that RSA is "broken" is simply not accurate. And criticizing it as old misses that age is one of its major selling points: the older an encryption system is, the more eyes have looked at it. In contrast, far fewer people have scrutinized elliptic curve cryptosystems. Moreover, the one unambiguous way RSA actually is broken (vulnerability to quantum attacks) applies just as well to ECC.

I suspect that some of our disagreement may stem from the fact that many of the terms we have been using have not been well-quantified, so the degree of actual disagreement may be smaller than we are estimating.
