Comment Re:StarCraft 3 confirmed! (Score 1) 73
Do these not count?
Banjo-Kazooie: Nuts & Bolts (Xbox 360)
Perfect Dark Zero (Xbox 360)
Killer Instinct (Xbox One)
Battletoads (Xbox One)
The complexity might be 2^-1000*2^n operations for a specific implementation. Then it takes 1 operation for n=1000 and 2 operations for n=1001. (Maybe the processor implements the problem up to n=1000 in hardware and for bigger n some work needs to be done in software.) This is an extreme example, of course, but it is within the definition.
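To make the arithmetic in that hypothetical concrete (this is just an illustration of the quoted example, not a claim about any real implementation):

```python
# Hypothetical operation count from the example above:
# f(n) = 2^-1000 * 2^n, which equals 2^(n - 1000) for n >= 1000.
def operations(n):
    return 2 ** (n - 1000)

print(operations(1000))  # 1 operation
print(operations(1001))  # 2 operations
```

So the function really is O(2^n), even though the counts look tiny near n=1000.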
and somewhat on this (it's correct, but I think you're missing the point of what this means):
The statement that a problem is O(2^n) means that there exist algorithms for the problem whose speed grows asymptotically no faster than 2^n for large n, so for each of those implementations, the maximum amount of time that implementation takes to run on a specific input size can be written as a function f(n)...
From the Wikipedia link you specified:
Although developed as a part of pure mathematics, this notation is now frequently also used in computational complexity theory to describe an algorithm's usage of computational resources: the worst case or average case running time or memory usage of an algorithm is often expressed as a function of the length of its input using big O notation. This allows algorithm designers to predict the behavior of their algorithms and to determine which of multiple algorithms to use, in a way that is independent of computer architecture or clock rate. Big O notation is also used in many other fields to provide similar estimates.
An algorithm with O(f(n)) computational complexity requires on the order of f(n) operations to complete for input size n in the worst/average case (it could require far fewer in the best case, but we don't care about the best case, in general, or in this discussion in particular).
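A toy sketch of what that operation counting looks like, using linear search as an example (my example, not anyone's above): it is O(n) in the worst/average case, but the best case finishes after a single operation.

```python
# Count "operations" (here, one comparison per loop step) in linear search.
def linear_search(items, target):
    ops = 0
    for i, x in enumerate(items):
        ops += 1               # one comparison = one "operation"
        if x == target:
            return i, ops
    return -1, ops

data = list(range(100))
print(linear_search(data, 0))   # best case:  (0, 1)   -- a single operation
print(linear_search(data, 99))  # worst case: (99, 100) -- n operations
```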
A problem that is O(f(n)) has no known algorithmic solution that has computational complexity better than O(f(n)).
An "operation" in the Big-O sense refers to the logical/mathematical concept of a fundamental unit of computation (think Turing machine). Whether an operation is implemented wholly in hardware or software (or half-and-half, or whatever) is completely irrelevant to a discussion about its computational complexity. An operation is simply a step of execution that must be performed by an algorithm, and Big-O tells you, very approximately, how many of them the algorithm will take to arrive at a solution (on average, or in the worst case).
Anyway, I think my point - that computational complexity kills Kurzweil's singularity speculation - stands, but this topic has long since faded from front-page discussion, so I'll let it drop.
D.
That doesn't sound right.
From your example, a computer that can run an O(2^n) algorithm on inputs up to n=1000 in one second or less must be able to execute at least 2^1000 operations per second! You couldn't do that with a computer composed of one processor for each atom in the universe, all running in parallel.
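A quick back-of-the-envelope check of that claim (the figures are rough, commonly cited estimates, not exact values):

```python
# Could a universe-sized parallel computer do 2^1000 operations per second?
atoms_in_universe = 10 ** 80      # rough estimate for the observable universe
ops_per_atom_per_sec = 10 ** 12   # generously fast per-"processor" speed
total_ops_per_sec = atoms_in_universe * ops_per_atom_per_sec  # 10^92

required = 2 ** 1000              # roughly 10^301
print(required > total_ops_per_sec)  # True -- short by ~209 orders of magnitude
```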
Also, n is the size of the input to the algorithm. It has nothing to do with the implementation. I think the value you are thinking of is k, the constant factor that represents the computational overhead of the algorithm, and that is certainly implementation-dependent. Big-O says nothing about k.
Lastly, you're right, there are good approximations for many hard problems. However, there are many hard problems for which no good approximation can be found (or at least, finding the approximation is itself a hard problem). For those problems, any uber-AI will be no better off than us lowly, normal intelligences.
D.
The thing that bothers me about Kurzweil's super-intelligence/singularity thing is that it completely ignores computational complexity.
Every problem has a strict lower bound on the number of operations (and/or the amount of memory) required to solve it. Even simple sorting requires that every element to be sorted is looked at at least once, so in the best possible case, sorting takes O(n) time. No amount of super-speed intelligence can reduce that.
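One way to see the flavor of this (a sketch instrumenting Python's built-in sort, not the formal lower-bound argument): any correct comparison sort must make at least n - 1 comparisons, because with fewer than that, some element was never compared against anything and its position can't be known.

```python
# Count the comparisons Python's built-in sort performs on a small list.
comparisons = 0

class Tracked:
    def __init__(self, value):
        self.value = value
    def __lt__(self, other):
        # sorted() orders elements via __lt__, so we can count calls here.
        global comparisons
        comparisons += 1
        return self.value < other.value

data = [Tracked(v) for v in [5, 3, 8, 1, 9, 2, 7]]
sorted(data)
print(comparisons >= len(data) - 1)  # True: even a tuned sort can't go below n - 1
```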
Further, difficult problems stay difficult regardless of how much processing power you throw at them. Take this example from the Wikipedia page:
[C]onsider a problem that requires O(2^n) operations to solve. For small n, say n = 100, and assuming for the sake of example that the computer does 10^12 operations each second, a solution would take about 4 * 10^10 years, which is roughly the age of the universe.
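That quoted figure is easy to sanity-check (using a 365-day year; the exact year length doesn't change the conclusion):

```python
# Verify the quoted estimate: 2^100 operations at 10^12 operations/second.
ops = 2 ** 100
ops_per_sec = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365
years = ops / ops_per_sec / seconds_per_year
print(f"{years:.1e}")  # about 4.0e10 years
```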
So, even ignoring all physical constraints, intelligence is computationally limited. No matter how much "smarter" we can make an AI (or whatever) there are problems whose complexity is great enough that any increase in intelligence quickly becomes irrelevant.
D.
ps. Apologies for any inaccuracies in this post. It's been quite a while since I took this stuff in university. If anyone more knowledgeable in complexity theory cares to correct me, I'd appreciate it.
In addition, Hasbro has delivered to Facebook, which hosts the Scrabulous game, a notification of copyright infringement under the Digital Millennium Copyright Act (or the "DMCA") requesting that they remove the Scrabulous application in the U.S. and Canada as soon as possible.
This burns me up in so many ways. >:(
Not only is this a trademark case rather than a copyright one (as far as I can tell; no lawyer here), but as a Canadian, it disgusts me that they're using the DMCA to dictate what can or cannot be published in Canada.
We're having a tough enough time keeping crap legislation like that out of our country without amoral corporations smearing it across the border.
Forty two.