The truism about Artificial Intelligence is that once a cutting-edge problem in AI gets solved, the masses just redefine it as a "computer science algorithm". Image recognition was once the leading edge of AI. It's still AI, just not leading edge anymore (unless you're doing something completely novel, like doing it on a quantum computer). Intelligence *is* pattern recognition, of which image recognition is one type.
15 or 20 years ago, I was saying that because quantum computers perform multiple calculations on similar inputs simultaneously, they'd be perfect for the sorts of pattern-recognition tasks needed for (artificial) intelligence. And now these smart people have figured out how to do it for the first time, albeit with a minuscule 4-qubit quantum computer.
But since quantum computing capabilities scale according to 2^n, where n is the number of qubits, a 24 qubit computer (i.e. 6 times the size of what they just built, requiring a molecule with 24 atoms) would be 2^20 = 1 million times as powerful as this 4 qubit computer just demonstrated. A 64 qubit computer would be 10^18 = 1 million million million times as powerful as this 4 qubit computer. Good-bye conventional computer encryption. And hello general-purpose pattern-recognition (i.e. the basis for strong artificial intelligence).
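The 2^n scaling argument above can be sketched in a few lines. This is just a state-space counting toy, not a claim about any particular quantum algorithm; actual quantum speedups depend heavily on the problem, and the `relative_power` function is my own illustrative name, not anything from the article.

```python
# Toy sketch of the 2^n state-space scaling argument: each added qubit
# doubles the size of the state space the machine can hold in superposition.
def relative_power(n_qubits, baseline=4):
    """State-space size relative to a baseline machine (here, 4 qubits)."""
    return 2 ** (n_qubits - baseline)

print(relative_power(24))  # 1048576, i.e. 2^20, about 1 million
print(relative_power(64))  # 2^60, about 1.15 x 10^18
```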
My first thought was that a vat of "carbon-13-iodotrifluoroethylene" isn't exactly a general-purpose computing device -- except that because its control inputs are a stream of radio-frequency pulses driven by a conventional computer, it actually is a general-purpose computer. And though I'm no quantum physicist / quantum computer scientist, it seems like it would scale reasonably easily: you just need to find larger organic molecules with similarly discrete nuclear magnetic resonance 'channels' (i.e. independently manipulable/separable by frequency).
I am beginning to sense the coming Kurzweil Singularity...
While it might do a good job at absorbing, it must still emit blackbody photons.
True, but almost nothing at 300K, and almost none of that in the *near* infra-red.
Examples of human-competitive results using genetic programming (i.e. the algorithm we refer to as Evolution):
And their strict Two Fat Pipe policy means only skinny pipes left for content delivery.
I've never seen a post with 50% of its words spelled incorrectly. Unless it's in French? -- in which case, I guess your keyboard doesn't support accents.
Not a grammar nazi. Just couldn't resist on this one.
Sure, it may work exactly as hyped. But that doesn't matter.
Why would I want my hosting coupled to the framework I'm using, and to a particular database as well? If their hosting sucks, or if they raise their rates, I'm stuck.
Upon further reading (http://en.wikipedia.org/wiki/Centrality), methods that use an attenuation factor like I described are called Eigenvector Centrality, which Katz and PageRank are specific implementations of.
I would love to see that set of rankings.
I guess I have!
This article and open rankings work is great, but...
The default ranking we show you is by harmonic centrality. If you want, you can find its definition in Wikipedia. But we can explain it easily.
Suppose your site is example.com. Your score by harmonic centrality is, as a start, the number of sites with a link towards example.com. They are called sites at distance one. Say, there are 50 such sites: your score is now 50.
There will also be sites with a link towards sites that have a link towards example.com, but they are not at distance one. They are called sites at distance two. Say, there are 80 such sites: they are not as important as before—we will give them just half a point. So you get 40 more points and your score is now 90.
We can go on: there will also be sites with a link towards sites that have a link towards sites that have a link towards example.com (!), but they are not at distance one or two. They are called sites at distance three. Say, there are 100 such sites: as you can guess, we will give them just one third of a point. So you get 33.333 more points and your score is now 123.333.
Incoming links with degree one should be allocated 1 point. *yep*
Incoming links with degree two should be allocated half of 1 point = 0.5 points. *yep*
Incoming links with degree three should be allocated half of 0.5 points = 0.25 points. *NOPE* It actually gets allocated 0.33 points.
This means degree ten links still get 0.1 point? 10 hops away and they're still showing up significantly? That measure is broken. 10 hops away should score vanishingly small... 0.5^(10-1) ≈ 0.002 points is much more reasonable.
The measure shouldn't be 1/n (harmonic centrality), it should be 0.5^(n-1). I would love to see that set of rankings.
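The difference between the two attenuation schemes is easy to tabulate. A quick sketch (the function names are mine, just for illustration):

```python
# Compare the two attenuation schemes by link distance n:
# harmonic centrality's 1/n vs. the proposed exponential 0.5^(n-1).
def harmonic_weight(n):
    return 1 / n

def exponential_weight(n):
    return 0.5 ** (n - 1)

for n in (1, 2, 3, 10):
    print(f"distance {n:2d}: 1/n = {harmonic_weight(n):.4f}, "
          f"0.5^(n-1) = {exponential_weight(n):.4f}")
# At distance 10 the harmonic weight is still 0.1, while the
# exponential weight has fallen to about 0.002.
```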
That's a good thing to consider, but in general the natural gas distribution system continues to work just fine during a power outage. There's enough stored pressure in the system to last a while (I'm not sure exactly how long). And the compressors in the system are either powered by natural gas themselves, or they, too, have natural gas generators as electricity backups.
I just went through the same power failure, and came to a different conclusion:
Install a natural gas generator with an automatic switchover when the power goes out. The cost wouldn't be too different, I think, but this way you wouldn't even notice a power outage.
Someone was showing off, but didn't bother to confirm the spelling/punctuation.
The point of science ought to be to train you to think inductively
She inflicted COBOL on the world. She is the antichrist. Were she hired today, she would do similarly nefarious work.
Beware: Anyone supporting her work must also be one of her dark minions... Has Slashdot truly gone over to the dark side...?
As a mechanical engineer, I have only ever needed integral calculus outside of school work (including tutoring) three times:
1. With a friend, for fun, to win a bet. Yay, free beer!
2. To answer a particular question for work. Yay, happy boss!
3. Just now, for fun, to determine the required specific strength for a cable hanging down from geostationary orbit (i.e. a space elevator cable) to support its own weight. Yay, Science!
Calculated minimum required specific strength (strength-to-density ratio) for a space elevator cable: 4.9x10^7 N*m/kg.
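The figure above can be checked by integrating the net (gravity minus centrifugal) acceleration from Earth's surface to geostationary radius; the integral has a simple closed form. A sketch, using standard values for the physical constants (my own calculation, not the poster's worked derivation):

```python
# Minimum specific strength (J/kg = N*m/kg) for an untapered space-elevator
# cable: the closed-form integral of (GM/r^2 - omega^2 * r) dr
# from Earth's surface to geostationary radius.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6        # mean Earth radius, m
R_GEO = 4.2164e7         # geostationary orbital radius, m
OMEGA = 7.2921159e-5     # Earth's sidereal rotation rate, rad/s

specific_strength = (GM * (1 / R_EARTH - 1 / R_GEO)
                     - OMEGA**2 * (R_GEO**2 - R_EARTH**2) / 2)
print(f"{specific_strength:.2e} N*m/kg")  # ~4.85e+07, i.e. about 4.9x10^7
```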
This jibes with the 10x10^7 N*m/kg figure quoted on http://en.wikipedia.org/wiki/Space_elevator (referencing: Edwards, Bradley Carl. The NIAC Space Elevator Program. NASA Institute for Advanced Concepts). It would make perfect sense that he is assuming a safety factor of 2 (a 100% safety margin).
So, assuming that the nano-scale cross-linking issues mentioned previously in this thread do not reduce the tensile strength too much, and assuming we're okay with a safety factor of only 1.5 (a 50% safety margin), then we're finally in the ballpark with Carbyne having a specific strength of about 7.5x10^7 N*m/kg.
We have the material; we can build it. So it's no longer a question of whether the physics can work, but rather a question of the political and business will to put in the engineering work to make this a reality.
Very, very cool.