Technology

The Future of Computers 62

GrokSoup writes: "Great collection of semiconductor where-to-from-here articles in this month's MIT Technology Review. There are articles about molecular computing, quantum computing, DNA computers, and on and on. Fascinating stuff, all pointing to why the current semiconductor hegemony is by no means a "forever thing", as the kids like to say. "
This discussion has been archived. No new comments can be posted.

The Future of Computers

Comments Filter:
  • I see this hobbiton.org link is a bad thing. It auto-posts itself. Don't click.

  • Don't click on the above link. It'll post a message as you to this forum.

  • I seem to remember the Soviets tried to build a non-binary computer. I think it used three states, but it was so complex that it was never mass-produced. How is a quantum computer any different?
  • But it's an excellent way to waste moderation points...
  • Moderate this up or others are going to be click click clicking away :/
  • PC will die..[blah blah]..Wearable computers..[meaningless drivel]..DNA, nanotechnology and quantum computers..[within 5000 years]..mobile computing..[blah blah]..e-commerce, e-society [kiss my e-ass]..Moore's law will not hold..[as predicted 15, 10 and 5 years ago]..

    Is anyone else really sick of this crap?

  • "The problem lies in Heisenberg's Uncertainty Principle, which states that you cannot know the position and the velocity of a particle at the same time...

    Almost: Heisenberg's Principle says you cannot know position and velocity both to within a certain amount of exactitude.

    "...looking at the position leaves you with a 'blur of possibilities' for the velocity and vice versa."

    Sort of. Measuring the position of a particle more and more precisely makes the velocity less and less certain (and vice versa). So it's not the looking at the position/velocity, it's the accurate measurement thereof.
    --
    Have Exchange users? Want to run Linux? Can't afford OpenMail?
  • This argument comes up again and again, almost on an annual basis. However, there's a well-known quote that I think applies here:

    (Paraphrased) "640k should be enough for anyone..."

    The point is, there is always something else that's just around the corner. When Bill said the above quote, he hadn't envisioned the widespread adoption of the GUI, 3D first-person games, digital media (DVDs), or the World Wide Web. And just as Bill couldn't envision these things, neither can we envision what could be the Next Big Thing.

    No matter how fast you think your computer is now, someone, somewhere, is busy writing a piece of software that will make your computer seem like a Sinclair ZX80 two years down the line.
  • Click the link all you like. Just make sure you have disabled JavaScript, as any security-conscious person would.
  • Jesus fucking christ...!
  • As processor power increases, the difference between good and poorly written software is becoming more dramatic. This is a greater issue that's being dwarfed by pissing games over 1GHz+ speeds. For example, twenty years ago it was common for developers to use VAX-level minicomputers to do development for emerging home computers (Atari, Apple II, early PCs). After all, who could write a decent assembler on a 1MHz system with hardly any RAM? But it turned out that native assemblers on teeny tiny systems were frequently outperforming the minicomputers by a factor of ten or more.

    Fast-forwarding to today, consider any C++ compiler, say gcc or Visual C++. You can never have enough processor power for such a compiler. Even on top-of-the-line machines you're still talking minutes to rebuild a medium-sized project, and seconds for an average link. Now fire up Borland's Object Pascal compiler (buried inside a RAD tool called Delphi). On a 200MHz machine the compilation speed is up near half a million lines per minute. Link time is effectively zero. In general, compilation time doesn't exist. You have to have a pretty big project before you even notice that pressing F9 is taking any time at all. If you're using a 300MHz machine or faster, this is never even an issue. It's never an issue for almost any slower machine either, but I'm playing it safe. Now, yes, Object Pascal is simpler than C++. The compiler maybe doesn't do some of the nutty stuff that gcc does. But in the end, does it matter? A compile time of zero sure does make it easy to go in there and twiddle around with the code, making it go real fast.

    This kind of thing is going to be more and more common. Is a bulky application slow because the processor isn't fast enough or because it is bulky? Throwing more processor power at problems like this is a dodge.
  • Am I the only one who knows how to do URL hacking? :)

    If you want to see how the author did it (to see if it's CGI or whatever), then go here:

    http://hobbiton.org/~zk65/ [hobbiton.org]

    There is a tar-zipped file there (which I haven't checked out) containing a do-it-yourself kit for ruining Slashdot discussions.

    The author was also kind enough to leave an e-mail address, so you can express how much you love this piece of work (heh).
  • Sorry about this, folks. Didn't know the link was bad.
  • To me, it seems there is a flaw in the perceptions of those who make these so-called productivity studies of computers in the workplace. They seem to base an expectation of results on the premise that the computer is some magic box that immediately enhances whatever environment it is placed in simply by being there.

    The computer, like any other piece of equipment, is a tool. And the benefits to be gained by use of such a tool are in direct proportion to the understanding of its use passed on to the users through supplemental information and training.

    These studies need to take into account the business practices that were there before the introduction of new technology; the amount of training provided to the end-users as that technology is installed; the level of understanding that those users gain in how to enhance their performance by using the technology; and the change in practices that result from such use.

    More importantly, the businesses themselves need to see this accounting as the steps they have to take to ensure that the introduction of new computers, copiers, printers, networking, etc. actually is integrated according to a plan that will lead to true productivity gains.

  • I'm sorry. The terms three-state(tm) and tri-state(tm) are trademarked. You failed to give appropriate credit for same, thus you are in violation. Please place $1000 in the slot next to the keyboard to avoid further legal action!

    You have 10 seconds before this terminal self-destructs. 10....9....8....
  • I have been reflecting on the direction computers are leading us. I would like to quote an excerpt from Steven Levy's book on Macintosh history, which explains my reflections better than I could:

    "...I sometimes question whether [productivity] is an illusion.
    As it turns out, this question has been bedeviling economists as well. A few years ago Gary Loveman, a professor at MIT [...], attempted to measure the productivity gains that came with the billions of dollars' worth of information technology purchased by American industry. Similar studies measuring the benefits of research and development had conclusively demonstrated that R&D was a solid investment, and that there was no reason to suspect that computer technology would be a different story. But when Loveman ran all the numbers, totaled the investments in information technology and then compared them to the productivity totals of the industries, he was startled, if not astonished by the results. 'There was no positive effect,' he said. 'There may even have been a negative effect.'
    This gap between accepted reality (computers make us more productive) and the quantifiable result (they don't), has come to be known as the Productivity Paradox. A true puzzler: if computers enable us to get so much work done, in a much shorter period of time... why can't we measure it? Where did the productivity go?
    [...] Still, I think the paradox is a useful tool to assess the hours we spend focusing on our tools instead of using them. [...] [Trying to discover the source of my computer troubles] was a process in which I had never engaged back in the bad old days when I toiled on a typewriter. In a certain sense, those days were not bad at all. I never spent a whole morning installing a new ribbon. Nor did I subscribe to 'Remington World' and 'IBM Selectric User'. I did not attend the Smith-Corona Expo twice a year. I did not scan the stores for the proper cables to affix to my typewriter, or purchase books that instructed me how to get more use from my Liquid Paper..."

    This long quote is useful for understanding that we still need to rid ourselves of false needs before asking what the future of computer technology is. Is it really a means to an end, or is it becoming just an end and nothing more?

    Greetings.
  • by Matt2000 ( 29624 ) on Monday June 05, 2000 @06:11AM (#1025655) Homepage

    You seem like the kind of person who would have laughed at the first computers and stuck to his handy mechanical calculator.

    And of course having not read the articles, you add a bunch of e-commerce bashing in there for good measure.

    Nice work.

    Hotnutz.com [hotnutz.com] - Funny
  • You make a good point, that it is easier (read cheaper) for manufacturers to continue to update old tech than to run with new tech (largely due to costs of fabs, lack of expertise, etc). But crisis brings about change, and when we run out of creative ways to accelerate silicon, something else will have to step into the void. This seeming "crystal ball" article is more relevant than it may appear.

    -L
  • "Suppose we can get that(RAM memory storage time) up to several years," says Reed. "It would essentially be nonvolatile memory. Imagine how many times you wouldn't have to boot up Windows."

    Millions of times probably

  • I have a physics degree, and while I don't remember too much, I do remember this:

    Heisenberg's Uncertainty Principle refers to _momentum_ and position, and states that the product of the uncertainty in the momentum and position has a minimum value ( delta P * delta X >= h, where h is Planck's constant, P is a standard physics variable for momentum (no idea why), and X is the position). If you assume the mass is constant (disregarding Einstein's Special Relativity), then the momentum is proportional to the velocity, and what you said about velocity and position is roughly true.

    There are several other pairs of physical properties which are related in this way; energy and time, angular momentum and angular position, etc. I'm not sure whether HUP is used to refer to these other property pairs, or only the momentum/position pair.

    I'm sure I'm one of the few who care about such fine points, but I hate to leave misinformation out there unchallenged. I'm sure my fellow anal-retentives will thank me ;)
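    For reference, the two pairs mentioned above in the usual textbook notation (the modern statement puts hbar/2 on the right-hand side rather than h, but the idea is the same); this is just a standard-physics aside, not something from the linked articles:

        \Delta x \, \Delta p \ge \frac{\hbar}{2}, \qquad \Delta E \, \Delta t \ge \frac{\hbar}{2}, \qquad \hbar \equiv \frac{h}{2\pi}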
  • Although it's mostly a fictional novel, 'Timeline' by Michael Crichton has a lot of intriguing stuff and insight about quantum computing and quantum physics; definitely worth reading.
    - - - - - - -
    Oliver Sosinsky
  • Your argument on inflation makes no sense. If what you bought 18 months ago today costs 1/4 what it did then, that is not inflation, it is deflation, since the same number of dollars can now buy more, not less. Nor are you losing economic value, since you can get 4 times as much now as you did then. The fact that there are newer, faster computers that you might buy does not destroy the value of the computer you bought 18 months ago. Finally, since the producer produces and the consumer buys the computer knowing about its eventual obsolescence, there must be economic value to both the creation and purchase of the machine. Economic value is not being lost.

    As for this story, it is a link to interesting articles on future computing technology. Are there economic interests at stake? Certainly, but the bias in these articles is far less than those of Athlon or GeForce reviews, because the economic interests relating to future computing technologies are being greatly discounted because they lie so far into the future. The companies working on these technologies are not trying to attract loads of public investors nor are they yet selling anything. They don't particularly need puff pieces, since their technologies are being analyzed by other physicists, chemists, biologists, and computer scientists. AMD, on the other hand, DOES need puff pieces.

    So the economic interests inherent in this posting are much less than in other postings. But in the final analysis, I would question how much that even matters. CmdrTaco cannot do an in-depth review of every site from which he gets articles, and even if he could, he couldn't do an in-depth review/analysis of every journalist's conscious and subconscious motives. And EVEN if CmdrTaco could do that, who would we get to vet CmdrTaco?

    Sometimes you have to just have some trust, read the articles, and shut up.
  • IANA{applicable expertise}

    The original statement seems to confuse, or rather fuse, Heisenberg's Uncertainty Principle (as you point out) and Schroedinger's Cat experiment.

    In the cat experiment, a cat was placed in a sealed box, with a poisonous device (cyanide vial, IIRC). The device was trippable by a collision with a particle, which could either occur, or not. Actually the device would be tripped if a particle existed in the box or not - same net effect, different hypothesis.

    Anyway, without opening the box, we didn't know if the particle appeared and collided, or not - so the cat was simultaneously alive and dead. The point of the experiment (IIRC again, it's been years) was to show that a particle could either exist or not, and we couldn't tell, except by observing the consequences of the probability - or something like that. :)
  • In 2001, HAL already had it.
  • I think what all of these e-business people need is an e-nema.
  • Is anyone else really sick of this crap?

    I know I am e-sick of all this e-crap. It has been my experience that people are generally more comfortable with a clear separation between computers and everything else; e.g., if you want to use the computer, you go to your PC. Wearable computers (calling Dick Tracy?) may eventually take off, but we are a long way from that. Until then, I guess we have to put up with all the unnecessary wearable digital device hype and all of the abuse of the prefix "e-". Yick!

  • All this new technology.. This "MIT" sounds like a really great company.
    If their stock isn't too expensive, I'm gonna get a couple thousand shares tomorrow...

  • I'm sorry, I took the story at face value. Another web site has posted 4-5 articles that, individually, would very likely have attracted the attention of the /. community.

    As the story submitter provided a story listing all of the articles, URLs to each of them independently, and also a URL to the top-level site, there didn't seem to be any need for editorial comment. The point of including this on the /. front page is that those articles are of interest to /. readers, and so here are the links; go find them.

    The economic ramifications of the technologies discussed in the articles are of potential interest, but are certainly not the prime motivation for mentioning the articles in the first place - instead it's more of a "hey, look at these - stuff that's cool".

    ~ced
  • A quantum computer works by trialling a superposition of wavefunctions or eigenfunctions (solutions of a matrix problem) as solutions of a problem (matrix operation). The resultant eigenvalues (conditions that show that the matrix problem is solved) are measured as observable results of an experiment (with probabilities gathered over a large number of experiments), and thus the eigenfunctions (solutions) are found.

    Because every eigenfunction (of an infinite set) can be tested at once over a certain number of trials, if the set of eigenfunctions (possible solutions) is infinite or very large but the set of eigenvalues (actual solutions) is finite and smaller, the problem is reduced from an infinite to a finite, or large to smaller, problem and hence can be solved a lot quicker.
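    A very loose classical sketch of the "run many trials, read off which eigenvalues show up" idea described above, in Python with numpy. This only simulates the measurement statistics (no quantum speedup, obviously), and the observable and amplitudes are made up purely for illustration:

        import numpy as np

        rng = np.random.default_rng(0)

        # A made-up observable: diagonal, so its eigenvalues are the diagonal entries
        # and its eigenfunctions are the standard basis vectors.
        eigenvalues = np.array([1.0, 2.0, 3.0, 4.0])

        # A made-up superposition over those four eigenfunctions, normalized.
        amplitudes = np.array([0.0, 0.6, 0.0, 0.8], dtype=complex)
        amplitudes /= np.linalg.norm(amplitudes)

        # Born rule: the probability of observing eigenvalue k is |amplitude_k|^2.
        probabilities = np.abs(amplitudes) ** 2

        # Repeat the "experiment" many times and see which eigenvalues ever appear.
        outcomes = rng.choice(eigenvalues, size=10_000, p=probabilities)
        print("eigenvalues observed over 10000 trials:", sorted(set(outcomes)))  # only 2.0 and 4.0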
  • That's misleading. Every particle has a single state, described by its wave function. It's just that the wave function doesn't necessarily correspond to a definite value of an observable quantity.

    When you aren't looking at it, the state evolves deterministically according to the Schroedinger equation (non-relativistically) or Dirac's equation (if you take special relativity into account). For any given measurement you might make, most wave functions don't correspond to a definite value of whatever you are trying to measure: instead the wave function is a sum of states, each corresponding to states which do have definite values of the measurement. Given two observables, the wave function can be decomposed as a sum of states with definite values for either observable---but in general not both at the same time.

    To expand on that last bit, many measurements are incommensurable. (i.e. the states that have a definite value for one measurement don't correspond to states that have a definite value of the other---this is the case with position and momentum in Heisenberg's uncertainty principle). Although the wave function evolves deterministically when you aren't looking, when you do look, it randomly collapses into a wave function that has a definite value of whatever you are trying to measure (i.e. into one of the component states for the observable you are looking at). The probability of collapse into a particular state depends on the extent to which that state was a component in the original wave function---dominant components are more likely. This, in a nutshell, is a formal description of how observations affect the state of the system.

    Disclaimer: I'm a mathematician and not a physicist. This is my best attempt at explaining Von Neumann's approach to QM as I understand it, but it's simplified and I think physicists prefer different language. If I've messed up, I'm sure any physicists here will correct me.
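    If it helps to see the "definite value for one observable but not the other" point and the collapse rule in code, here is a tiny numpy sketch using the standard Pauli z and x matrices as the two incompatible observables (the choice of matrices and the by-hand Born rule are just for illustration):

        import numpy as np

        rng = np.random.default_rng(1)

        # Two incompatible observables: the Pauli z and x matrices.
        sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
        sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)

        # Start in an eigenstate of sigma_z, i.e. a definite value (+1) for z.
        state = np.array([1, 0], dtype=complex)

        # Decompose that state in the eigenbasis of sigma_x: the weights are 50/50,
        # so this state has no definite x value.
        x_vals, x_vecs = np.linalg.eigh(sigma_x)
        components = x_vecs.conj().T @ state
        print("weights in the sigma_x basis:", np.abs(components) ** 2)

        # "Measure" sigma_x: collapse into one x eigenstate with Born-rule probability.
        k = rng.choice(len(x_vals), p=np.abs(components) ** 2)
        state = x_vecs[:, k]
        print("measured sigma_x =", x_vals[k].real, "; post-measurement state:", state)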

  • Well I moderated down as many wow.cgi links as I could, but there are still quite a few left.

    Anyone else care to join in the fun?
  • I forgot to point out that my former post carefully removed all my moderations, so in effect..I just wasted a few seconds of your time in reading that...and this.

    /me bows

    I also do children's parties.
  • Does anyone have any predictions as to when this is going to become available for consumer use?

    --
  • Whaddaya talkin' about? Everybody knows there's going to be a big economic crash (as Microsoft has predicted to be the result of their negative judgement) and that computers will be a prized possession of just a Select Few. We will be fortunate to get just the basics... food, water, maybe a comfortable old mattress under the bridge to sleep on. That is the untimely devolution of the digital age, men - start storing up your canned goods now :-)

  • I've been reading a lot of articles about great new technology (in storage, or computing power, whatever)

    but you know what?
    I'm still using a magnetic, motor-driven drive and a silicon printed chip in my computer. The same technology that I had in my XT and 286 quite a number of years ago.
    I'm not saying these things will never materialise, but a lot of great new products never make it to the marketplace due to all sorts of reasons (money problems, stupid law problems, etc.), and those that do require a lot of time to get from the testing lab to a consumer product.

    This is cool technology, but is there anything that's a little more here and now?

    --------

  • IANMAP, but quantum physics says that a particle can have many states at once, but when you look at it, it 'picks' or falls into only one state. So the quote seems backwards to me.
  • Interesting hole in Slash it seems. Anyone emailed Rob et al. about this yet? Or does anyone even know how it works?
  • by new500 ( 128819 ) on Monday June 05, 2000 @04:09AM (#1025676) Journal

    I am trying to understand this. Is this story a review of techreview.com, or an endorsement, or even an admission along the lines of "aha! here are some guys talking about some really interesting stuff, you should go and look"?

    My point is, no disrespect to 'Taco, that this story looks to be the latter, and without any additional commentary or question or review of the linked content I think this is a poor and unstructured way to create a discussion.

    I mean, I now might have to read a whole other site - all of it - just to see what might be "on topic" or not :-)

    In connection with "The End of Moore's Law" I want to ask just how much economic value is lost in a system where what you paid for 18 months ago is now worth one quarter (25%) of what you paid, according to the simple interpretation of that dynamic?

    I mean, if your salary of 18 months ago were being lost to inflation at that rate, and the effect were spreading throughout the economy, we'd all be in Brazil during the '70s.

    For the record, Brazil inflated itself to growth on the back of vast loans, much of them originating from Citibank, all of which collapsed in '82, leading to a massive worldwide liquidity crisis.

    The net effect of this was deliberately lowered interest rates post-Volcker, which contributed to the junk bond ('80s) and later equity ('90s) booms we have become used to.

    As the effect of computer obsolescence, and the consequent demand for capital and working capital (usually loans), permeates our economy ever more deeply (is this maybe one argument against the PC-vs-big-iron trend promoters?), this dependency becomes more acute.

    What I am saying here is that there may be *economic* reasons that bring technological phenomena to a halt, and that a purely "geek" approach to the issues may yield results that diverge from what is actually happening.

    Back to my original "complaint": I really think it would be nice if submitted stories came with something at least resembling an editorial viewpoint. Maybe /. is the only place where you can proverbially print your paper and leave the inside pages blank "for readers' notes". Maybe the editors here want to avoid inferred advocacy.

    But surely the editorial point is: "We all have a (personal, financial) interest in the economics and technologies developed under recent - very interesting and not necessarily fully understood - conditions. Is this set of linked articles a sign of things to come, or an indicator of impending (local systems) collapse?"

  • There is a link below that says "this is more informative"

    Do not click it as it will auto-post itself to this page and it is very annoying. I see like 20 posts already.

    I only put this up here because it is a prominent place on the page and hopefully you will see this before you click on the link.

  • DO NOT CLICK THE "THIS IS MORE INFORMATIVE LINK"!

    It's a link that sends a message from yourself to this page.....

    Hmmm...I heard of this a while back, and thought it would have been fixed by now...

    dylan_-


    --

  • Okay everyone - I think we might have to avoid anything linking to

    http://hobbiton.org/~zk65/wow.cgi?nyAHVvq3yc right now.

    Something tells me that this is another one of those auto-posting tricks designed to flood the forum.

    Like the fact my browser tells me that's a CGI link, maybe...

    This happened a while back and was pretty sad - except for giving us an interesting survey on browser/OS usage to read ;^)

    I hope this doesn't get out of hand - or no one will read anything interesting today :(
  • In the future, high-tech computers will allow mankind to actually visit the moon! And there will be a byootimous princess there with hot grits and hot tits! Her name will be N. Portman. Hot grits hot tits Hot grits hot tits Gimme some o' dem Naughtier bits.
  • About how far back was that? I'm bored and in desperate need of reading material.

    Besides, I think it'd be kinda interesting to look at.

    --
  • I don't think we should be moderated down simply for clicking on a link!!!

    What gives?

    --
  • Actually, there should be a pretty easy fix that one of the originators (G27) of this hack suggested... make it a necessity that people preview their postings.
    --
  • No, you have this wrong...

    but quantum physics says that a particle can have many states at once, but when you look at it, it 'picks' or falls into only one state.

    It actually acts quite a bit differently. The act of checking its position (or velocity) involves enough energy to totally obliterate the other measurement. If you get the exact position, the error for velocity will be infinity; if you get the exact velocity, the error for position will be infinity.

    The same thing happens with women when you ask them what they want or how they feel. The trick isn't to ask, but to kind of observe out of the corner of your eye. :)
    --
  • Moderating the worm-generated post down reduces the possibility that other folks will subsequently click on the link. Just like real Karma, slashdot Karma sometimes has nothing to do with good intentions... -carl
  • or an e-re-education.
    --
  • Seriously, what will the public do with all the power given to them by molecular, quantum and dna computers? I can just see it now, little kids breaking encryption algorithms in only a matter of seconds just for shits and giggles.
  • by CausticPuppy ( 82139 ) on Monday June 05, 2000 @10:28AM (#1025688)
    First, IWAPMBHFMN (I was a physics major but have forgotten much now)

    ... in which an atomic nucleus can be spinning clockwise and counterclockwise at the same time

    This is really a misnomer. It's a misinterpretation of intrinsic spin. Each particle in the nucleus possesses it, but the physical spinning (clockwise/counterclockwise) of the nucleus itself is not a quantum state. If it's a hydrogen atom, the nucleus is just a proton, so you can't really define a physical spin (in the rotational sense) although it will have an intrinsic spin. I think you can have an effective spin for a multiparticle nucleus, by summing up all the spins for the constituent particles, but I just made that up and it could be wrong. Labelling this peculiar quantity "spin" doesn't actually mean anything is spinning; it simply arises from the fact that particles appear to have angular momentum.

    It is a bizarre world in which matter itself dissolves into a ghostly blur of possibilities as soon as you try to look at it.

    Er... the Heisenberg Uncertainty Principle says that the error in position times the error in momentum must always be greater than some minimum value. This means that the more precisely you measure ("look at") one quantity, the less precision you have in the other quantity at the same time. By that equation, if the error in one quantity approaches zero (meaning you're measuring with near absolute precision), the error in the other quantity approaches infinity.

    This leads to the old Quantum Physics homework excuse: "Professor, I calculated the momentum in problem #3 so precisely that now I have no idea where in the universe my homework is!"
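    In symbols, the limiting behaviour described above (with the conjugate pair taken as position and momentum, which is the standard one) is just the following - again, a textbook aside rather than anything from the article:

        \Delta x \, \Delta p \ge \frac{\hbar}{2} \quad\Longrightarrow\quad \Delta p \ge \frac{\hbar}{2\,\Delta x} \longrightarrow \infty \ \text{ as } \ \Delta x \to 0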
  • More important than owning the technology is owning the people who created the technology. Funding may be given to increase the general brain trust, and then those people offered better jobs "in the private sector."

    Just a thought. Not a good one, but still a thought.
  • Sydney Weidman is right and MIT is wrong.

    Quantum mechanics describes the behavior of particles exclusively in terms of a "psi function". The square of the psi function (imagine something like ψ²(x,y,z,t)) is the probability density for finding a particle at a certain place at a certain time. It's referred to as the psi squared function.

    But the theory only provides the probability. Obviously, when you look at the thing, you figure out where it is.

    There are various interpretations of this. One is weirder than the next - my favorite is the idea that a new parallel universe is created for each possible observation, and each universe is identical except that the observer saw something else. But this is regarded as a little uneconomical, as you end up with a LOT of inherently unobservable universes (whatever a universe is). I swear I'm not inventing this.

    But anyway, once you've seen the thing, the probability that it is where it is is one, and the probability that it is where it ain't is zero, so your psi squared function "collapses" to a boring function that is zero everywhere but one point, where it's one.

    Please note that changing the psi squared function (obviously) changes the psi function too, and (as I already said) that quantum mechanics describes particles exclusively in terms of their psi functions. This means that every observation inevitably changes the particle itself.

    If you can figure out exactly what makes the psi function collapse you get a Nobel Prize. Unless of course you're a grad student - then your professor does.

    The Heisenberg Uncertainty Principle is a related but different idea.
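    For the notation-inclined, the "psi squared" rule above in symbols (standard textbook stuff, not specific to these articles): the probability of finding the particle in a small volume d^3x around position x at time t is

        |\psi(\mathbf{x},t)|^2 \, d^3x, \qquad \text{with} \qquad \int |\psi(\mathbf{x},t)|^2 \, d^3x = 1,

    and after a position measurement that finds the particle at x_0, the collapsed density is (idealized) a spike concentrated at x_0.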

  • I wonder how much of this will be stifled and slowed down by specious litigation.
  • 'The Fabric of Reality' by David Deutsch. It doesn't cover up-to-the-moment technology, naturally, but it provides an ample explanation of how they work and a great deal of food for thought on their potential. It's well written too, but because of the concepts it tackles it will take most readers quite a while (well, it took me quite a while anyway).

  • ...at the same time, computers have reached a point where they can do everything we need.

    While my old 80 MHz PowerMac lacks in voice recognition (it does have it, though) it makes a fine webserver and mailserver. I can still use it to browse the web and write papers. My roommate still uses a 200 MHz Pentium with Win98 and some newer games with no complaints.

    One of the articles discusses that Moore's law is finally coming to an end. This might not be because it is harder and harder to make computers faster and faster, as the article states. It may be because computers are finally doing everything we need them to. I know that having a smarter, better interaction with the computer will drive the industry for faster machines, but the proliferation of PDAs shows that we've passed the critical point: we have enough computing power and don't need much more. I mean that computers are powerful enough now that we are focusing on making them smaller and smaller.

    Hell, today's PDA can wip me at chess. That's as powerful as I need!

    --
  • As computers get faster and faster, thanks to new developments like this, the question becomes one of interaction with the computer - many devices that are novelties now (like voice recognition) will be not only feasible, but standard on systems running at that speed. Kinda sheds a new light on the "Equipment Faster than Operator" error we used to joke about at work...

  • Particularly interesting to me is the potential of Quantum computing. This article reiterates some earlier research I have read (IANA physicist) pushing the notion that quantum computers will be able to perform practically limitless operations simultaneously based on the size of the qubit array.

    This exacerbates one existing problem, and creates another one. The existing problem is the current software and data limitation. We cannot generate software to keep up with our hardware as is (operating systems in particular always lag a few years compared to hardware), and we cannot make digital all the data that existed in predigital form. It is certain however that as we started to "catch up", the automation of society would rise exponentially.

    The second problem is the economics of such machines. Performance costing would be quite difficult, so software would probably begin to play a much greater role in the economics of the hardware. "Data-friendliness" might actually become a concern over "user-friendliness". Unless of course they were to sell machines by their "qubitness":

    "I have a 64-qubit machine, but I want one of those new 128-qubit jobs!"

    -L
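    A back-of-the-envelope way to see the "qubitness" scaling mentioned above: an n-qubit register is described by 2^n complex amplitudes, which is also why simulating one classically blows up so quickly. A throwaway Python sketch (the memory figure just assumes 16 bytes per complex amplitude):

        # Number of amplitudes (and rough classical memory) for an n-qubit state,
        # assuming 16 bytes per complex amplitude (two 8-byte floats).
        for n in (8, 16, 32, 64, 128):
            amplitudes = 2 ** n
            gib = amplitudes * 16 / 2 ** 30
            print(f"{n:3d} qubits -> {float(amplitudes):.3e} amplitudes, ~{gib:.3e} GiB")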
  • This is a world in which an electron can be in two places at once, in which an atomic nucleus can be spinning clockwise and counterclockwise at the same time. It is a bizarre world in which matter itself dissolves into a ghostly blur of possibilities as soon as you try to look at it.

    I'm not a physics student, but I seem to remember that the possibilities only matter when you're NOT looking at the system. Does anyone else know whether this is mistaken or not?

  • I was floored by the descriptions and application of Quantum Computing in the realm of Crypto Analysis. You have to bet that the promise in this field might bring about the reality even faster than we might think. All it needs is some direction and a ton of funding. I can think of at least one agency that would like to see it sooner than later....

    "I see nuthink... I know nuthink!"
  • ...whip...

    sorry.

    --
  • It's not mistaken, but rather incomplete. The problem lies in Heisenberg's Uncertainty Principle, which states that you cannot know the position and the velocity of a particle at the same time - looking at the position leaves you with a 'blur of possibilities' for the velocity and vice versa. That said, IANAP.

  • phurst post

"Ninety percent of baseball is half mental." -- Yogi Berra

Working...