Science / Technology

Physicist Proposes a New Type of Computing

SpankiMonki writes "Joshua Turner, a physicist at the SLAC National Accelerator Laboratory, has proposed using the orbits of electrons around the nucleus of an atom as a new means to generate the binary states used in computing. Turner calls his idea orbital computing. Turner points to recent discoveries (including a new material that allows rapid switching of its electron states and new low-power terahertz laser technology) that could lead to the development of a computer with vastly improved performance over current technologies."
  • by Anonymous Coward on Wednesday March 12, 2014 @03:10PM (#46467065)

    The catch is that to generate a tight enough pulse of sufficient intensity to do this, you need an accelerator two miles long. But if you manage that, you can switch electron states 10,000 times faster than transistor states can be switched.

    Ok, so it won't be a portable device...

  • by NoImNotNineVolt ( 832851 ) on Wednesday March 12, 2014 @03:12PM (#46467077) Homepage
    Here comes the singularity!

    Disclaimer: posted in jest to rile up all the Kurzweil haters. Where's your "hit the limit of silicon" argument now, huh? :P
    • by Anonymous Coward

      Ray will be right eventually, but he is off on his time scales by a wide margin. For one thing, in his estimates he adheres to the transistor = neuron fallacy. He then builds on this fallacy to estimate a time when the number of transistors on a chip will equal the number of neurons in the human brain. We are already at hundreds of millions of transistors on our chips!! And the human brain only has about 20 billion neurons!! We aren't that far away !!! [HEAVY BREATHING]

      The whole problem with this is that in r

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        For one thing, in his estimates he adheres to the transistor = neuron fallacy.

        To be fair, a digitally-switching transistor is almost infinitely simpler than a neuron, but you could make the argument that a transistor configured in analog mode that summed several inputs and acted as a decision maker is much closer to a neuron. The trick is getting all of those transistors working together in some sort of "analog computer" fashion, as the brain's network reconfigures itself quite a bit, which is a lot harder to achieve at billion-scale on a die.
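        In case it helps to see the "sum several inputs and decide" idea concretely, here is a minimal Python sketch of a single artificial neuron; the weights, threshold, and inputs are invented for illustration and are not tied to any specific transistor circuit.

        ```python
        # Minimal sketch of the "sum several inputs and act as a decision maker"
        # idea from the comment above: one artificial neuron (a perceptron).
        # The weights, threshold, and inputs are arbitrary illustrative values.

        def neuron(inputs, weights, threshold):
            """Fire (return 1) if the weighted sum of inputs crosses the threshold."""
            activation = sum(x * w for x, w in zip(inputs, weights))
            return 1 if activation >= threshold else 0

        # Three inputs summed with different strengths, like currents into a node.
        print(neuron([1, 0, 1], weights=[0.4, 0.9, 0.3], threshold=0.5))  # -> 1
        print(neuron([0, 1, 0], weights=[0.4, 0.9, 0.3], threshold=1.0))  # -> 0
        ```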

        • by pla ( 258480 )
          Who's got the money to pony up for some experimental fab runs for billions of transistors with a reconfigurable mesh network? This is basically an Intel i7 fab process we're talking about here, so think beeeeelions of dollars.

          You don't need your own dedicated fab, you just need your own masks. Those will run you on the order of $100-150k per layer (and a modern CPU like the i7 has around 20 layers).

          Still not cheap, but a few million vs a few billion means the difference between "not gonna happen" and "bo
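          For what it's worth, a quick back-of-the-envelope check of the "few million" figure; the per-layer cost and layer count below are just the rough estimates quoted above.

          ```python
          # Rough mask-set cost estimate from the figures quoted above:
          # roughly $100k-$150k per mask layer, ~20 layers for a modern CPU process.
          layers = 20
          low, high = 100_000, 150_000
          print(f"${layers * low:,} - ${layers * high:,}")  # $2,000,000 - $3,000,000
          ```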
        • To be fair, a digitally-switching transistor is almost infinitely simpler than a neuron, but you could make the argument that a transistor configured in analog mode that summed several inputs and acted as a decision maker is much closer to a neuron. The trick is getting all of those transistors working together in some sort of "analog computer" fashion, as the brain's network reconfigures itself quite a bit, which is a lot harder to achieve at billion-scale on a die.

          Using human neurons as a model for the future of computing might not be the utopia that we are all dreaming of....

        • Sure go right ahead - keep imagining how the machines will be built.

          Next thing you know it's Skynet.

      • Re:Ray was right! (Score:4, Informative)

        by ceoyoyo ( 59147 ) on Wednesday March 12, 2014 @03:42PM (#46467485)

        You're making up numbers. We've had billions of transistors on chips for some time now. The Xbox One's main chip has five billion transistors. And that's just one chip. The Titan supercomputer has nearly 200 trillion transistors.

        If the transistor doubling time remains about the same, you can equate any number of transistors you like to a neuron and Kurzweil's prediction still won't be off by much. Such is the nature of exponential curves. Sophisticated objections to his predictions don't involve transistor counts.

        Nobody knows how much of a neuron you need to build a brain. If you actually have to simulate it, possibly at the quantum level, then no number of transistors may be sufficient. You can probably get around that problem by not using regular transistors though. Sufficient artificial neurons might actually be easier to build - noise and interference are probably not as harmful as they are in regular computing, and may actually be beneficial.
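        To put a rough number on the exponential-curve point: if you decide a neuron is "worth" K transistors instead of one, a steady doubling trend only shifts the crossover date by log2(K) doubling periods, not by a factor of K. The doubling time and the K values in this sketch are assumptions for illustration, not anyone's actual prediction.

        ```python
        import math

        # Illustration of the exponential-curve point above: changing the assumed
        # transistors-per-neuron factor K only shifts an exponential timeline by
        # log2(K) doubling periods. Doubling time and K values are assumptions.
        doubling_years = 2.0
        for k in (10, 1_000, 1_000_000):
            shift = math.log2(k) * doubling_years
            print(f"{k:>9} transistors per neuron -> timeline shifts by ~{shift:.0f} years")
        ```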

      • by narcc ( 412956 )

        Ray will be right eventually, but he is off on his time scales by a wide margin.

        That's good news. I was getting concerned about my construction plans. I'll have another load of bricks dumped in my yard every week until my new mansion emerges.

    • I admit you got me at first. I guess I was never a fan of people determined to turn science and technology into religions. Those topics are already cool enough as they are. Plus there are enough faith-based alternatives for that kind of thing if it feels like it's something you need in your life.

      • by Anonymous Coward

        all prediction-based sciences are now religions. thanks for that update.

        • They're not. But there seem to be a whole bunch of people who like to turn to science or technology for some type of transcendent experience or something.

          "Oh almighty computer, how powerful you are! Surely your intellect will excel beyond us puny humans soon. I am so unworthy. *Grovel*"

          It's just a desire to have something to take the place of what the faithful crowd use some omnipotent god for. All over a tool that can do pointless drudgery work quickly and efficiently so that us humans can spend our time

          • Ack! Should have read more carefully before posting. Not "pointless drudgery" - there's definitely a point to it. More like tedious drudgery to support the interesting bits.

  • If his idea prevails, we will see a whole new world of technologies.
  • by Tyler Durden ( 136036 ) on Wednesday March 12, 2014 @03:21PM (#46467205)
    When I first read the headline I thought the physicist was offering a computational model alternative to the Turing Machine. It sounds like he's offering a new type of computer, not computing.
    • by thesandbender ( 911391 ) on Wednesday March 12, 2014 @03:35PM (#46467379)
      Actually, it could prove to be radically different than current computers/computing. Almost all current computers are based on binary logic: your bit is either on or off. Electrons can actually have several orbital states, so it is possible that computing could be approached in a different manner. This assumes that logic could actually be performed with the orbital states and it's not just a bit store. All of this is quite a long way off, though; per the article, you currently need a two-mile-long accelerator to change the orbital state of an electron this accurately.
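      If it helps to see what "several states per element instead of two" buys you in the abstract, here is a toy Python sketch (purely illustrative, not tied to any real orbital hardware): the same integer needs fewer digits when each storage element can hold four states rather than two.

      ```python
      # Toy illustration, not tied to any real orbital hardware: if each storage
      # element can hold N distinct states instead of 2, an integer needs fewer
      # elements. The same value is written below as base-2 and base-4 digits.

      def to_digits(value, base):
          """Return the digits of `value` in the given base, most significant first."""
          digits = []
          while value:
              value, d = divmod(value, base)
              digits.append(d)
          return list(reversed(digits)) or [0]

      value = 2014
      print(len(to_digits(value, 2)), to_digits(value, 2))  # 11 binary digits
      print(len(to_digits(value, 4)), to_digits(value, 4))  # 6 base-4 digits
      ```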
      • Hmmm, I'm not so sure. Unless I'm missing something in the article the proposal does not offer anything new toward quantum computing. The advantages listed are the ability to switch electron states very quickly to improve RAM speeds and being able to read the spin of electrons - both without requiring excessive power to drive it.

        I'm not sure how quantum computers compare to TMs. After some quick browsing it looks like they don't have the computational speed potential of the (only theoretical) non-deterministic Turing machine.

        • You did misread the article. They're not proposing it as a quantum computing solution, nor are they proposing to improve RAM speeds by using electron spin. They're proposing to use the electron orbital state to store information. Currently a charge (multiple electrons) is used to store one bit. This solution would allow one single electron to store one or more bits. This could be used to produce faster storage but it has other applications as well, such as faster switching logic. The end result woul
          • In your first reply you mentioned that computers are based on binary logic - on or off. I thought you were getting at quantum computing where you can have a combination of the two.

            From the article - "One is the discovery of a material that allows electrons to switch states really quickly that could improve magnetic random access memory speeds by a factor of thousand." So, yeah, that's essentially what I said.

            If the difference is that a single electron can store one or more bits then this is definitely equiva

      • I read it that even if the orbital states ain't the variable, the fact that there are 8 electrons in the outermost shell enables a byte to be stored per atom. On a computational level, instead of doing binary arithmetic, one would now be doing Base 256 arithmetic, where there would be 256 states in all. Although given how well binary has worked in creating all the bases we have - Hex, Octal and others, the best would be to leave the computational base at 2, but use the valence electrons to store an entire byte.
        • I read it that even if the orbital states ain't the variable, the fact that there are 8 electrons in the outermost shell enables a byte to be stored per atom.

          Wouldn't that only allow storing three bits, not eight? You can't tell which of the eight electrons are in the outermost shell, just how many there are, so the possible values are 0-8, not 0-255. Nine unique states give you three bits plus one state left over.
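          The counting argument spelled out, just the arithmetic implied above and nothing more:

          ```python
          import math

          # If all you can observe is how many of the 8 outer electrons are present
          # (0 through 8), that is only 9 distinguishable states, not 2**8 = 256.
          print(math.log2(9))    # ~3.17 -> 3 whole bits, with one state left over
          print(math.log2(256))  # 8.0 bits, which would require telling the electrons apart
          ```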

      • Quantum computers already eschew binary thinking with the way that they manage their data, but they are still simply Turing Machines, albeit theoretically much faster Turing Machines. But given enough time and memory, a classical computer is capable of perfectly simulating a quantum computer, and at least based on the summary, it sounds like the same would be the case here.

        This may be something neat, but unless it offers something more than a new way to represent bits, it won't mean that we can solve new sets of problems.

        • by Zalbik ( 308903 )

          This may be something neat, but unless it offers something more than a new way to represent bits, it won't mean that we can solve new sets of problems.

          Exactly. The problem of a "new type of computing" is a math problem, not an engineering one.

          If we ever see a new kind of computing, it will be due to theoretical computer science / mathematics, not physics/engineering.

      • by Anonymous Coward
        It's currently possible to store more than one bit. It's done in MLC [wikipedia.org] flash. It's not worth doing with normal logic, but it's certainly possible. There's no reason to believe storing more than one bit per orbit would be worth doing here either, but it's so theoretical, it's hard to say much of anything predictive. Once you get beyond a single bit, you have issues with sensitivity to certain thresholds. It's generally better to keep things simple. Simple is usually more reliable and faster.
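        A rough sketch of the MLC idea, in case it is unfamiliar: one analog cell level is read against several thresholds to recover two bits, and packing in more levels leaves less margin per level, which is exactly the threshold-sensitivity issue mentioned above. The voltage ranges, thresholds, and bit mapping here are invented for illustration.

        ```python
        # Rough sketch of the MLC idea: one analog cell level is compared against
        # several thresholds to recover two bits per cell. The level ranges,
        # thresholds, and bit mapping below are invented for illustration.

        THRESHOLDS = [0.25, 0.50, 0.75]           # three cut points -> four levels
        LEVEL_TO_BITS = ["11", "10", "00", "01"]  # one Gray-style two-bit mapping

        def read_cell(voltage):
            """Map a normalized cell voltage (0.0-1.0) to a two-bit string."""
            level = sum(voltage >= t for t in THRESHOLDS)
            return LEVEL_TO_BITS[level]

        print(read_cell(0.10))  # '11'
        print(read_cell(0.60))  # '00'
        # More levels per cell means tighter thresholds and less margin per level.
        ```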
      • by Zalbik ( 308903 )

        Actually, it could prove to be radically different than current computers/computing

        Yes, but not in the way the GP was hoping (barring a major breakthrough in mathematics/theoretical computer science)

        All computers (even quantum computers) are basically the same. They are all Turing Machines. Some are just much faster than others. This machine won't be radically different, regardless of what the hardware is.

        Car analogy:
        If existing PCs are gasoline-driven cars, the GP was hoping for an airplane. What t

        • Yeah, not so much a "hope" for me though. When I read the title I just really doubted they meant to say what it sounded like they were saying. And sure enough, they didn't.

          There very likely isn't any computational model that can solve any problems that some TM equivalent method can't. It's just a matter of doing them faster.

  • by pieisgood ( 841871 ) on Wednesday March 12, 2014 @03:21PM (#46467221) Journal

    So we can switch states really fast, which is excellent, but how fast is our observation? If the observation needs to be made in order to switch to the next gate, then we have our bottleneck. The article was sparse on details and didn't seem to answer this question.

  • Spintronics (Score:2, Interesting)

    by Anonymous Coward

    Whatever happened with Spintronics [wikipedia.org]?

    In theory these systems could be great. What I worry about is if they will be stable enough.

    Of course, this is using orbitals, which are generally more stable as far as electrons and their speedy existence are concerned.
    I don't think they decay spontaneously, do they?

    With all these ideas, it makes me wonder which one is going to come first: this, optical computing, quantum computing, superconducting computing, ternary computing, or one of the others.
    I'd love to see ternary, personally.

  • by NapalmV ( 1934294 ) on Wednesday March 12, 2014 @03:32PM (#46467359)
    Does it run Office?
    • 2nd most important.

      The most important question would be, "How well does it run Crysis?*"

      * I haven't kept my finger on the pulse of gaming for some time; is Crysis still the benchmark for ridiculously complex and detailed graphics?

      • Crysis 3 is the new king of graphics-card tests, followed by Battlefield 3/4 and Metro: Last Light. Crysis 1 sees some benchmarking still, but since it can be maxed out fairly easily now (60 FPS at max settings, 1080p on a single 280X or 770) it's no longer a true system-killer.

        If you're asking where Crysis 2 is on the list, well, it isn't.

  • Can we do the same thing with Earth's orbit around the sun?
