The Future of Computing

webglee writes "What will the relationship between computing and science bring us over the next 15 years? That is the topic addressed by the web focus special of Nature magazine on the Future of Computing in Science. Amazingly, all the articles are free access, including a commentary by Vernor Vinge titled 2020 Computing: The creativity machine."
  • Who is Vernor Vinge? (Score:2, Informative)

    by resonte ( 900899 ) on Thursday March 23, 2006 @11:25AM (#14980432)
    In case you wanted to know

    Vernor Vinge is a science-fiction author who coined the term "singularity" and uses the idea in some of his novels. Linkie: http://mindstalk.net/vinge/vinge-sing.html [mindstalk.net]

    If you would like to read one of his books, I would suggest Across Realtime, which touches on this subject lightly. His other stories are somewhat less palatable for me (though I've only read three).

    Other authors who delve more deeply into singularity issues are Greg Egan (hard going, but definitely worth reading) http://gregegan.customer.netspace.net.au/ [netspace.net.au] and Charles Stross, whose Accelerando is at http://www.accelerando.org/_static/accelerando.html [accelerando.org].

    Science fiction is an odd genre, since authors' minds are shaped by the technology that seems possible at the time of writing. Science fiction writers of the past depicted futures with minimal use of networked computers, for instance. So the genre's themes change over time, whereas other genres remain fairly static.

  • by Anonymous Coward on Thursday March 23, 2006 @03:11PM (#14982310)
    Check out http://imagination-engines.com/ [imagination-engines.com], a US company founded by AI researcher Dr. Stephen Thaler. In summary, his systems are composed of paired neural nets in tandem: the first is degraded/excited to produce 'novel ideas' (the 'dreamer'), and the second acts as a 'critic' of the first system's output, a filter for 'useful' ideas.

    In real-life applications, it was used to invent a certain Oral-B toothbrush product.

    At one time the site's literature announced that invention number 1 (the CM, Creativity Machine) produced invention number 2 (STANNO, Self-Training Artificial Neural Object).
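    The dreamer/critic pairing described above can be sketched as a toy loop in plain Python. This is only an illustration of the generate-then-filter idea; the functions below are stand-ins, not Thaler's actual networks:

    ```python
    import random

    random.seed(0)

    def dreamer(seed_design, noise=0.3):
        """Toy 'dreamer': perturb a known design to propose a novel variant.
        (Stands in for the noise-degraded generator network.)"""
        return [x + random.uniform(-noise, noise) for x in seed_design]

    def critic(candidate, target=1.0):
        """Toy 'critic': score a candidate; here, closeness of its mean to a target."""
        mean = sum(candidate) / len(candidate)
        return -abs(mean - target)

    # Tandem loop: dream up many variants, keep only those the critic rates useful.
    seed = [1.0, 1.0, 1.0]
    ideas = [dreamer(seed) for _ in range(100)]
    useful = [c for c in ideas if critic(c) > -0.05]
    print(f"{len(useful)} of {len(ideas)} ideas passed the critic")
    ```

    The point is the division of labor: one component injects randomness to escape known designs, the other imposes a fitness test so only a small fraction survives.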

  • by HiThere ( 15173 ) * <charleshixsn@ear ... .net minus punct> on Thursday March 23, 2006 @03:26PM (#14982431)
    There have been languages that made a great start at handling this problem. Unfortunately they died.

    One of my favorite examples is Prograph, a dataflow language. It was excellent...well, sort of. There wasn't any good way to represent it textually. And it was proprietary. And it was written for the Mac. Small * small * small. Like many good Mac products, it died attempting the transition to MS Windows. But the real problem was that while programs were logically small, they were physically HUGE. A graphic printout would have lines of control leading all over it. It was literally like reading a flowchart, except that the flowchart didn't abstract the program; it included every detail necessary for execution. This meant that a Prograph program, printed, would be about three or four times the size of a similar program written in, say, Fortran IV or Ada (I'm picking moderately verbose languages), and perhaps 6 to 8 times the size of one written in Python. This made thinking about the programs very difficult.
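    For readers who never saw it, the dataflow idea can be sketched as a hypothetical mini-interpreter in Python (this is an illustration of the execution model, not Prograph's actual semantics). Note how even a one-line expression becomes a graph where every operation and every wire is explicit, which is why printed diagrams balloon:

    ```python
    import operator

    # Dataflow graph for (a + b) * (a - b): each node names a function and the
    # nodes (or inputs) that feed it. Every wire is spelled out explicitly.
    nodes = {
        "add": (operator.add, ["a", "b"]),
        "sub": (operator.sub, ["a", "b"]),
        "mul": (operator.mul, ["add", "sub"]),
    }

    def run(graph, inputs):
        """Fire each node once all of its inputs have arrived."""
        values = dict(inputs)
        while len(values) < len(inputs) + len(graph):
            for name, (fn, deps) in graph.items():
                if name not in values and all(d in values for d in deps):
                    values[name] = fn(*(values[d] for d in deps))
        return values

    print(run(nodes, {"a": 5, "b": 3})["mul"])  # (5+3)*(5-3) = 16
    ```

    Three nodes and five wires for one arithmetic expression: scale that up and you get the flowchart-sized printouts described above.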

    If you want another example, you could look at the Helix database, though it didn't abstract away the if/then statement. Still, it was another good program killed by the lack of a convenient way to see large chunks of the code at once.

    Until we can develop a true AI, the best progress we can make will be based on chunking. There are various ways of doing this; libraries are one popular way, as are "higher-order languages". Every language that steps above binary is created by chunking lower-level concepts into higher-level ones. The if/then construct itself is a chunking of lower-level concepts (usually test and branch if zero/not-zero, though other tests occur). Very few languages have chunked the if/then construct away, however. Prograph is the only one I can think of. (Unless you allow the Lisp (cond ...) statement...which is really a series of if tests in one wrapper. And I think that current Lisp dialects also include a simple if test, but I'm not sure about that.) However, you might check J (a language descended from APL). It also handles chunks in a highly divergent way, and may have eliminated the if/then construct.

    What would be nice would be if someone could resurrect Prograph with its warts polished off. I remain convinced that this was a language with great promise, stifled by...well, largely by circumstance, but the lack of a viable printed representation was also significant.

