The Future of Computing
webglee writes "What will the relationship between computing and science bring us over the next 15 years?
That is the topic addressed by the web focus special of Nature magazine on the Future of Computing in Science. Amazingly, all the articles are free access, including a commentary by Vernor Vinge titled 2020 Computing: The creativity machine."
Who is Vernor Vinge? (Score:2, Informative)
Vernor Vinge is a science fiction author who coined the term "technological singularity," and he uses the idea in some of his novels. Linkie: http://mindstalk.net/vinge/vinge-sing.html [mindstalk.net]
If you would like to read one of his books, I would suggest Across Realtime, which touches on this subject lightly. His other stories are somewhat less palatable to me (but I've only read three).
Other authors who delve more deeply into singularity issues are Greg Egan (hard going, but definitely worth reading) http://gregegan.customer.netspace.net.au/ [netspace.net.au] and Charles Stross, whose Accelerando is at http://www.accelerando.org/_static/accelerando.html [accelerando.org].
Science fiction is an odd genre, since the authors' imaginations are shaped by the technology that seems possible at the time of writing. Science fiction writers of the past, for instance, depicted futures with minimal use of networked computers. So the genre's themes shift over time, whereas other genres remain fairly static.
Creativity Machine: it already invented its v2.0 (Score:2, Informative)
In real-life applications, it was used to invent a certain Oral-B toothbrush product.
At one time the site's literature announced that invention number 1 (the CM, Creativity Machine) produced invention number 2 (STANNO, Self-Training Artificial Neural Object).
Re:Don't overestimate... (Score:3, Informative)
One of my favorite examples is Prograph, a dataflow language. It was excellent... well, sort of. There wasn't any good way to represent it textually. And it was proprietary. And it was written for the Mac. Small * small * small. As with many good Mac products, it died attempting the transition to MS Windows. But the real problem was that while programs were logically small, physically they were HUGE. A graphic printout would have lines of control leading all over it. It was literally like reading a flowchart, except that the flowchart didn't abstract the program; it included every detail necessary for execution. This meant that a Prograph program, printed out, would be about three or four times the size of a similar program written in, say, Fortran IV or Ada (picking moderately verbose languages), and perhaps 6 to 8 times the size of one written in Python. This made thinking about the programs very difficult.
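For anyone who never saw it: in a dataflow language like this, a program is a graph of operations rather than a sequence of statements, and values flow along the edges. A minimal sketch of the idea in Python (purely illustrative; the Node/evaluate names are my own invention, and real Prograph was graphical, not textual):

```python
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class Node:
    """One operation in the dataflow graph, wired to its upstream inputs."""
    op: Callable[..., Any]
    inputs: List[Any]  # each input is either another Node or a constant

def evaluate(node):
    """Pull values through the graph: a node fires once its inputs are ready."""
    if not isinstance(node, Node):
        return node  # a constant leaf, nothing to compute
    args = [evaluate(i) for i in node.inputs]
    return node.op(*args)

# (3 + 4) * 2, written as a graph instead of nested text
add = Node(op=lambda a, b: a + b, inputs=[3, 4])
mul = Node(op=lambda a, b: a * b, inputs=[add, 2])

print(evaluate(mul))  # 14
```

Even this toy shows the printout problem: three lines of textual arithmetic become a graph of nodes and wires, and every edge has to be drawn somewhere on the page.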
If you want another example, you could look at the Helix database, though that one didn't abstract away the if/then statement. Still, it was another good program killed by having no convenient way of seeing large chunks of the code at once.
Until we can develop a true AI, the best progress we can make will be based on chunking. There are various ways of doing this; libraries are one popular way. So are higher-level languages. Every language that steps above binary is created by chunking lower-level concepts into higher-level ones. The if/then construct itself is a chunking of lower-level concepts (usually test and branch if zero/not-zero, though other tests occur). Very few languages have chunked the if/then construct away, however. Prograph is the only one I can think of. (Unless you count the Lisp cond form... which is really a series of if tests in one wrapper, and current Lisp dialects include a plain if as well.) You might also check J (a language descended from APL). It handles chunks in a highly divergent way, and may have eliminated the if/then construct.
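To make the chunking point concrete, here is a small Python sketch (Python standing in for the languages under discussion; the function names are mine). The first version is a cond-style chain of explicit tests; the second expresses the same three-way choice with no visible if/then at all, by chunking the branching into a table lookup:

```python
def sign_if(x):
    """cond-style: an explicit series of tests, one after another."""
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    else:
        return "positive"

def sign_dispatch(x):
    """No visible if/then: the comparison collapses to -1/0/1
    (Python booleans are integers), which indexes a table."""
    return {-1: "negative", 0: "zero", 1: "positive"}[(x > 0) - (x < 0)]

# Both spellings agree on every input
for n in (-5, 0, 7):
    assert sign_if(n) == sign_dispatch(n)
```

The construct hasn't disappeared, of course; the test-and-branch still happens inside the lookup. That is exactly what chunking means: the lower-level concept is still there, just wrapped where you no longer have to think about it.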
What would be nice is if someone could resurrect Prograph with its warts polished off. I remain convinced that this was a language with great promise that was stifled by