Comment Re:But don't equate coding with comp-sci (Score 1) 132

Yes, there is a lot of programming involved today, but Watson is not just a search engine. So-called "deep learning" algorithms are really neural network simulations. They are programmed because most people don't have neural chips, so to create neural networks, people have to code them as simulations. That's why IBM has now developed its "True North" chipset, with a million silicon neurons per chip. These are not programmed - they learn - and they run 1000 times faster (and with much less power) than equivalent simulations. Also, if you look at the code for deep learning neural network simulations, you will see that it is an implementation of neural network topologies (e.g., a cascade of restricted Boltzmann machines) and training algorithms (e.g., contrastive divergence) - you can't work on that code unless you understand those algorithms. The real work is in developing and fine-tuning the algorithms, not in the coding. Most of the people who work on that code are PhDs in AI, not programmers per se.

It is also true that there is a lot of "glue code" needed to make these systems scale, but that is the type of code that I think will eventually be replaced by machine learning systems. But I don't have a crystal ball - we shall see!

Comment Re:But don't equate coding with comp-sci (Score 1) 132

Deep learning systems are not computers. They are neural networks, and they are not programmed. If one runs such a system on a conventional computer, one is actually simulating the neural network - it will run 1000 times faster on a true neural network chip, without a conventional computer in the loop.

Comment Re:But don't equate coding with comp-sci (Score 1) 132

Actually, that is not true. Deep learning systems can learn to do things like weld. In the case of deep learning systems, the training involves having the system read a lot of information. It "learns" very much the way that our brains learn. It is not like Prolog, which is a logic-based system. Deep learning systems are networks of nodes connected by weighted paths, like the brain. Comparing these systems to a brain is premature - we don't yet understand the organization of the brain - but deep learning systems are neural networks like the brain is, and they can learn unstructured tasks by trial and error and by being shown, just as a welder would learn.
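As a toy illustration of "learning by being shown": even a single perceptron adjusts its weighted connections from labeled examples rather than from hand-written rules. This is a made-up minimal example (nothing like an industrial welding system), but the principle - weights nudged toward demonstrated answers - is the same one deep networks scale up:

```python
import numpy as np

def train_perceptron(examples, labels, epochs=20, lr=0.1):
    """Learn weights from demonstrations, with no task-specific rules."""
    w = np.zeros(examples.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            pred = 1.0 if x @ w + b > 0 else 0.0
            # Nudge the weighted connections toward the shown answer.
            w += lr * (y - pred) * x
            b += lr * (y - pred)
    return w, b

# "Show" the system the AND function purely by example.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)
w, b = train_perceptron(X, y)
preds = [1.0 if x @ w + b > 0 else 0.0 for x in X]
# preds is now [0.0, 0.0, 0.0, 1.0] - the behavior was learned, not coded.
```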

Comment Re:But don't equate coding with comp-sci (Score 1) 132

Yes, true, but those programmers are PhDs with an in-depth understanding of machine learning systems, and much of what they do is mathematics. Also, neural network "programs" are really just simulations of neural networks, e.g., the "restricted Boltzmann machines" that are the key to current "deep learning" systems. If one uses an actual neural network (as in IBM's True North chip), there is no conventional programming.

Comment Re:But don't equate coding with comp-sci (Score 1) 132

Yes, I lived through CASE as well. I think you are right, this is not going to happen tomorrow. But I think it is coming. Watch the TED talk about deep learning - it is very enlightening about current prospects. I am anticipating further improvements - current learning systems cannot replace a programmer, but it seems to me that they do not have far to go, and there will be quite a lengthy period, I think, in which people will need to advise the systems and "operate" them, the way that doctors operate the current Watson medical system. But it is anticipated that Watson will not need to be "operated" forever - that it will be able to act on its own. I guess we will have to wait to see what happens! :-/

Comment Re:But don't equate coding with comp-sci (Score 3, Interesting) 132

No, and perhaps I am wrong. But these are early days. Hinton's breakthrough in 2006 opened up machine learning to a wide range of things that we thought impossible with those types of systems. Look at what is being done with IBM's Watson system - it has shrunk from a room to three pizza boxes and is being used for medical diagnosis. Also look at their "True North" chips - these are not computers but neural chips: each chip has a million neurons on it, and each neuron can form connections to any other neuron in the system. We are still learning how to organize and train these systems, and it is telling that Hinton's breakthrough was in a training algorithm. I think that the handwriting is on the wall, but I could be wrong. What I expect to see in 10 years is analysts working with machine learning systems to define requirements, with the system taking it from there. Remember that systems like Watson are not programmed - they are trained. They read the same things that you and I read (Watson has read all of Wikipedia), and they can listen and speak. They have already proven that they can do original research and have original insights that are beyond the reach of people due to the complexity involved.

Comment But don't equate coding with comp-sci (Score 2, Interesting) 132

As long as they don't equate programming/coding with computer science. Coding is likely to be obsolete in a few years - replaced by deep learning systems as those systems increase in capability - so the last thing we should do is steer kids away from math and toward coding. Computer science, as opposed to coding, is timeless and will continue to evolve and dramatically change, with a greater emphasis on how to create and use machine learning systems. But somehow I doubt that public schools will understand these issues.

Comment Programming has no future beyond 20 years (Score 1) 306

I agree that understanding computers is important, but there is much more to the "computing landscape" than programming. Remember that to a hammer (i.e., a programmer), everything looks like a nail (i.e., a program). Machine learning is the new paradigm, and there is no programming: IBM's "True North" chip is a neural network chip, not a programmable CPU. In 20 years (maybe sooner) no human will be programming, so we should not be telling kids that being a programmer is a "career".

Comment Re:No. (Score 1) 507

Well, it is up to them to agree. It is indeed your job to propose ideas, but users get to decide if your idea is the right one. In Agile, that role is called the Product Owner (PO). The Product Owner owns the project. The dev team interprets the PO's requests and writes them down as "stories". The PO must agree that a story is what he/she actually wants before the team works on it. When the story has been implemented, the PO decides if it is REALLY what he/she wants. If it is not, the team still gets credit for completing the story, but the PO has the right to write a new story that changes things. The PO is paying for the project, so they can do that as long as they want. However, the team is supposed to help the PO track progress toward the overall goal.

Comment Re:No. (Score 1) 507

There are certainly Agile teams that work that way, but that is not how well-run teams operate. Agile has a reputation for not planning, but the opposite is true - there is a lot of planning in Agile. Well-run teams also design. The difference is that Agile teams _allow_ requirements to change as a result of customer feedback, instead of trying to stick to a set of requirements "agreed upfront". Agile teams work at a steady pace, producing working software at regular intervals, instead of waiting for a huge release six months down the road - when it is often discovered that the design is unworkable. Agile projects can go off the rails just like any project, but in an Agile project you find out very soon, instead of toward the end. There is no magic bullet for this stuff. The intent of Agile is to divide the work into small items that are deliverable incrementally. It is not rocket science. You still have to do that dividing intelligently, and you still have to practice software design. Some in the Agile community disparage design somewhat, advocating test-driven development instead, but that point of view has been largely discredited. See
