In an earlier post, tqft (619476) kindly left an informative reply. That, and some random articles on strong AI, have had me thinking. It's Sunday, the coffee is hot, and I feel like jotting down a few thoughts.
A summary of some others' thinking goes roughly like this: the mind is too complex to be implemented in silicon or by digital computers. Digital computers are deterministic, and thus could not be used to implement strong AI or intelligence.
That all sounds good if you lived 500 years ago, IMO, but when you consider information like that passed on by tqft (link above), you have to think more about it.
Yes, a simplistic linear program cannot imitate intelligence. Even a complex multi-threaded linear program cannot do so. This is easy enough to agree with. If it were not true, we'd already have strong AI among us.
What I see is that seemingly every day another discovery is made about the human brain, human physiology, and the body in general. Recently there was a discovery of a new enzyme in human and rat brains. Yes, that changes how we need to study the brain with regard to genetically managed functions. We as a species have mapped various areas of the brain that are involved with specific functions. If you think not of a single computer but of a group of them working in concert, each has only a small number of 'tasks' to work on. I don't mean trying to imitate the human mind on one IBM supercomputer, but on a huge cluster of them.
Why so much HPC power? Simple: we are only now finding out the many ways in which neurons pass information and form data sets. Think of how we remember things, associate things, process information, track our physical location, and do many other things that most of us take for granted because they have been part of our lives since day one. Each of those processes needs to be managed, parsed, and fed to various other processes. Nothing linear about it.
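To make that concrete, here is a minimal toy sketch of the idea: several independent "brain function" workers all process the same stimulus at once and feed their results into one combined perception, rather than one linear pipeline handling everything in sequence. Every module name here is a hypothetical illustration, not a claim about real brain architecture.

```python
# Toy sketch: parallel "function modules" reacting to one stimulus.
# All module names below are hypothetical illustrations.
from concurrent.futures import ThreadPoolExecutor

def recognize_shape(stimulus):      # stand-in for a visual function
    return f"shape of {stimulus}"

def recall_associations(stimulus):  # stand-in for a memory function
    return f"memories linked to {stimulus}"

def locate_in_space(stimulus):      # stand-in for a spatial function
    return f"position of {stimulus}"

MODULES = [recognize_shape, recall_associations, locate_in_space]

def perceive(stimulus):
    # Each module runs concurrently; nothing linear about it.
    with ThreadPoolExecutor(max_workers=len(MODULES)) as pool:
        futures = [pool.submit(m, stimulus) for m in MODULES]
        return [f.result() for f in futures]

print(perceive("a plastic duck"))
```

Of course a real system would need these modules feeding each other continuously, not just fanning out and merging once, but the shape of the problem is the same.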
We know that memories act on each of us. The smell of burning weed will bring a different memory immediately to mind for each of us. Why? It's not something we are programmed to do, so where does it come from? It comes from our total previous experience and what we personally have marked as important in it. That puts subjectivity in our minds rather than objectivity or programmed response.
Strong AI researchers should be working on imitating parts of the brain by function, not by form. Imitating 10 billion neurons is not going to do it; imitating the function of that group of neurons, one function at a time, is... IMO.
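The "function, not form" point can be sketched in a few lines: instead of simulating billions of neurons, stand in one black-box function per mapped brain region, built and tested one at a time. The class and function names below are my own hypothetical illustrations.

```python
# Sketch of "imitate the function, not the form": one black box per
# brain function, rather than a neuron-by-neuron simulation.
# All names here are hypothetical illustrations.

class FunctionModule:
    """Imitates WHAT a region of neurons does, not HOW they do it."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def __call__(self, signal):
        return self.fn(signal)

# A registry of modules, each added as that function is understood.
modules = {
    "smell_to_memory": FunctionModule(
        "smell_to_memory", lambda s: f"memory evoked by {s}"),
    "word_to_image": FunctionModule(
        "word_to_image", lambda s: f"mental image of {s}"),
}

print(modules["smell_to_memory"]("burning weed"))
```

The appeal of this approach is that each module can be replaced with a better implementation later without touching the rest, which is exactly what you cannot do with one monolithic neuron simulation.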
Vast amounts of information must be processed in different ways at the same time. A quick example is that of a person seeing a picture of a plastic duck, one of those pull-toy things with wheels on it.
Even as you read that, things came to mind. What were they? Some of us who surf the less wholesome parts of the Internet will think of...
Today's computers do not have the ready ability to store and process that volume of data. Even we humans cannot store all of the experiences of humanity, and thus we each have different reactions to any given stimulus. Social functions help us share memories and reference points. This is a problem with no simple, ready answer. Memories.
Some argue that getting the intelligence of a dog or cat into silicon is a good start, but that still has the problem of storing and processing memories. Even that seems a distant goal. Memory that lasts no longer than 'power down' just won't do it. Creating software that can build memories of even simple things, like being in the living room and moving to the kitchen, or understanding what 'outside' means, will be difficult without a functioning brain.
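The "survives power-down" requirement is at least easy to illustrate: the toy agent below journals simple episodic memories to disk so they still exist after a restart. Everything here (the file name, the episode format) is an assumption for illustration, not a real AI memory design.

```python
# Toy episodic memory that persists across 'power down'.
# The file name and episode fields are hypothetical choices.
import json
import os

MEMORY_FILE = "episodes.json"

def recall(path=MEMORY_FILE):
    """Return all remembered episodes, oldest first."""
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return json.load(f)

def remember(episode, path=MEMORY_FILE):
    """Append one episode to the on-disk journal."""
    episodes = recall(path)
    episodes.append(episode)
    with open(path, "w") as f:
        json.dump(episodes, f)

remember({"where": "living room", "then": "moved to kitchen"})
remember({"where": "kitchen", "learned": "'outside' is past the door"})
print(recall())  # still there after the program restarts
```

Persisting the raw episodes is the trivial part, of course; the hard part the post is pointing at is turning them into associations and meaning, which no flat file gives you.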
I think we need to work on putting supercomputers in very small spaces, maybe the size of a USB memory stick, so you can plug many of them into a system to work together.
Seems a long way off.