This is getting closer to the true issue: no one can actually point to a "thought". We can run MRIs, we can do all the fluorescing in rat brains we want, but at no point can we, as humans, point to a thought.
All we can see and know about, at the moment, is the machinery. The brain is just the machinery for our minds: neurons, synapses, etc. A computer system entered for the Turing test (or Deep Blue, or Watson, the Jeopardy machine) is, again, just that: the machinery. Each set of machinery is doing processing of some description that is observable and quantifiable, but as we do not understand the mechanism that turns the processing in the brain into "thoughts", we cannot tell whether a computer thinks... Perhaps we are killing many computers each day because they are unable to meaningfully communicate their ability to think to us.
I'm steering well away from self-awareness here, as it is something of a red herring. Sentience is not necessarily about self-awareness, as a computer can be taught to recognise itself, process information about itself, even be selfish (as some have posited is required for sentience); rather, sentience is used more as a bucket to separate one set of processing from another. Is a tiger more sentient than a fly? They both have a certain level of information processing, and without the ability to show that one "thinks" while the other does not, we cannot portion out sentience to one or the other.(1)
So if we cannot show that humans, let alone animals, let alone computers, think, what are we left with? Complexity of processing: not the amount of processing, but how complicated a process can become. Neuronal structures are excellent at this; thousands of connections per neuron allow for a massive amount of complexity of processing. Each process weighs up elements that might not even appear relevant to it, such as feedback from the autonomic nervous system, whether you are hungry or not, or pain from your tooth trying to get your attention (and thereby suppressing other inputs). Add in non-processing factors from external influences: taken any painkillers? How about some opiates?
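To give a sense of the scale involved, here is a rough back-of-envelope calculation in Python. The figures are commonly cited order-of-magnitude estimates (roughly 86 billion neurons, with on the order of thousands of synapses each), not precise measurements, and a raw connection count only gestures at the interconnection, not at what each connection is doing.

    # Rough, commonly cited order-of-magnitude estimates -- not precise measurements.
    NEURONS = 86e9              # ~86 billion neurons in a human brain
    SYNAPSES_PER_NEURON = 7e3   # on the order of thousands of connections per neuron

    total_connections = NEURONS * SYNAPSES_PER_NEURON
    print(f"~{total_connections:.1e} synaptic connections")   # ~6.0e+14

Counting connections says nothing about thinking, of course; it only illustrates why matching that level of interconnection in a machine is a tall order.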
Until the complexity of processing that happens in our brains is matched by the machines we build, we are unlikely to see anything that we could identify as "thinking" on a par with ourselves. The Turing test is not a test for an intelligent machine; it is essentially a processing test built around a Markov chain.
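As an illustration of that last point (a minimal sketch, not anything Turing or this argument specifies): a word-level Markov chain picks each next word purely from what it has seen follow the previous word, so it can produce superficially fluent replies with nothing resembling a thought behind them.

    import random
    from collections import defaultdict

    def build_chain(text):
        """Map each word to the list of words observed to follow it."""
        words = text.split()
        chain = defaultdict(list)
        for current, nxt in zip(words, words[1:]):
            chain[current].append(nxt)
        return chain

    def babble(chain, start, length=12):
        """Generate text by repeatedly picking a random observed successor."""
        word = start
        out = [word]
        for _ in range(length):
            followers = chain.get(word)
            if not followers:
                break
            word = random.choice(followers)
            out.append(word)
        return " ".join(out)

    corpus = ("the machine can process and the machine can answer "
              "but the machine cannot point to a thought")
    print(babble(build_chain(corpus), "the"))

Everything in that sketch is observable, quantifiable processing; whether any of it amounts to thinking is exactly the question the test cannot answer.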
(1) Behavioural tests are insufficient here, as all they prove is whether the behaviour of the fly or tiger matches our own expectation of what a sentient creature would do, which makes the whole thing subjective.