Comment Re: May be big for AI (Score 1) 33
It is actually pretty deep with AI. Long ago here I was writing about how memory that minimizes erasure should use less energy than flash memory with its erase cycles, and that is what MRAM does: spin changes only consume energy on a bit change, instead of a full erase-voltage discharge (Landauer's principle).

If you look at brain architecture (~20 watts) vs. AI simulations (~200,000 watts to do something similar), one difference is that the brain maximizes non-erasure. Thoughts are combinatoric selections that fire others based on learned weights; only in learning (weight change) is anything erased. A computed simulation of the same thing erases bits in its processors at every step. So there's that.

The other thing is that for general compute, AI is a statistical stochastic process modeled deterministically with fake randomness added. There is probably an energy-cheaper way to do it using natural noise and randomness.
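For scale, Landauer's principle puts a hard floor on the energy cost of erasing one bit: k_B * T * ln(2). A back-of-envelope sketch (temperature of 300 K is my assumption for room temperature, not from the comment):

```python
import math

def landauer_limit_joules(temp_kelvin: float) -> float:
    """Minimum energy to erase one bit: k_B * T * ln(2)."""
    k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)
    return k_B * temp_kelvin * math.log(2)

# At roughly room temperature (300 K):
e_bit = landauer_limit_joules(300.0)
print(f"{e_bit:.3e} J per bit erased")  # about 2.87e-21 J
```

Real hardware erases bits many orders of magnitude above this floor, which is the headroom the comment is pointing at.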
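One existing technique in the "use randomness natively" direction is stochastic computing, where a value is encoded as the probability of a 1 in a random bitstream and an AND gate multiplies two values. A toy sketch (my illustration of that idea, not something from the comment; a physical implementation would draw the bits from device noise rather than a PRNG):

```python
import random

def stochastic_multiply(p_a: float, p_b: float, n: int = 100_000) -> float:
    """Multiply two probabilities by AND-ing independent Bernoulli bitstreams.

    Each value in [0, 1] is encoded as a stream of random bits whose
    probability of being 1 equals the value; AND-ing the streams yields
    a stream whose 1-density approximates the product p_a * p_b.
    """
    hits = sum(
        (random.random() < p_a) and (random.random() < p_b)
        for _ in range(n)
    )
    return hits / n

print(stochastic_multiply(0.8, 0.5))  # approximately 0.4
```

The appeal is that the "multiplier" is a single gate fed by noise, trading precision for drastically simpler, lower-energy hardware.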