Zero? No, that is incorrect---both in theory and in the normal conversational context.
Did you read your own links?
Per Landauer's principle, erasing a bit takes a small but nonzero amount of energy. That same article states that modern computers consume millions of times the theoretical minimum. So, technically, the energy requirement is non-zero, and practically it can be quite high.
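For concreteness, here's a minimal sketch of the Landauer calculation. The "energy per operation" figure for a modern chip is an assumed order-of-magnitude illustration, not a measurement:

```python
# Landauer's limit: the minimum energy to erase one bit is k_B * T * ln(2).
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy in joules to erase one bit at the given temperature."""
    return K_B * temp_kelvin * math.log(2)

room_temp_limit = landauer_limit(300)  # roughly 2.87e-21 J per bit

# Assumed, illustrative figure only: a current chip might spend on the
# order of 1e-12 J per bit-level operation.
assumed_energy_per_op = 1e-12

print(f"Landauer limit at 300 K: {room_temp_limit:.3e} J per bit")
print(f"Overhead vs. the limit:  {assumed_energy_per_op / room_temp_limit:.1e}x")
```

The point is not the exact ratio but its size: real hardware operates many orders of magnitude above the thermodynamic floor.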
The limits of computation have a great deal to do with energy, as any given computation must occur on some physical medium, and that medium consumes energy while operating. It is extremely myopic to claim that energy has nothing to do with the limits of computation.
IBM, Intel, and the other guys have all done a lot of work to reduce the energy required for computation. The number of operations per joule has skyrocketed in my lifetime---and can continue to do so at the current rate for quite some time. Energy consumption and thermal constraints limit computational capacity at every level, and to claim otherwise is simply ignorant or disingenuous.
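To get a rough feel for how much room remains, here's a back-of-the-envelope sketch. Both the starting efficiency and the doubling period are assumptions chosen for illustration, not measured data:

```python
# Rough headroom between today's (assumed) efficiency and Landauer's limit,
# expressed as a number of efficiency doublings.
import math

K_B = 1.380649e-23                             # Boltzmann constant, J/K
landauer_j_per_bit = K_B * 300 * math.log(2)   # ~2.87e-21 J at room temperature

# Assumed numbers, for illustration only:
assumed_j_per_op = 1e-12        # rough energy per operation on a current chip
assumed_doubling_years = 2.6    # rough period for efficiency to double

headroom = assumed_j_per_op / landauer_j_per_bit
doublings = math.log2(headroom)

print(f"Headroom above Landauer's limit: ~{headroom:.0e}x")
print(f"That is ~{doublings:.0f} doublings, "
      f"or ~{doublings * assumed_doubling_years:.0f} years at the assumed rate")
```

Under these assumptions the answer comes out to a few decades of continued exponential improvement before irreversible computing hits the thermodynamic wall.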