Comment Re:Is aggression really survival+ for tech. societ (Score 3, Insightful) 532
So what about aliens? Any advanced civilization would likely have had to overcome or suppress inward-facing aggression to remove a significant threat to its own existence, and that could be done through various means: artificial selection, genetic engineering, tyranny, moving the substrate of the mind from a biological brain to a more easily modifiable artificial information-processing artifact, etc. But such a civilization still faces another threat to its longevity. In a universe with accelerating expansion (such as ours), there is only a finite amount of energy and matter within a given Hubble volume that can be used to do work (in the physics sense), e.g. to support life processes. This is because the expansion of space itself is not limited by the speed of light; only gravitationally bound portions of the universe -- such as our Local Group of galaxies -- won't be blown apart, and everything beyond will eventually be forever out of reach.
Given this, advanced galactic civilizations are competing for limited resources (energy usable for work). In the very distant future, that leads to conflict once most available resources are either allocated or contested and few are left unclaimed. At that point, immense numbers of lives would be destroyed on the losing side. It's more ethical, and more efficient, to destroy competitors instead while they're still as few in number as possible -- so an advanced civilization has little incentive to suppress outward aggression. This is why sterilizer probes -- self-replicating artifacts sent out to eliminate any life they encounter other than their original creators -- have been suggested as the most likely policy of any advanced spacefaring/colonizing civilization.
The argument against sending out sterilizer probes ourselves, as soon as nanotechnology or biotechnology is advanced enough, is that our civilization would then be perceived as an aggressor and would be more likely to be punished. The problem with this argument is that cooperation in game-theory problems such as the prisoner's dilemma works as a solution in general only if there are sufficiently many rounds (and even then only in specific circumstances; see the article that was discussed on Slashdot just a few days ago: http://science.slashdot.org/st...).
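The rounds point can be made concrete with a minimal iterated prisoner's dilemma sketch in Python (the payoff values and strategy names below are my own illustrative choices, using the standard T=5, R=3, P=1, S=0 payoffs; nothing here comes from the linked article). Against an unconditional defector, a reciprocator loses the first round and then merely ties, so cooperation only pays off when there are enough rounds against partners who also reciprocate -- and with a known finite horizon, defection in the last round unravels cooperation entirely.

```python
# Illustrative finitely repeated prisoner's dilemma (assumed payoffs).
# (my_move, their_move) -> (my_payoff, their_payoff)
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(strat_a, strat_b, rounds):
    """Run an iterated match; each strategy sees the opponent's history."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for r in range(rounds):
        move_a = strat_a(hist_b, r, rounds)
        move_b = strat_b(hist_a, r, rounds)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

def always_defect(opp_hist, r, rounds):
    # Pure aggressor: defects unconditionally.
    return 'D'

def tit_for_tat(opp_hist, r, rounds):
    # Cooperates first, then mirrors the opponent's last move.
    return opp_hist[-1] if opp_hist else 'C'

# Head to head over 10 rounds: tit-for-tat is exploited once, then
# both sides defect, so the aggressor comes out ahead (9 vs 14).
print(play(tit_for_tat, always_defect, 10))
# Two reciprocators cooperate throughout and each score 30.
print(play(tit_for_tat, tit_for_tat, 10))
```

Note that with a commonly known final round, defecting in that round dominates, and backward induction pushes defection all the way back to round one -- which is the "only in specific circumstances" caveat: sustained cooperation needs an indefinite or uncertain horizon.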