Now let's say the list of investors is empty. The code runs, distributes $0 to no one, and then his account drops by $1 million.
OK, I understand everything up to "and distributes $0 to no one" — and as weird as that sounds, it would technically be correct. But why would the system then deduct $1 million from the account if no money was distributed?
It seems to me that, no matter how many investors there are, the amount deducted should equal the sum of the individual payouts. So if there are no investors and no money is disbursed, the sum of the money disbursed is $0.00, and nothing should be deducted from the account.
Why would you base the withdrawal total on the original amount in the account? That's asking for trouble even when there are investors. For instance, if you divide the amount by the number of investors and the result involves fractions of cents, you'd want to round each payout to a whole cent and then add up what each investor actually receives, to account for the rounding. Otherwise you'd have pennies disappearing from the account into thin air.
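To illustrate, here's a minimal sketch in Python of the "deduct only what you paid out" approach described above. All names (`disburse`, `balance`, `investors`) are illustrative, not from any real system; `Decimal` is used because binary floats are unsuitable for money.

```python
from decimal import Decimal, ROUND_DOWN

def disburse(balance, investors):
    """Split `balance` equally among `investors`, rounding each payout
    DOWN to a whole cent, and deduct only the sum actually paid out.
    Returns (new_balance, payouts). Hypothetical sketch, not a real API."""
    if not investors:
        # Nothing paid out, so nothing is deducted -- the empty-list case
        # that bit the original system.
        return balance, {}
    per_investor = (balance / len(investors)).quantize(
        Decimal("0.01"), rounding=ROUND_DOWN
    )
    payouts = {name: per_investor for name in investors}
    total_paid = sum(payouts.values())
    # Deduct the sum of the payouts, never the original balance, so any
    # fractional-cent remainder stays in the account instead of vanishing.
    return balance - total_paid, payouts
```

With an empty investor list, `disburse(Decimal("1000000.00"), [])` leaves the balance untouched; with three investors and $100.00, each gets $33.33 and the leftover $0.01 remains in the account rather than evaporating.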
"Just think, with VLSI we can have 100 ENIACS on a chip!" -- Alan Perlis