As others have pointed out, deploying EC2 instances automatically is fairly easy using the well-documented EC2 APIs.
The difficult part about distributed computing is synchronizing the work between available instances. For this, you might want to look at RabbitMQ or other queueing servers. One way to do this would be to have one thread (on your computer) generating problem instances, while you spawn spot instances on EC2 as desired, which consume the work and report the results. I suspect you could accomplish something similar using Hadoop/MapReduce.
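A minimal sketch of that producer/consumer pattern, using Python's in-process `queue.Queue` as a stand-in for a broker like RabbitMQ (with real RabbitMQ each EC2 spot instance would consume from a shared queue via a client library instead; the squaring "work" here is just a placeholder):

```python
import queue
import threading

# Stand-in for the RabbitMQ queue; on EC2 the consumers would be
# separate spot instances, not threads in one process.
work_queue = queue.Queue()
results = queue.Queue()

def worker():
    # Each consumer loops: take a problem instance, solve it,
    # report the result back.
    while True:
        item = work_queue.get()
        if item is None:                  # sentinel: no more work
            work_queue.task_done()
            break
        results.put((item, item * item))  # placeholder "computation"
        work_queue.task_done()

# One producer (your computer) generates problem instances...
for n in range(10):
    work_queue.put(n)

# ...while several consumers (here threads, on EC2 instances) drain them.
workers = [threading.Thread(target=worker) for _ in range(4)]
for t in workers:
    t.start()
for _ in workers:
    work_queue.put(None)  # one sentinel per consumer
for t in workers:
    t.join()

print(sorted(results.queue))  # each problem paired with its result
```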
We will miss you.
The second class will not be very useful to you. I've heard this rumor propagated time and again, and no one can ever give me a convincing argument why such a class would be useful, other than for graphics and numeric computation.
The first class would be much more useful. Algorithms is more or less the study of the math behind programming. If you are seriously considering programming, you should learn this topic in great detail. Judging by the number of topics covered, I am assuming this is a lower-level course. You should definitely take at least one low-level computer science theory course!
One other area you may want to look at is logic -- look for Dijkstra's book "A Discipline of Programming".
I like Clipperz. You don't need to have anything installed, which is nice. They host your passwords in encrypted form.
First, this is pretty cool. Enough said about that.
Unfortunately, I don't think this will be useful for solving NP-complete problems. For those of you who don't know much about algorithms: NP-complete problems are hard because the time required by the best known algorithms grows exponentially as the problem gets bigger. It is perfectly possible for such problems to be solvable in a reasonable amount of time at small sizes, like the n=3 instance the authors of this article solved.
The paper argues that because bacteria can multiply exponentially, they can multiply until there are enough of them to solve the problem. There's a problem with that reasoning. Bacteria, like computers, need resources. Presumably, if you double the bacteria's food and resources, you will roughly double, not exponentially increase, the population that can be sustained. If that is true, then there is certainly a problem size at which using bacteria becomes intractable, which negates the supposed benefit of using bacteria.
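A back-of-the-envelope sketch of that objection, assuming (hypothetically) that a brute-force attack on an NP-complete problem of size n needs on the order of 2**n units of computation, i.e. 2**n bacteria:

```python
import math

def max_solvable_n(resources):
    # If solving size n needs ~2**n units (bacteria), the largest
    # feasible n is floor(log2(resources)).
    return int(math.log2(resources))

base = 1_000_000                  # hypothetical sustainable population
print(max_solvable_n(base))       # -> 19
print(max_solvable_n(2 * base))   # -> 20: doubling resources buys n+1
```

So linear growth in resources buys only additive growth in problem size, which is exactly why the exponential-multiplication argument doesn't scale.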
The people who wrote the article in the story you mention had no idea what they were talking about.
Peter Gutmann, one of the experts in this area, responded specifically to that article (see the epilogue further down).
What stops someone from recording a human looking at the page, and then replaying that behavior from a bot?
Also, will humans actually want to send the information needed for this to remote websites? I don't really want a website to know what part of the page I'm looking at.