CowboyRobot writes: The advantage of parallel programming over serial computing is increased performance, achieved by reducing latency, increasing throughput, and reducing CPU power consumption. Two approaches to optimization are making a program run faster on the same workload (reflected in Amdahl's Law) or running a larger workload in the same time (Gustafson-Barsis' Law). "Gustafson noted that problem sizes grow as computers become more powerful. As the problem size grows, the work required for the parallel part of the problem frequently grows much faster than the serial part. If this is true for a given application, then as the problem size grows the serial fraction decreases and speedup improves...History clearly favors programs attacking and solving larger, more complex problems, so Gustafson's observations fit the historical trend. Nevertheless, Amdahl's Law still haunts you when you need to make an application run faster on the same workload to meet some latency target."
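The contrast between the two laws is easy to see numerically. Below is a minimal sketch of the standard formulas (the function names and the 10% serial fraction are my own choices for illustration): Amdahl's Law bounds the speedup of a fixed workload by its serial fraction, while Gustafson-Barsis' Law gives the "scaled speedup" when the parallel portion of the workload grows with the number of workers.

```python
def amdahl_speedup(serial_fraction, n_workers):
    # Amdahl's Law: fixed workload; speedup = 1 / (s + (1 - s) / N).
    # As N grows, speedup approaches the ceiling 1 / s.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

def gustafson_speedup(serial_fraction, n_workers):
    # Gustafson-Barsis' Law: workload scales with N; speedup = s + (1 - s) * N.
    # No fixed ceiling, because the parallel part of the problem keeps growing.
    return serial_fraction + (1.0 - serial_fraction) * n_workers

# With a 10% serial fraction, Amdahl caps speedup at 10x no matter how
# many workers you add, while the scaled speedup keeps climbing.
for n in (2, 8, 64):
    print(n, round(amdahl_speedup(0.1, n), 2), round(gustafson_speedup(0.1, n), 2))
```

Running this shows Amdahl's speedup flattening toward 1/0.1 = 10 while Gustafson's grows nearly linearly in the worker count, which is exactly the "larger problems keep the serial fraction small" observation in the quote above.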