It really depends on the size of the stuff you are compiling. If you are going to recompile the same things over and over and the dataset fits in memory, you will most likely get little to no benefit from an SSD. Linux (and Vista, and other modern OSes) caches every file it reads until some app needs the memory and pushes it out. It sounds like he's doing this on a box by himself (not a server shared by 5000 other people), and with memory so cheap, unless you are compiling something huge I'd guess you won't have to hit the disk again after the first time everything is read in (as long as no other app runs that eats up all the memory, forcing the cached files out of the buffer cache before the compile is run again).
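If you want to see the cache at work, here's a quick sketch (the file path is made up, point it at any big file on your own machine) that times two back-to-back reads of the same file; the second read is usually served straight out of RAM:

```python
import time

# Hypothetical path: substitute any large file on your system.
PATH = "/tmp/big_source_tree.tar"

def timed_read(path):
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(1 << 20):  # read in 1 MiB chunks until EOF
            pass
    return time.perf_counter() - start

# First read may actually hit the disk; the OS caches pages as it goes.
print(f"cold read: {timed_read(PATH):.3f}s")
# Second read normally comes from the page cache and is far faster,
# as long as nothing else has pushed the file out of memory meanwhile.
print(f"warm read: {timed_read(PATH):.3f}s")
```

Run it twice in a row and the "cold" read of the second run will already be warm, which is exactly the effect that makes the SSD mostly irrelevant for repeated compiles.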
For a personal app on a single-user machine, spending less on an SSD and buying more memory would probably give you much more benefit.