Davemania writes: I am working for a research group that needs to do a large amount of data analysis (each input file can be up to 1 GB in size). We're planning to buy up to 10 PCs running Linux to do this scientific processing (MATLAB, etc.), but what would be the best configuration for these 10 Linux boxes? Preferably, we would like to keep all the data on one server and send it out to the Linux boxes for processing automatically. What approach would be best: clustering with something like openMosix or Rocks Cluster? Or simply dropping files into a shared folder and remoting in to each box?
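For the simplest option mentioned in the question (a shared folder plus remote execution), a dispatcher on the server can round-robin the input files across the nodes. The sketch below assumes the server exports a directory over NFS that every node mounts at the same path, passwordless ssh to each node, and a hypothetical per-file analysis command `process_file`; the hostnames and paths are placeholders, not anything from the question.

```shell
#!/bin/sh
# Minimal sketch of the "drop files into a shared folder" approach.
# Assumptions (all hypothetical, not from the original question):
#  - the server exports /data over NFS and each node mounts it at /data
#  - passwordless ssh from the server to each node
#  - process_file is whatever per-file analysis command the nodes run

# pick_node INDEX NODE... : round-robin a job index onto a node name
pick_node() {
    idx=$1; shift
    shift $(( idx % $# ))
    echo "$1"
}

NODES="node01 node02 node03"    # hypothetical hostnames
i=0
for f in /data/incoming/*.dat; do
    node=$(pick_node "$i" $NODES)
    echo "dispatching $f to $node"
    # ssh "$node" "process_file '$f'" &   # uncomment on a real cluster
    i=$(( i + 1 ))
done
wait
```

Because the data lives on one NFS server either way, the win of a real batch system (Rocks ships with one) over a script like this is mostly queueing, retries, and node-failure handling rather than raw throughput.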