Davemania writes: I am working for a research group that needs to do a large amount of data analysis (each data file can be up to 1 GB in size). We're planning to buy up to 10 PCs running Linux to do this scientific processing (MATLAB, etc.), but what would be the best configuration for these 10 Linux boxes? Preferably, we would like to keep all the data on one server and send it out to the Linux boxes for processing automatically. What approach would be best: clustering with OpenMosix or Rocks Cluster, or simply dropping files into a shared folder and remoting in?
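The "shared folder" approach the submitter mentions can be sketched in a few lines: keep the data on one server, export it over NFS so every node sees the same paths, and have a small dispatcher farm files out to the nodes over SSH. The sketch below is a minimal illustration under those assumptions; the host names, file paths, and the `process()` MATLAB function are hypothetical, not anything from the original post.

```python
# Minimal sketch of the shared-folder approach, assuming the data
# directory (here /data) is an NFS export mounted at the same path on
# every worker node. Host names, file names, and the MATLAB process()
# function are illustrative placeholders.
import itertools

def dispatch(files, hosts):
    """Pair each data file with a worker host, round-robin."""
    pool = itertools.cycle(hosts)
    return [(next(pool), f) for f in files]

hosts = ["node01", "node02", "node03"]
files = [
    "/data/run001.mat",
    "/data/run002.mat",
    "/data/run003.mat",
    "/data/run004.mat",
]

for host, path in dispatch(files, hosts):
    # In a real setup this line would be executed, e.g. via
    # subprocess.run(["ssh", host, ...]); here we only print the
    # command that would be run on each node.
    print(f'ssh {host} matlab -batch "process(\'{path}\')"')
```

Because every node mounts the same `/data`, the dispatcher only has to send file names, not the gigabyte-sized files themselves; whether this beats a real cluster stack like Rocks depends on how independent the per-file jobs are.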