Comment Who cares? (Score 1) 685


I'm sick of talking about this... who cares what it is? The possibilities:

A) Malware -- another piece of malware on a Windows system... who cares? They deserve it if they use Windows (and by that I mean it was only a matter of time until they caught something else anyway).

B) Virus -- an as-yet-unidentified virus. Once again, who cares? There are millions of these things out there.

C) Symantec Rootkit -- once again, who cares? You've got to be snorting something to think the feds don't have surveillance code in Windows to start with... it's just one more group monitoring us; big deal.

Whichever it turns out to be, I'm just tired of reading about it on every site I frequent, and I'm looking forward to laughing about this later when someone figures out which of the above it was (the answer to which also won't affect me).


Comment Do you REALLY need a NAS? (Score 1) 517

It sounds to me like you need to better define the criteria that actually require a NAS. Whenever you're looking into new equipment, it helps to define the ideal situation in which that equipment would function. Based on what you mentioned, I'd say the ideal here is:

- Zero wait time for transfers
- Portability of the data

At no point did you mention other people needing the data while you're running the tests. Based on those criteria, I would shy away from a NAS. Unless you're willing to shell out a grand for a Linux solution, a 1 TB external drive with an eSATA connection will be ~$900 cheaper and may well perform better against the criteria above. When you're not testing, you can simply plug it into an idle machine so people can access the data over the network and so backups can run.

I'm a big fan of home-brewed Linux RAID NAS solutions; I set up and maintain a few myself for several different organizations, and I regularly transfer 100+ GB dumps of files. However, I would avoid all of this if I could. Keep it simple: it's much easier to just carry the data with you from machine to machine if your files are exceedingly large. After all, hard drives are cheap nowadays.

On another note, if you do insist on a NAS, one thing I've noticed is that file sizes can easily affect transfer rates. The smaller the average file, the larger the per-file overhead is in comparison to the file itself. In other words, if you're transferring 100 MB files and the overhead is a theoretical 1 KB, then the vast majority of the time is spent moving the file; but if you're transferring 1 KB files and the overhead is still 1 KB, then you spend just as much time on the overhead as on the files themselves.
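To make that concrete, here's a toy calculation (shell arithmetic, assuming a fixed 1 KB of per-file overhead, which is just an illustrative number):

```shell
#!/bin/sh
# Fraction of transfer time spent on per-file overhead,
# modeled as overhead / (overhead + filesize). Sizes in KB.
overhead=1        # assumed fixed per-file overhead, 1 KB
big=102400        # one 100 MB file
small=1           # one 1 KB file

echo "100 MB file, overhead share: $(( 100 * overhead / (overhead + big) ))%"    # prints 0%
echo "1 KB file, overhead share:   $(( 100 * overhead / (overhead + small) ))%"  # prints 50%
```

So with tiny files, half your time goes to overhead rather than payload, which is why many-small-files transfers crawl even on fast links.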

One trick I've learned to speed up large transfers of small files is to turn the files into a tar stream on the fly, pipe the output over ssh (using the blowfish cipher), and unpack it on the fly on the other machine as well. This means you're effectively moving one continuous stream across the pipe, so there's less per-file overhead in the total transfer. This trick keeps the transfer rate on my machines steady at around 10-20 MB/s as opposed to 6-7 MB/s, and those numbers are on a standard configuration with no special disks or RAID. I can only expect the numbers to be better with RAID. Just things to think about.
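A sketch of that pipeline (hostname and paths are placeholders; `blowfish` was a supported ssh cipher on systems of that era, though modern OpenSSH has since removed it, so you may need to drop the `-c` flag or pick a current cipher):

```shell
#!/bin/sh
# Stream a directory as one tar archive over ssh; the remote tar
# unpacks it on the fly, so no archive file ever hits a disk.
# "user@nas-host" and the paths are placeholders for your setup.
tar cf - -C /data/testing . | ssh -c blowfish user@nas-host 'tar xf - -C /srv/backup'
```

Adding `z` to both tar invocations (`czf`/`xzf`) compresses the stream as well, which can help further on slow links at the cost of CPU.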
