Degrees's Journal: Backing up OMG number of files - Parallel::ForkManager did well

In my last JE, I asked how to back up a directory structure with way too many subdirectories and files in it. One of the ideas was to run one tar job per top-level directory. Since that approach didn't take any more disk space, the SAN guys wanted me to try it first. I had to learn how to use Perl's Parallel::ForkManager. It worked well.

I have 256 top-level directories, so I forked one tar job per directory. We took the VM that runs the server and configured it for four CPUs, and I configured Parallel::ForkManager to allow only four concurrent forked tar jobs. What was a single 70+ hour tar job became a 30-hour batch of 256 smaller tar jobs. The new scheme runs in about 43% of the time of the old job. That's great.
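For anyone who wants to do the same thing, here's roughly what the script looks like. This is a minimal sketch, not my production script: the paths (/data/archive, /backup/archive) and the tar options are placeholders you'd swap for your own, but the Parallel::ForkManager calls (new, start, finish, wait_all_children) are the module's real interface.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;

# Placeholder paths -- adjust for your environment.
my $src_root    = '/data/archive';     # holds the 256 top-level directories
my $backup_root = '/backup/archive';   # where the per-directory tarballs land

# Allow at most four tar jobs at once, matching the four vCPUs.
my $pm = Parallel::ForkManager->new(4);

opendir my $dh, $src_root or die "Cannot open $src_root: $!";
my @dirs = grep { !/^\.\.?$/ && -d "$src_root/$_" } readdir $dh;
closedir $dh;

for my $dir (@dirs) {
    $pm->start and next;   # parent keeps looping; child falls through

    # Each child runs one tar job for its top-level directory.
    system('tar', '-czf', "$backup_root/$dir.tar.gz",
           '-C', $src_root, $dir) == 0
        or warn "tar failed for $dir: $?";

    $pm->finish;            # child exits here
}

$pm->wait_all_children;     # block until every tar job has finished
```

The nice part is that the concurrency limit is one number: bump the 4 up or down to match however many CPUs you give the VM.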

Thank you for the advice. I'm using it well. :-)
