Degrees's Journal: Backing up OMG number of files - Parallel::ForkManager did well

In my last JE, I asked how to back up a directory structure with far too many subdirectories and files in it. One of the suggestions was to run one tar job per top-level directory. Since that approach didn't take any more disk space, the SAN guys wanted me to try it first. I had to learn how to use Perl's Parallel::ForkManager, and it worked well.

I have 256 top-level directories, so I forked one tar job per directory. We configured the VM that runs the server with four CPUs, and I set Parallel::ForkManager to allow only four concurrent forked tar jobs. What was a single 70+ hour tar job became a 30-hour batch of 256 tar jobs; the new scheme runs in roughly 43% of the time of the old one. That's great.
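The pattern is simple enough to fit in a short script. This is a minimal sketch of the kind of driver described above, not the exact script I ran; the `/data` source and `/backup` destination paths are hypothetical stand-ins:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;

# Hypothetical paths -- substitute your own source and destination.
my $src  = '/data';
my $dest = '/backup';

# Allow at most four concurrent tar jobs, matching the VM's four CPUs.
my $pm = Parallel::ForkManager->new(4);

# Collect the top-level directories (skip . and ..).
opendir( my $dh, $src ) or die "Cannot open $src: $!";
my @dirs = grep { -d "$src/$_" && !/^\.\.?\z/ } readdir($dh);
closedir($dh);

for my $dir (@dirs) {
    $pm->start and next;    # parent: child forked, move to next directory
    # Child: tar one top-level directory, then exit.
    my $rc = system( 'tar', '-cf', "$dest/$dir.tar", '-C', $src, $dir );
    $pm->finish( $rc >> 8 );    # report tar's exit status to the parent
}
$pm->wait_all_children;    # block until every tar job has finished
```

If you want to log which directories failed, Parallel::ForkManager also provides a `run_on_finish` callback that receives each child's exit status as it completes.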

Thank you for the advice. I'm using it well. :-)
