Degrees's Journal: Backing up OMG number of files - Parallel::ForkManager did well
In my last JE, I asked how to back up a directory structure with far too many subdirectories and files in it. One of the suggestions was to run one tar job per top-level directory. Since that approach didn't take any more disk space, the SAN guys wanted me to try it first. I had to learn how to use Perl's Parallel::ForkManager, and it worked well.
I have 256 top-level directories, so I forked one tar job per directory. We took the VM that runs the server and configured it with four CPUs, and I configured Parallel::ForkManager to allow only four concurrent forked tar jobs. What had been a single 70+ hour tar job became a 30-hour batch of 256 tar jobs. The new scheme runs in about 43% of the time of the old job. That's great.
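The pattern described above can be sketched roughly like this. The source and destination paths are hypothetical placeholders, and the tar flags are just one reasonable choice, not necessarily what I used:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;

# Hypothetical paths; substitute your actual source and backup locations.
my $src  = '/data';
my $dest = '/backup';

# Collect the top-level directories to back up.
opendir my $dh, $src or die "Can't open $src: $!";
my @dirs = grep { -d "$src/$_" && !/^\.\.?$/ } readdir $dh;
closedir $dh;

# Cap concurrency at 4 to match the four-CPU VM.
my $pm = Parallel::ForkManager->new(4);

for my $dir (@dirs) {
    $pm->start and next;    # parent: fork a child, move to next directory
    # Child: tar up this one directory, then exit.
    system('tar', '-czf', "$dest/$dir.tar.gz", '-C', $src, $dir);
    $pm->finish;
}
$pm->wait_all_children;     # block until all 256 jobs are done
```

The nice part is that `new(4)` does all the throttling: `start` blocks whenever four children are already running, so the batch never oversubscribes the CPUs.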
Thank you for the advice. I'm using it well.
Nice. (Score:2)
This gives me an idea about some slow file transfers...
Re: (Score:2)
jdownloader