
Zarf's Journal: Behold the power of pipe 3
tar -czf - [list of dirs and files] | \
ssh [user]@[server] "cd /path/to/place/files; tar -xzf -"
I learned this one from one of my bosses a few jobs back. There are a few unixy tricks in that command that took me a while to grasp. I've been using this trick for years now and I thought perhaps you would like to see it.
You create a tar archive and specify the file "-", which means standard output. Then you list the dirs and files you wish to grab. Since the compressed tar stream is being sent to stdout, you can catch it with a pipe "|" and send it to ssh (the "\" is just a way of breaking a command across lines).
With ssh we log into the server as the given user and ask ssh to run a command for us. Since the ssh session starts in the user's home dir, we first change to the target path, then run tar. This time we are extracting the archive we caught with the pipe, again by specifying the file "-" to tar, which on the extracting side means standard input.
The result is that the files get tarred, compressed, and shot over the network to a remote machine, which immediately decompresses and untars them, dropping the files in place. We let tar do all the work of error handling and dealing with non-regular files, compression is optional, and there are no archives left over.
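For instance, copying a couple of directories might look like this (the hostname and paths here are just stand-ins):

tar -czf - ./photos ./notes | \
ssh alice@backup.example.com "cd /srv/incoming; tar -xzf -"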
If, however, we wanted to create a remote archive without ever consuming storage on the local machine we could instead do this:
tar -czf - [list of dirs and files] | \
ssh [user]@[server] "cat > /path/to/place/file.tar.gz"
If processing power is limited on the local machine, we can instead let the remote end do the compression:
tar -cf - [list of dirs and files] | \
ssh [user]@[server] "gzip -c > /path/to/place/file.tar.gz"
ssh [user]@[server] "cd
I learned this one from one of my bosses a few jobs back. There are a few unixy tricks in that command that took me a while to grasp. I've been using this trick for years now and I thought perhaps you would like to see it.
You create a tar archive and specify the file "-" which is your standard out. Then you list the dirs and files you wish to grab. Since the compressed tar is being sent to stdout you can catch it with a pipe "|" and send it to ssh (the "\" is just a way of breaking a command between lines).
With ssh we log into a server as a given user. Then we ask ssh to run a command for us. Since the ssh session will login to the machine at the user's home dir we ask the command session to change to a path, then run the tar command. This time we are extracting the tar archive we caught with pipe by specifying the file "-" to the tar command.
The result is that the files get tarred, compressed, and shot over the network to a remote machine which then immediately untarrs decompresses and drops the files in place. We let tar do all the work of error handling and handling non-regular files, compression is optional, and there are no archives left over.
If, however, we wanted to create a remote archive without ever consuming storage on the local machine we could instead do this:
tar -czf - [list of dirs and files] | \
ssh [user]@[server] "cat >
If processing power was limited on the local machine then we could do this:
tar -cf - [list of dirs and files] | \
ssh [user]@[server] "gzip -c >
You rock (Score:1)
Rsync is your friend (Score:2)
Yeh, it's kind of cute that tar works on stdin/stdout :). However, in the last couple of years I've come to depend on rsync at times like this:
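Something along these lines (the host and paths are just placeholders):

rsync -av ~/Music/ user@laptop:Music/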
Here the Music folder on my desktop is sync'd with the laptop. It's also very nice for moving code around.
Re:Rsync is your friend (Score:2)
Only this set of copies was failing on certain files. Tar keeps going; rsync stops (if I recall correctly... we tried scp and rsync first and got failures due to special files). Also, we are able to compress the files before sending them over the network, presumably saving bandwidth.
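For what it's worth, ssh can also compress the stream itself with its -C flag, so you can skip gzip in the pipeline entirely:

tar -cf - [list of dirs and files] | \
ssh -C [user]@[server] "tar -xf -"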
I'd actually seen this type of command the first