Zarf's Journal: Behold the power of pipe

tar -czf - [list of dirs and files] | \
    ssh [user]@[server] "cd /path/to/place/files; tar -xzf -"

I learned this one from one of my bosses a few jobs back. There are a few unixy tricks in that command that took me a while to grasp. I've been using this trick for years now and I thought perhaps you would like to see it.

You create a tar archive and specify the file "-", which is standard output. Then you list the dirs and files you wish to grab. Since the compressed tar stream is being sent to stdout, you can catch it with a pipe "|" and send it to ssh (the "\" is just a way of breaking a command across lines).

With ssh we log into a server as a given user. Then we ask ssh to run a command for us. Since the ssh session will log in to the machine at the user's home dir, we ask the command to change to a path first, then run tar. This time we are extracting the tar archive we caught with the pipe, again by specifying the file "-" to the tar command, which here means standard input.

The result is that the files get tarred, compressed, and shot over the network to a remote machine, which immediately decompresses, untars, and drops the files in place. We let tar do all the work of error handling and dealing with non-regular files, compression is optional, and there are no archives left over.
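You can try the same tar-to-tar pipe locally, without ssh, to see the stream in action. This is just a sketch using throwaway paths under /tmp (the directory names are made up for illustration):

```shell
# build a small sample tree to copy
mkdir -p /tmp/pipe-demo/src/sub /tmp/pipe-demo/dest
echo "hello" > /tmp/pipe-demo/src/sub/file.txt

# stream a compressed tar to stdout, catch it with a pipe,
# and extract it into the destination directory
tar -czf - -C /tmp/pipe-demo src | tar -xzf - -C /tmp/pipe-demo/dest

# the tree now exists under the destination
cat /tmp/pipe-demo/dest/src/sub/file.txt
```

Swapping the second tar for the ssh command from above is the only difference between this local version and the network one.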

If, however, we wanted to create a remote archive without ever consuming storage on the local machine we could instead do this:

tar -czf - [list of dirs and files] | \
    ssh [user]@[server] "cat > /path/to/place/file.tar.gz"

If processing power is limited on the local machine, we could instead skip the -z flag and let the remote end do the compressing:

tar -cf - [list of dirs and files] | \
    ssh [user]@[server] "gzip -c > /path/to/place/file.tar.gz"



Comments:
  • I understand those commands, even without your explanation. I just would never have thought of them.
  • Yeah, it's kind of cute that tar works on stdin/stdout :). However, in the last couple of years I've come to depend on rsync at times like this:

    alias rms='rsync --progress -Pave ssh'
    alias rds='rsync --delete --progress -Pave ssh'

    rds Music laptop:

    Here the Music folder on my desktop is sync'd with the laptop. It's also very nice for moving code around.

    • Yes, now that I think about it rsync would do nearly the same thing. But there are good reasons not to use rsync. (One of them being that neither box has rsync installed.)

      But this particular set of copies was failing on certain files. Tar keeps going; rsync stops (if I recall correctly, we tried scp and rsync first and got failures due to special files). Also, we were able to compress the files before sending them over the network, presumably saving bandwidth.

      I'd actually seen this type of command the first
