Bash Cookbook

Chad_Wollenberg writes "Anyone who has used a derivative of Unix over the past 20 years has used Bash, which stands for Bourne Again Shell. The geek in all of us makes us want to extend our ability to rule the command line. To truly master a Unix environment, you need to know a shell, and Bash is easily the most popular of them. Any Unix/Linux/BSD administrator knows that the power at your fingertips is fully extended by what you can do within the Bash environment, and all of us need the best recipes to get the job done." Keep reading for the rest of Chad's review.
Bash Cookbook
author: Carl Albing, JP Vossen, Cameron Newham
pages: 598
publisher: O'Reilly
rating: 9
reviewer: Chad Wollenberg
ISBN: 978-0-596-52678-8
summary: A good book for intermediate and above users of Bash
Enter Bash Cookbook. Aptly named for the series of O'Reilly books that gives you valuable information on subjects in the form of recipes, this book was refreshing in that it was properly organized and surprisingly contemporary, even citing virtualized platforms as a way to try out different OSes for Bash. The book does a good job of pointing out the different operating systems that run Bash, even citing Cygwin for Windows. The authors also stick to the POSIX standard, so that all of the examples are portable across platforms.

Bash Cookbook is by no means for the faint of heart. The book is meant for intermediate and above users of Bash. However, the first several chapters do a significant job of reviewing basic concepts of Bash navigation and combining simple commands. The book quickly changes gears to complex statements on how to get things done in Bash.

By Chapter 7, Bash Cookbook extends beyond Bash itself and begins combining the power of bash scripting with useful commands such as grep, awk, and sed. To quote the authors, "if our scripting examples are going to tackle real-world problems, they need to use the wider range of tools that are actually used by real-world bash users and programmers." And that is exactly what they do. This chapter alone gave me the ability to do more in the command-line environment simply by explaining the functions of the scripts put forth. That is something that any reader, intermediate to expert, can take from this book. The detailed explanations really do give everyone the ability to learn something about the commands, and the references to additional resources often led me to the computer, looking up further details.
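A hypothetical recipe in the spirit of that chapter, combining awk, sed, and the usual sort/uniq pipeline (the sample data is inlined here so the sketch is self-contained; it is my own illustration, not one of the book's recipes):

```shell
# Tally which login shells appear in /etc/passwd-style records.
printf '%s\n' \
  'root:x:0:0:root:/root:/bin/bash' \
  'daemon:x:1:1:daemon:/usr/sbin:/bin/sh' \
  'alice:x:1000:1000:Alice:/home/alice:/bin/bash' > passwd.sample

awk -F: '{print $7}' passwd.sample |   # extract the login-shell field
  sed 's|.*/||' |                      # strip the directory part
  sort | uniq -c | sort -rn            # count occurrences of each shell
```

The point of the pipeline style is that each tool does one small transformation, and the shell glues them together.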

I found Chapter 11 to be very useful (pun intended), finally grasping some concepts of the find command that had previously escaped me. From Chapter 12 on, the book focuses on writing useful and complex scripts. This is where the book really begins to shine for the Unix enthusiast and system administrator. The scripts found in Chapter 12, and their elaborate descriptions, begin to show the true power of Bash scripting and how much you can automate. Chapter 14 is about securing your scripts, and is a heavy read, but well worth it for any administrator who would be using their scripts in a production environment.
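A hedged illustration of the kind of find recipe being described, selecting files by name, size, and emptiness (the demo/ tree and its contents are invented for the example; -size's k suffix is a GNU find convenience):

```shell
# Build a small tree, then use find's tests to select files.
mkdir -p demo/src demo/logs
printf 'x%.0s' {1..2048} > demo/logs/big.log   # a 2 KB file
touch demo/logs/empty.log demo/src/main.sh

find demo -type f -name '*.log'   # select by name
find demo -type f -size +1k       # select by size (over 1 KiB)
find demo -type f -empty          # select empty files
```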

Just when you think this book has reached its limits, it gives very handy examples in Chapter 16 on how to configure and customize Bash. It also goes into common mistakes made by the novice user. Combine all of that with the appendices for quick reference, and this book has not left my side since it arrived. While I would not recommend this book for the novice user, I would recommend it to any system administrator who has to work with Unix or Linux. If nothing else, the examples given here are full of good, reusable code to make tasks easier in your day-to-day work. Well done.
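The sort of Chapter 16 customization the review praises might look like this sketch (the alias names, history settings, and prompt layout are my own choices, not recipes copied from the book):

```shell
# Candidate lines for ~/.bashrc -- each is a common convenience.
alias ll='ls -lh'                # long listings with human-readable sizes
shopt -s histappend              # append to the history file, don't overwrite it
export HISTCONTROL=ignoredups    # skip consecutive duplicate history entries
export HISTSIZE=10000            # keep plenty of history
PS1='\u@\h:\w\$ '                # user@host:cwd prompt
```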

You can purchase Bash Cookbook from Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
This discussion has been archived. No new comments can be posted.

  • by gardyloo ( 512791 ) on Wednesday August 13, 2008 @02:06PM (#24587077)

    I may even buy the book based on the review.

    Leaving aside stuff like "not for the feint of heart," which is just poor editing, what the hell does "I found Chapter 11 to be very useful (pun intended)" mean?

          Maybe it's the ultimate meta-pun, where there was no pun in the first place, but the author pointed out that one was intended, so one was slipstreamed into the statement.

  • by khasim ( 1285 ) on Wednesday August 13, 2008 @02:11PM (#24587187)

    It's annoying for those of us who already own "sed and awk" and buy a book that is supposedly about Bash scripting, only to find out that the advanced section replicates what is in the sed and awk book.

    Not that it's a bad book. I just believe that it should have been more focused on Bash only scripting.

    If you want to learn about sed and awk, buy the sed and awk book. If you want to learn Bash scripting, there are a LOT of more useful sites online.

  • Largest BASH script? (Score:1, Interesting)

    by Anonymous Coward on Wednesday August 13, 2008 @02:15PM (#24587245)

    I wonder what is the largest BASH script ever made?

    7000 lines of BASH code: []

  • by al0ha ( 1262684 ) on Wednesday August 13, 2008 @02:25PM (#24587451) Journal
    Bash is cool and I suppose this book is decent, though I've found UNIX for Programmers and Users to be the most useful as it covers Bash, SH and KSH. KSH rocks!
  • by ylikone ( 589264 ) on Wednesday August 13, 2008 @02:30PM (#24587549) Homepage
    This book covers the GNU Bourne Again Shell, which is a member of the Bourne family of shells that includes the original Bourne shell sh, the Korn shell ksh, and the Public Domain Korn Shell pdksh. This book is for anyone who uses a Unix or Linux system, as well as system administrators who may use several systems on any given day. Thus, there are solutions and useful sections for all levels of users including newcomers. This book is full of recipes for creating scripts and interacting with the shell that will allow you to greatly increase your productivity.

    Chapter 1, "Beginning bash" covers what a shell is, why you should care about it, and then the basics of bash including how you get it on your system. The next five chapters are on the basics that you would need when working with any shell - standard I/O, command execution, shell variables, and shell logic and arithmetic. Next there are two chapters on "Intermediate Shell Tools". These chapters' recipes use some utilities that are not part of the shell, but which are so useful that it is hard to imagine using the shell without them, such as "sort" and "grep", for example. Chapter nine features recipes that allow you to find files by case, date, type, size, etc. Chapter 10, "Additional Features for Scripting" has much to do with code reuse, which is something you find even in scripting. Chapter 11, "Working with Dates and Times", seems like it would be very simple, but it's not. This chapter helps you get through the complexities of dealing with different formats for displaying the time and date and converting between various date formats.
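The date-format juggling that Chapter 11 covers can be sketched with GNU date (note: the -d flag is a GNU coreutils extension and will not work as-is with BSD/macOS date; the dates chosen are arbitrary):

```shell
# Convert between human-readable, ISO 8601, and epoch representations.
date -d 'Aug 13 2008' +%Y-%m-%d          # human-readable -> ISO 8601
date -u -d '2008-08-13 00:00 UTC' +%s    # ISO 8601 -> seconds since the epoch
date -u -d @1218585600 +%Y-%m-%d         # epoch timestamp -> ISO 8601
```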

    Chapter 12, "End-User Tasks As Shell Scripts", shows you a few larger though not large examples of scripts. They are meant to give you useful, real world examples of actual uses of shell scripts beyond just system administration tasks. Chapter 13, "Parsing and Similar Tasks", is about tasks that will be familiar to programmers. It's not necessarily full of more advanced scripts than the other recipes in the book, but if you are not a programmer, these tasks might seem obscure or irrelevant to your use of bash. Topics covered include parsing HTML, setting up a database with MySQL, and both trimming and compressing whitespace. Chapter 14 is on dealing with the security of your shell scripts. Chapters 15 through 19 finish up the book starting with a chapter on advanced scripting that focuses on script portability. Chapter 16 is related to the previous chapter on portability and is concerned with configuring and customizing your bash environment. Chapter 17 is about miscellaneous items that didn't fit well into any other chapter. The subjects include capturing file metadata for recovery, sharing and logging sessions, and unzipping many ZIP files at once. Chapter 18 deals with shortcuts aimed at the limiting factor of many uses of bash - the typing speed of the user and shortcuts that cut down on the amount of typing necessary. The final chapter in the book, "Tips and Traps", deals with the common mistakes that bash users make.
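The whitespace trimming mentioned for Chapter 13 can be sketched two ways, with parameter expansion and with tr (my own illustration, not the book's recipe):

```shell
s='   some padded text   '

# Trim leading and trailing whitespace using pure parameter expansion.
t=${s#"${s%%[![:space:]]*}"}   # strip the leading whitespace run
t=${t%"${t##*[![:space:]]}"}   # strip the trailing whitespace run
echo "[$t]"

# Compress internal runs of spaces with tr -s.
echo "$s" | tr -s ' '
```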

    All in all this is a very handy reference for a vast number of the tasks that you'll come across when scripting with the bash shell along with well-commented code. Highly recommended.

  • by daedae ( 1089329 ) on Wednesday August 13, 2008 @02:36PM (#24587669)

    True story:

    A guy I go to school with (I'm in CS, he's in physics) used to talk about how he was a Bash wizard. Since he was generally talking about writing scripts to submit jobs to a PBS-based cluster, I assumed he meant just in terms of rapidly submitting a large variety of jobs. One day he complains to me that his simulations were slow and wanted me to look at it with him to help him speed them up. So I say fine, send me the file...

    He'd written a particle physics Monte Carlo sim using Bash and Linux command line tools (in particular, there were calls to bc everywhere).

  • by Tetsujin ( 103070 ) on Wednesday August 13, 2008 @03:13PM (#24588255) Homepage Journal

    The lack of structured data and live objects is a feature, not a bug. The fact that everything is a string, and that everything can be piped between all the different commands means that you can string together commands in new and exciting ways that nobody ever thought was possible. Making all the commands pass around different types of objects means that all the other commands have to be aware of all these other datatypes, and have to know how to handle them.

    This is true of textual data as well - you're simply glossing over the complexity of serializing and re-parsing any non-trivial data structure in textual form... You've ignored the fact that for any two tools to work together, their assumptions about the structure with which data is encoded over the stream have to match.

    See, if your text-encoded data is simple enough that you can simply choose a character as a field delimiter and another as a record separator, it's easy to split your data into individual records and fields again. It gets a little harder if you want to provide for the possibility that your records may actually contain the delimiter characters (then you're into parsing - at least enough parsing to distinguish between an escaped character and an unescaped one). Setting up the format so you can really encapsulate anything - that reaches the point where it's worth having a tool whose only job is interfacing with this format...
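The escalation described above can be seen in a couple of lines: naive delimiter splitting breaks as soon as a field contains the delimiter (the CSV-ish record below is an invented example):

```shell
# A record whose third field contains the delimiter itself.
record='Smith,John,"1 Main St, Apt 2"'

# Naive split on commas tears the quoted field in two; a real fix
# needs actual parsing of the quoting/escaping rules, as noted above.
IFS=',' read -r f1 f2 f3 f4 <<< "$record"
echo "f3=$f3"   # the address field arrives truncated
echo "f4=$f4"   # ...and its remainder lands in the next field
```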

    The problem here is that normally when you want to interpret some encoded form of a piece of data, you first translate it into something that's easier to work with. But in BASH (and most CLI shells, I believe) there is no "more convenient form" to which you can readily translate any moderately complex structure. (Consider, for instance, how you would implement an XML parser for Bash - as an external command it simply wouldn't work... it'd have to be done as a Bash plug-in module, and even then its capabilities would be limited.) And then suppose you want to filter the parsed data and pass it to another process? You've got to re-encode it, and then the next process has to know how to decode this encoding (probably still XML) as well...

    I contend that the "encode everything as a string" mentality was an asset, due to computer limitations in the time period in which the convention started - but these days I think it's pretty limiting.

    If you want something with objects and structured data in the shell, then there's MS PowerShell. But maybe there's a reason it hasn't caught on yet.

    I think Powershell has been progressing (in terms of popularity, I mean) quite satisfactorily... The reason it hasn't caught on with me, however, is 'cause I want a Unix solution (that is, runs on Unix, and fairly Unix-styled), not a Windows one - and while I agree with the logic behind some of their design decisions (like the verb-noun convention for command names) I don't like the consequences (the language is much too verbose for my tastes...)

    I think it's a step in the right direction but not quite what I'm looking for.

    Assuming Powershell hasn't been embraced and taking that as a sign that one particular facet of its design was a bad idea is pretty laughable. Any new tool takes time to be adopted - and Powershell is a tool for a fairly small niche - CLI users on Windows.

  • by dkixk ( 18303 ) on Wednesday August 13, 2008 @06:20PM (#24591211)

    For this reason, I wish that things like the zoidberg shell [] would hurry up & mature. (Yeah, yeah, I would work on it myself except I would probably be about as useful as the namesake Dr. Zoidberg is as a doctor.)

    "Ask not what you can do for your shell, ask what your shell can do for you!"

    I've always avoided shell scripting as much as possible. Here is an example of the kind of thing that drives me nuts about shell.

    $ touch "file 1" "file 2" foo bar
    $ for i in *; do touch "$i.titles"; done
    $ ls
    bar bar.titles file 1 file 1.titles file 2 file 2.titles foo foo.titles
    $ grep -l beer `find . -name \*.titles -prune -o -print`
    grep: ./file: No such file or directory
    grep: 2: No such file or directory
    grep: ./file: No such file or directory
    grep: 1: No such file or directory
    $ find . -name \*.titles -print -o -exec grep -l beer {} /dev/null \;
    ./file 2.titles
    ./file 1.titles

    One needs to remember to quote the argument to touch in the for loop because of the spaces in some of the names. However, when using find with backticks, the shell's word splitting treats the spaces in the file names as delimiters in the list of results. An alternative version of the find command works, but only if one remembers to use the latter instead of the former.
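For what it's worth, a common fix for the word-splitting trap described above is to NUL-delimit the file names instead of relying on whitespace (a sketch that recreates files like the ones above; -print0 and xargs -0 are GNU/BSD extensions, not POSIX):

```shell
# NUL-delimited names survive embedded spaces: -print0 terminates each
# name with NUL, and xargs -0 splits only on NUL.
mkdir -p duff && cd duff
touch 'file 1' 'file 2' foo bar
for i in *; do echo beer > "$i.titles"; done

find . -name '*.titles' -print0 | xargs -0 grep -l beer
```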

    The idea behind Zoidberg is not new or unique. From the copyright for Zoidberg, it seems that it was first released in 2003. However, scsh [], which is to Scheme what Zoidberg is to Perl, seems to have been around since at least 1994. And one should remember Genera [], an operating system in which everything was Lisp from the ground up, which dates back to the early '80s. A Lisp listener was a "shell", from what I can tell, having only a secondhand familiarity with it.

    But I personally like having the combination of Emacs and inferior processes, especially SLIME. With it, I'm able to switch between the Emacs GUI, Emacs Lisp commands, Emacs key commands, Common Lisp, or even just a standard bash shell, all from within one environment. So I can use the shell-command approach (returning one big string as a list of files, with the same problem of file names with spaces looking like separate elements of the list), Emacs Lisp code, or even Common Lisp code.

    LAMP-USER> !$(find /home/dkick/tmp/duff -name *\.titles -prune -o -print)
    /home/dkick/tmp/duff/file 2
    /home/dkick/tmp/duff/file 1
    LAMP-USER> (remove "titles" (list-directory "/home/dkick/tmp/duff/") :test #'equal :key #'pathname-type)
    (#P"/home/dkick/tmp/duff/bar" #P"/home/dkick/tmp/duff/file 1"
    #P"/home/dkick/tmp/duff/file 2" #P"/home/dkick/tmp/duff/foo")

  • by Just Some Guy ( 3352 ) <> on Wednesday August 13, 2008 @10:16PM (#24593625) Homepage Journal

    You don't have tab-completion enabled. I don't mean as in complete-the-filename but as in complete-the-command-arguments. For example, in Bash and Zsh with completion enabled, you can enter ifconfig <tab> and it will complete the interface name, or ssh -o<tab> to get a list of options that can be set (and then go on to tab-complete their possible values).

    I will never again voluntarily use a shell that doesn't support this.
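A minimal sketch of the programmable completion being described (the deploy command and its target names are hypothetical; complete and compgen are bash builtins):

```shell
# Register argument completion for a hypothetical "deploy" command:
# after sourcing this, "deploy sta<Tab>" completes to "staging".
_deploy_targets() {
    local cur=${COMP_WORDS[COMP_CWORD]}
    COMPREPLY=( $(compgen -W 'staging production' -- "$cur") )
}
complete -F _deploy_targets deploy

# compgen can also be run directly to preview what would be offered:
compgen -W 'staging production' -- sta
```

Real-world completions (for ssh, ifconfig, and friends) ship with the bash-completion package and work the same way, just with more elaborate generator functions.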

  • Weakest. Pun. Ever. (Score:2, Interesting)

    by Wowlapalooza ( 1339989 ) on Wednesday August 13, 2008 @10:34PM (#24593829)

    "I found Chapter 11 to be very useful (pun intended) finally grasping some concepts on the find command that have previously escaped me"

    I think the Chapter 11 = bankruptcy folks have it completely right; the reviewer's attempt at humor was completely bankrupt here.
