Linux Commands, Editors, & Shell Programming 118

norburym writes "Mark G. Sobell is well known for several comprehensive and well-written volumes: A Practical Guide to Solaris; A Practical Guide to Red Hat Linux: Fedora Core and Red Hat Enterprise Linux (2nd ed.); and A Practical Guide to the Unix System (3rd ed.). It seems only natural for the author to follow these exceptional examples with yet another excellent book entitled A Practical Guide to Linux Commands, Editors, and Shell Programming. Read on for norburym's review.
A Practical Guide to Linux Commands, Editors, and Shell Programming
Author: Mark G. Sobell
Pages: 1008
Publisher: Prentice Hall PTR
Rating: 9
Reviewer: Mary Norbury-Glaser
ISBN: 0131478230
Summary: Linux Commands, Editors, & Shell Programming


While the author has covered some aspects of this material in A Practical Guide to Red Hat Linux: Fedora Core and Red Hat Enterprise Linux, there is more than enough here (in-depth coverage of the vim and emacs editors; tcsh; sed and gawk; and the command reference section that comprises Part V) to make this a new and exhaustive volume that should be top dog in anyone's Linux library.

Sobell splits the book into six parts, with Chapter 1 acting as a preface to the rest of the book. It gives a history and an overview of Linux and discusses distinctive aspects of the operating system that make it different from others. We've all heard and read the arguments before: Linux is superior to Windows, TCO is lower with Linux, Linux is not proprietary, etc., but Sobell avoids this display of arrogance and superiority by treating the origins and features of Linux as an evolution of best practices and common sense. As such, we're not left with a suspicion that the author has blinders on. On the contrary, the reader can proceed with an open mind to learning the intricacies of Linux and the command line.

Part I isn't geared toward experienced users of Unix or Linux, but it does serve as a good skimming point for sysadmins who may need to brush up. For the beginner or the novice, however, these four chapters give a compact and succinct introduction to using Linux and set the stage for the sections to follow. Chapter 2 begins with logging in from a terminal, including emulation, ssh and telnet. The author explains how to tell which shell the user is running; how to work with the shell; and how to use help, man and info.

Chapter 3 is a catalog of basic utilities with all the usual suspects: ls, cat, rm, cp, mv, grep, hostname, head, tail, sort, echo, date, etc.; compressing and archiving tools: bzip2, bunzip2, gzip, tar; locating commands: which, whereis, apropos, slocate; commands used to get user and system information: who, finger, w; and commands used for communication: write, mesg. Sobell gives each utility a brief but thorough description of its function, appropriate syntax and practical uses. Chapter 4 is a complete treatment of the Linux hierarchical filesystem: directory and ordinary files; absolute and relative pathnames; how to work with directories; hard and symbolic links; and access permissions.

Chapter 5 is where the reader gets a closer look at the shell. Sobell covers command line syntax (command name, arguments and options), processing and executing the command line, standard input and output (including pipes and a really nice explanation of device files), redirecting standard input and output, running a program in the background and aborting it using kill, generating filenames and expanding pathnames using wildcards/special characters, and utilities built into the shell like echo (and how to list bash and tcsh builtins).
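The Chapter 5 material lends itself to a quick illustration. Here is a minimal sketch of pipes, output redirection, and a background job killed by PID; the filenames are invented for the example, not taken from the book:

```shell
# pipe the output of one utility into another, then redirect stdout to a file
echo "banana apple cherry" | tr ' ' '\n' | sort > fruits.txt

# run a program in the background; $! holds the PID of the last background job
sleep 30 &
kill "$!"        # abort it with kill

# generate filenames with wildcards/special characters
ls fruits.*      # the shell expands this to fruits.txt
```

After the pipeline, fruits.txt contains the three words sorted one per line, starting with "apple".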

Part I is a comfortable read. It moves along quickly and with quite a bit of information but not so much to overwhelm. By the conclusion of Chapter 5, the beginner or novice can feel pretty competent with the CLI.

Part II is dedicated to the vim editor and the emacs editor, both enjoying a chapter to themselves. Sobell happily avoids adding fuel to the already flaming fire of which editor is "the best." Chapter 6, "The vim Editor," and Chapter 7, "The emacs Editor," both use a tutorial approach to demonstrate the use of each text editor. The author includes a brief history of the development of the editor before giving a fairly complete lesson on creating and editing files within that particular editor. Some highlights of Chapter 6 include: vi clones; details of vim commands like join, yank and put; and advanced editing techniques like using markers and creating macros. Chapter 7 features: an explanation of emacs key notation and key sequences; incremental and nonincremental searches; advanced editing techniques like using the buffer to undo changes; using Mark and establishing a Region; yanking killed text; and manipulating windows (splitting and adjusting, for example).

Learning at least one editor to a level of competency is an absolute must. Sobell provides excellent instruction on both vim and emacs, and along with the tutorials and the exercises at the conclusion of each chapter, the reader will be sufficiently proficient in both to choose a favorite.

Part III, "The Shells," discusses the Bourne Again Shell (bash) and the TC (tcsh) shell with careful detail to each interpreter/language. The author stresses that bash, rather than tcsh, should be the shell of choice for programming and this is reflected in the instruction set forth in each of these two chapters.

Chapter 8 concentrates on bash: shell basics (startup files, redirecting standard error, simple shell scripts, separating and grouping commands, job control, directory stacks); parameters and variables (shell and user-created variables, variable attributes, keyword variables, special characters); processes (structure, identification); history mechanisms (reexecuting and editing commands, referencing events using !, use of the Readline Library); using aliases; shell functions; controlling bash features and options (using command line options and the set and shopt builtins); and a description of how bash processes the command line (command line expansion).
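A couple of the Chapter 8 topics (variable attributes, controlling features with builtins) fit in a few lines. This sketch assumes bash:

```shell
# variable attributes via declare: -i makes assignments evaluate arithmetically
declare -i count=2+3
echo "$count"          # prints 5, not the string "2+3"

# control bash features and options with the set and shopt builtins
shopt -s nullglob      # unmatched globs expand to nothing instead of themselves
set -o noclobber       # make > refuse to overwrite an existing file
```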

The TC Shell (tcsh) gets equal attention in Chapter 9. The author aims to show how tcsh differs from bash while providing a broad overview of the shell: shell scripts; entering and leaving tcsh; tcsh startup files; features common to bash and tcsh (and how tcsh implements them in a different manner) including command line expansion (tcsh calls it "substitution"), history, aliases, job control, filename and command substitution, and directory stack manipulation; redirecting standard error using >&; command line (word completion, command line editing, spell correction); variables (substitution, string variables, arrays, numeric variables, using braces, shell variables); control structures (if and goto, interrupt handling using onintr, if...then...else, foreach, while, break and continue, switch); and tcsh builtins.
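One concrete difference the chapter covers: bash can redirect standard output and standard error independently, while tcsh merges them with >&. A small sketch (the bash lines are runnable; the tcsh equivalent is shown as a comment, and the path is deliberately nonexistent):

```shell
# bash: stdout and stderr go to separate files
ls /no/such/path > out.txt 2> err.txt || true   # ls fails; ignore its exit status

# tcsh: >& sends both streams to one file
#   ls /no/such/path >& both.txt
```

Here err.txt receives the error message and out.txt is left empty.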

Part IV, "Programming Tools," is the logical progression from the previous discussions of editors and shell basics. Sobell splits this part over four topics: programming tools, programming bash, gawk and sed.

The focus of Chapter 10 is programming tools. In particular, attention is given to writing and compiling C programs. Sobell shows how to check for your GNU gcc compiler and then gives a programming example: a simple C program that converts tabs to spaces while maintaining columns. He takes this a step further by compiling his example C program to create an executable file. He also addresses shared libraries, fixing broken binaries, using GNU make to resolve dependencies, debugging techniques, threads, and system calls for filesystem operations and for process control. I especially like the inclusion of the make utility. Sobell provides a nice graph that shows dependency relationships and uses an example makefile to illustrate dependency lines and construction commands. The rest of the chapter deals with source code management and using the CVS (concurrent versions system) utility and TkCVS (a Tcl/Tk-based GUI to CVS).
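The dependency-line idea that Sobell diagrams can be sketched in a tiny makefile. The target and file names here are hypothetical, echoing the tabs-to-spaces example rather than reproducing the book's makefile:

```make
# dependency line: target, then prerequisites; make rebuilds the target
# whenever a prerequisite is newer. The indented construction commands follow.
tabs2spaces: tabs2spaces.o
	gcc -o tabs2spaces tabs2spaces.o

tabs2spaces.o: tabs2spaces.c
	gcc -c tabs2spaces.c
```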

The next chapter is a return to bash with more detail on shell programming. The author uses this section to cover control flow constructs (if...then, if...then...else, etc.); file descriptors; more detail on parameters and variables (array variables, locality of variables, special parameters like $$ and $?, positional parameters like $#, $0 and $1-$n); expanding null and unset variables; bash builtin commands (type, read, exec, kill, etc.); and expressions (including a table of bash operators). The chapter concludes with the creation of two example shell programs: a recursive shell script that illustrates recursive operations and a "quiz" shell script which presents questions with multiple choice answers. The author walks through both of these step-by-step and points out potential pitfalls as he creates and executes a working design. Sobell should be congratulated for putting together a well-balanced and complete chapter. The exercises are thoughtfully constructed.
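The constructs listed above can be sketched in a few lines of bash. This function and its name are illustrative, not taken from the book's example scripts:

```shell
#!/bin/bash
# if...then...else with the special parameter $# and positional parameter $1
greet() {
  if [ "$#" -eq 0 ]; then
    echo "usage: greet name"
    return 1
  else
    echo "hello, $1"
  fi
}

greet world            # prints: hello, world
echo "exit status: $?" # $? holds the status of the last command (0 here)
```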

The Gnu awk (gawk) utility and the sed (stream editor) utility complete the final two chapters of the book. Both chapters include syntax, arguments, options and a fair number of examples.
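Both utilities reward even a one-line example. A sketch, with a made-up data file (on most Linux systems awk is GNU gawk):

```shell
# two records, two fields each
printf 'alice 30\nbob 25\n' > ages.txt

# sed: apply a substitution to every line of the stream
sed 's/alice/ALICE/' ages.txt

# awk/gawk: sum the second field across all records
awk '{ total += $2 } END { print total }' ages.txt   # prints 55
```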

Part V is the command reference section, and this constitutes a volume in itself. It is, essentially, a printed version of the man pages of utilities and shell builtins. Sobell gives us a bonus above the Linux man pages, though: he includes extremely useful and pithy examples with each entry, along with interesting discussion and notes sections. I would love to see the "Command Reference" as an electronic, searchable version! Perhaps a future edition could include it as a CD instead of in print.

The Appendixes make up Part VI. Regular expressions used by gawk, sed, vim, emacs, etc. are described in Appendix A. Help options, including Web sites for documentation on Linux systems (GNOME, GNU, KDE, etc.), Linux newsgroups and mailing lists, software packages (CVS, Freshmeat, Sourceforge, Tucows-Linux, etc.), office suites (AbiWord, KOffice, OpenOffice, etc.), and how to specify a terminal make up Appendix B. The last appendix shouldn't be overlooked: Keeping the System Up-To-Date. This section describes yum, Apt and BitTorrent. Kudos to the author for reminding readers to maintain their systems and providing good instructions on how to do so.

A Glossary of terms and the Index conclude the book.

The layout of the book is well designed: the typography is comfortable to read and, although physically hefty, the dimensions of the book give the reader a nicely balanced paperback. Nothing fancy but excellent quality and eminently readable with delineated examples and good font choices.

Every chapter begins with a brief introduction and ends with a chapter summary, exercises, and advanced exercises where appropriate. The exercises are a highlight of the book: Sobell has obviously given these a lot of thought, and they are exemplary of the chapter topics that they reference. Answers to even-numbered problems can be found at the author's Web site.

Overall, it's hard to find anything to complain about here that wouldn't sound inconsequential and trifling. No mistake, this is a big book: Part V alone (command reference) is a volume in itself. But I can't see anything extraneous or non-essential here. The author combines all the important features and tools together with the appropriate and necessary references.

Sobell has compiled an extensive volume that both newcomers to Linux and experienced users will find extremely useful. Once in hand, A Practical Guide to Linux Commands, Editors, and Shell Programming becomes not only a complete tutorial but also an invaluable resource that will be referenced time and again. This is as close to a textbook as you can get without being tormented by dry sterile language; Mark G. Sobell clearly has a command of his subject and he exudes a passion that infuses his writing and clearly elevates this book above any mere manual. This will become a standard and as such, a "must have" for anyone serious about learning command line scripting.


You can purchase A Practical Guide to Linux Commands, Editors, and Shell Programming from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
  • by MyIS ( 834233 ) on Wednesday October 26, 2005 @02:24PM (#13883333) Homepage
    Sobell happily avoids adding fuel to the already flaming fire of which editor is "the best."

    /me secretly winks at author for placing the vim chapter first

    • by Anonymous Coward
      /me secretly winks at author for placing the vim chapter first

      /me agrees with your positive assessment. The author has wisely saved the best for last.
    • Re:vim vs emacs? (Score:5, Insightful)

      by VolciMaster ( 821873 ) on Wednesday October 26, 2005 @03:29PM (#13883880) Homepage
      apparently he doesn't believe in listing alphabetically, then :)

      Yes, I'm an emacs user. I can also use pico when nothing else is available.

As religious as the emacs v. vim wars get, the real point is whether or not you can get the task done with a tool. I've met people who were really, really good at using pico, equal to my level of comfort with emacs. I've met vim users who are really efficient. And I know lots of people, like myself, who are quite productive in emacs.

Whatever tool you use, if you can be productive, i.e. get the job done in the specified time frame, I honestly don't care what editor you use. That's the cool thing about having lots of editors to choose from: you can pick the one that fits how you think.

      I think in emacs. One of my best friends thinks in vim. We're equally productive in our environments, which is what really counts.

      • Right.

        What gets me is when one is not installed on a system. Why not install all of them? You can probably install the 3 you mention and have all of the bases covered with minimal disk space.

I actually ran into a sysadmin that would only allow vi to be on his systems. He would actually go in and remove the others. This led me to learn vi since I had to use "his" systems. One will always find vi or vim, and most of the time one will find emacs installed. So I use vi. I have had a lot less luck with

        • I have never understood the war over these

From what I understand, it comes down to differing ways of approaching the problem at hand. vi (and vim) was written in a modal fashion. All navigation, writing, and editing is just a key away. emacs is modeless, so accessing editing functionality is done through meta (esc) combos and ctrl sequences. I find it very natural to use the ctrl and meta sequences, since I came from the two-fisted world of Macs first, where everything is a Cmd or Option combination. I lea

    • /me secretly winks at author for placing the vim chapter first

      You're right, it's better to get the bad news out of the way first.

  • by SlashdotMirrorer ( 669639 ) on Wednesday October 26, 2005 @02:24PM (#13883334)
    Any bearded terminal hacker will tell you, when asked about the book they learned from, that while they went from one book to another in their early days, the most productive thing they ever did was learn how to read the documentation already available to them. This includes man pages, changelogs, posts to dailydave, and source code. A lot of people completely neglect these wonderful sources, and even go so far as to reinvent the wheel writing their own guides (see early Mandrake and Redhat releases).

    Someone should write documentation on how to read documentation (especially source!)!
    • by MyIS ( 834233 ) on Wednesday October 26, 2005 @02:43PM (#13883490) Homepage

I know a lot of people who would prefer a physical book over online docs anytime. The key difference between the manpages and a guide like this is how the latter places everything into some sort of context. That is, a novice would get a much more useful "bird's eye" view of the available toolset by reading a guide on Unix user management as opposed to directly researching manpages for adduser, passwd, finger, etc. Mind you, I love reading manpages, but I think I could have saved a lot of my own learning time back in the day if there had been a nice tutorial nearby and I hadn't been so stubborn about doing things the hard way.

      Of course, manpages are impossible to avoid in day-to-day work, but a good guide is invaluable as a kick-start.

    • by Anonymous Coward
      I personally have found that documentation books can be very useful. I have my monitor stacked on top of several right now to achieve the correct height so that it is more comfortable to use my computer. I have another pile on the floor acting as an ottoman.
    • by clintp ( 5169 ) on Wednesday October 26, 2005 @02:52PM (#13883555)
      Any bearded terminal hacker will tell you
      Except that many modern systems that would like to call themselves Unix (or Unix-like) don't ship man pages. They ship info pages.

      Garbage. Blasphemy of the highest order.

      Shaved the beard in '98, but have been hacking Unix since '85.
      • by John Whitley ( 6067 ) on Wednesday October 26, 2005 @03:39PM (#13883955) Homepage
        In all fairness, this isn't the *nix distributions that are at fault: it's the upstream software developers. Aside from a few valiant distro package maintainers, all docs seem to come from upstream these days. I totally agree that info pages used to suck rocks, mostly because they seemed to be an excuse for really sparse documentation encased in really bad hypertext. Fortunately, the quality of the documentation (and the hypertext organization) has improved considerably over the years...

        • I don't understand this argument at all. You pay RedHat and SUSE good money for packaging. If something about the packaging sucks (docs are an inconsistent mix of info, man, html, and text), you expect them to fix it and not shrug their shoulders and point upstream.
      • Too bad I blew all my mod points this morning, otherwise I'd have modded this insightful as well. Honestly I'd rather install windows than use info pages.
      • by Anonymous Coward
        Oh christ, someone needs to drag info out back and shoot it. I tend to forego documentation in favour of trial and error rather than struggle to remember how to navigate that awful heap of junk.
      • FYI, Debian policy is to have a clear and useful man page for each executable it ships. This means that there are debian-authored man pages for packages that only ship info pages upstream, or even no man page from upstream.

        -molo
        • Have you actually read most of the debian-generated man pages? They most often don't say anything more than "This man page written for Debian in order to comply with policy". If the program needs a manpage then write a good one, otherwise override the lintian check.
      • ``Except that many modern systems that would like to call themselves Unix (or Unix-like) don't ship man pages. They ship info pages.''

        You say that as if it is a bad thing. What's wrong with info manuals? The way I see it, info manuals offer useful extra features compared to manpages (hyperlinks!). I'd rather they shipped the docs as HTML and let me use my own browser instead of GNU info, but I don't see anything fundamentally wrong with info.

        As far as the quality of info manuals is concerned (other posters
      • Randall Waterhouse? Is that you?
    • Yes, but by running a Google search on, say, awk, you run a substantial risk of inadvertently launching an ICBM and initiating a global thermonuclear war.
    • by myvirtualid ( 851756 ) <pwwnowNO@SPAMgmail.com> on Wednesday October 26, 2005 @03:11PM (#13883707) Journal

The parent makes two points: 1) that we terminal hackers went from book to book while we learned, and 2) that the documentation already available was where we learned the most.

Re #2: No argument there, the information that comes with a well-documented UNIX is the best way to achieve wizard or guru knowledge levels. Not quite so well suited for getting as far as novice, though: a lot of the man pages - at least back in the day - were written by experts who assumed the reader was close to expert, or at least was a C coding system hacker. Like it or not, not all of us were. C coder? Yes. Sysadmin? Eventually. Kernel hacker? Nope. Library hacker? Only at gunpoint. Shell hacker? Oh, yes, please, anytime.

      How does one climb from naive to novice to comfortable to proto-admin, how does one get to the point that some of the denser available material starts to be beneficial, rather than a poor imitation of nethack? You are in a twisty maze of man pages, without doors or windows, and your dog has died....

      Context, and clear and lucid introductory material. That's the starting point. That brings us to...

      ...point #1: Bounce from book to book? Not so, this terminal hacker. When a job change took me from system 370, where I could hack REXX and JCL with the best of them, to UNIX, where I do better than hold my own in sh and sed, with a little awk/nawk/gawk thrown in, I sat down with one book. Just one. Sobell's Practical Guide to the Unix System. That's all it took: Three days of working through SunOS (3.?) and HP-UX (6.5, yikes) with that on my lap, and I had found my home! UNIX, beloved UNIX.

      After Sobell, I was ready to tackle the man pages and get some value.

      So am I going to buy this new tome? Well, let's think about that description: tome. Why is the Linux Sobell so much fatter than the UNIX Sobell was?

Beware, flamebait: Possibly because so much of the information available with most distributions is so poor! Man pages that refer to info, and info pages that repeat the man pages word for word, save for the reference to info! Laughable, absolutely laughable.

      And, yes, as others have pointed out, there is much information available on the web. Where to start? And how to tell wheat from chaff? Context....

Which brings us back to the Sobell Linux tome: Yes, I will more than likely buy it, and I'll bet I'll recommend that others do, because my guess is that it will provide the one thing that all us Linux-proto-admins and Linux-proto-hackers need to get started and to start making sense of what's out there: Context.

      By the looks of the review, Sobell's provided plenty.

      Mr. S, hats off for doing it again, and thanks.

      • Re #2: No argument there, the information that comes with a well-documented UNIX is the best way to achieve wizard or guru knowledge levels. Not quite so well suited for getting as far novice, though: A lot of the man pages - at least back in the day - were written by experts who assumed the reader was close to expert, or at least was a C coding system hacker. Like it or not, not all of us were. C coder? Yes. Sysadmin? Eventually? Kernel hacker? Nope. Library hacker? Only at gunpoint. Shell hacker? Oh, yes,
    • by mrogers ( 85392 ) on Wednesday October 26, 2005 @03:45PM (#13883995)
      Most man pages are references, not tutorials - reading the bash manual page is a horrible way to learn shell scripting.
    • What about us people who don't know what commands do what? Man pages are fairly useless if you don't know which command to ask for.
      • by Coryoth ( 254751 ) on Wednesday October 26, 2005 @04:10PM (#13884193) Homepage Journal
        What about us people who don't know what commands do what? Man pages are fairly useless if you don't know which command to ask for.

Heh. Never underestimate the value of spending a day working through man pages for every program in /usr/bin; it can be remarkably educational. I admit that's trickier on a lot of modern Linux systems that have so many programs in there, but you can skip over a lot of them (all the obvious KDE and GNOME apps) and only worry about the weird obscure-looking ones you don't recognise. A lot of the time you'll do man [obscure command], read the summary at the top, go "Huh, didn't realise there was something to do that" and not worry about it. The point is that one day you'll be sitting there trying to figure out how to do something and remember "I read a man page for something that could do that..." - a quick skim through ls /usr/bin and (usually after at most 2 or 3 missed guesses) you can pull up the manpage for what you want. Sometimes knowing that there is something that can do [obscure task] is the most important part. Other than that, reading the man page through for some of the more common commands that have a vast array of switches (ls, grep, cut, sort, etc.) can make you realise that some of those can do things you really wouldn't have expected them to be able to, and suggest cunning new ways to use them in tasks where you might not have thought they would apply.

        Just knowing what's available and all the various things that can be done with the tools is a very valuable lesson, and worth a little time investment. It takes less time than you might think to skim through and at least become acquainted with what all the various obscure programs in /usr/bin actually do.

        Jedidiah.
        • A very good point. Also, by reading through the common util's man pages, in addition to knowing there's a tool to do a certain task (which you might forget, or not even understand at the time), you start to get a "big picture" understanding of unix. You learn how to look at a new task and break it down in to categories: "this part is very general and there must be a tool to do it", "this part is specific to my task but could be done using general tools", and "ok, i'm going to have to write some code here"
      • Try the -k option to man followed by something you're interested in. It will give you a list of relevant man pages. For example, if I type:

        man -k compiler

I get a list that includes bcc, gcc, javac, luac, ocamlc and other compilers. You will sometimes get a lot of stuff that you aren't really interested in, and it may miss things that you would like to check out, but it's still very handy. It will often remind you of the name of a program you can't recall or suggest useful things to read about.

      • What about us people who don't know what commands do what? Man pages are fairly useless if you don't know which command to ask for.

        apropos?

        On my system, I needed to find all the tools that involved mpeg operations because I simply didn't know, so I used:

        $ apropos mpeg

        and was greeted with:

mp2enc (1) - Simple MPEG-1 layer-II audio encoder
mplex (1) - MPEG 1/2 program/system stream multiplexer
SDL::MPEG (3pm) - a SDL perl extension
YUV4MPEG2 [yuv4mpeg]

    • by Coryoth ( 254751 ) on Wednesday October 26, 2005 @03:52PM (#13884070) Homepage Journal
      Any bearded terminal hacker will tell you, when asked about the book they learned from, that while they went from one book to another in their early days, the most productive thing they ever did was learn how to read the documentation already available to them.

While the available documentation (man pages, usually) is nice, there are a few books out there that quite simply are great references and provide better material than the standard documentation. My personal favourite is UNIX Power Tools [oreilly.com]. It is mostly just a vast, vast collection of tips, tricks and cunning insights into the finer points of UNIX, its shells, and the various command line tools you can expect to find, from some exceptionally experienced long-time UNIX hackers. It's not a book you sit and read cover to cover; it's the sort of book where you go "I wonder how I could do [obscure thing]?", look it up in the index (another great feature is that the book has one of the most comprehensive and impressive indexes - finding out how to do what you want to do is easy), flip to the relevant section and start reading. Each section is heavily (and well!) cross referenced, so in the middle of reading how to do [obscure thing] you read a comment about [other thing you hadn't realised you could do] and have to go and read that section too. An hour or so later you realise that you really need to get back to work and do [obscure thing], but you now also know many cunning ways to exploit UNIX that you would never have thought of yourself, and certainly wouldn't have realised you could do just from reading man pages.

      Jedidiah.
      • ``(another great feature is that the book has one of the most comprehensive and impressive indexes - finding out how to do what you want to do is easy)''

        This is what sets apart good books from great books: having a good index. Not just for informative books, but for any books, even fiction. It's also one place where electronically available information will always have the edge.
    • My book [tux.org] teaches how to read documentation. It's the subject of the first chapter, actually. (Slashdot reviewed my book, long ago, FWIW. [slashdot.org])

      Since my book is not aimed directly at programmers, I didn't cover reading source code. I did cover basic instructions for debugging documentation when it's wrong, though.

    • Someone should write documentation on how to read documentation

      $ man man
    • by Anonymous Coward
      I actually purchased this book a couple months ago shortly after I finally made the fateful jump to Linux (Ubuntu) from Windows.

      The problem with the man pages is that I had seriously no idea how powerful the terminal could be. I had never seen file redirection, pipes and had only used a very small subset of common utilities previously. Obviously, I doubt if I ever would have found information on these things that I had no idea existed in the first place.

      I learned a helluva lot from this book (and quickly) b
  • Why buy the book? (Score:2, Insightful)

    by MLopat ( 848735 )
    Why would anyone spend the money on this book?

    The information presented in it is freely available all over the web, often with more insightful examples, tutorials, etc. I just don't see the justification in spending x dollars on a book when a simple google search will yield a better result for free.
    • by hildaur ( 86126 ) on Wednesday October 26, 2005 @02:41PM (#13883475) Homepage
      Google searches (and online documentation found by other means) are great for finding details about stuff you already know exists. They stink at helping you discover really useful tools it never occurred to you might exist.

While I think that's true for some topics, in the technology field there seems to be a lot of information available. For example, look at this google search for Linux Shell Tutorial [google.ca]. Looks like a lot of info to me.
        • Yeah, but how much of it is any good? There's no way to know which ones are easy to follow, provide good examples, are no longer live etc.

          With a book on paper, you tend to get a consistent style. Most of the decision process with books is finding an author that writes in a manner that is clear and helpful to you.

          With books, the tendency to reliability is greater (though arguably still none _too_ great), than that of a collection of search results.

          Also, books are just easier. You can take a book mor

    • by DonVino ( 925697 )
Because sometimes I like to read about something ordered and structured. And maybe there are things in this book that you don't find that fast with Google, or maybe you find a tool that you wouldn't have found with Google at all - or maybe just not that fast.

      - often with more insightful examples, tutorials, etc -
      If you need more insightful examples, you can still google for them, but for a first impression this should be enough.
    • by Shadow Wrought ( 586631 ) <<moc.liamg> <ta> <thguorw.wodahs>> on Wednesday October 26, 2005 @02:58PM (#13883604) Homepage Journal
      I'm considering it because I'm new to Linux and my home computer is not hooked up to the internet. Using Google means either running back and forth to the other computer or finding answers during lunch. Sitting down with an actual book is still a handy thing, and will continue to be so, regardless of how helpful google can be.
      • I'm fairly new to Linux, too, and I don't know what I would have done without the internet at my disposal. I don't think that books are really the be-all and end-all of Linux installation. SuSE, Fedora and Ubuntu are all simple for different reasons (right now my distribution of choice is SuSE because I'm trying to get iFolder working) and for most stuff you won't need a manual, but if you do run into a problem, manuals will give you the general overview of how to do things, which is invaluable, especially
      • reference to reallifecomics.com:

        Books? I heard of these, aren't they portable Websites??
    • by $RANDOMLUSER ( 804576 ) on Wednesday October 26, 2005 @02:59PM (#13883613)
      I do have an answer for this. I have two 3-inch binders full of man pages for the shell, awk, lvm, ioscan, etc, etc. Every single hardware-related thing, no matter how arcane, and every commonly used thing I can't remember all the arguments to. Normally they sit at my desk, but they always go with me for my bi-annual trip to the off-site disaster recovery center. One of the lessons I learned a long time ago is you have to have a machine running to use the on-line help. When the box is casters-up, you're on your own.
      • Well I agree that keeping notes is handy, but surely having them indexed on a computer is just as simple, more portable, and easier to pass around. And yes you have to have a machine running to use the online help, but how many people doing shell scripting don't have a second computer at their disposal should something happen to the first one?
      • One of the lessons I learned a long time ago is you have to have a machine running to use the on-line help.

        First, assuming you arrive onsite to a machine "not running" (what, no power? No console? No X?), what situation are you going to use these binders for without the "machine running"? Second... you go offsite with no system of your own?

        How much would you pay for a miniature electronic USB-rechargeable, battery-operated version of your binders? Ebook? Maybe a complete linux distro? man info html txt re
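        For the record, rolling your own offline binder is nearly a one-liner (a sketch; `col` and man's troff output mode may not be present on every system):

```shell
# Dump formatted man pages to plain text for offline reading.
# col -b strips the backspace overstriking man uses for bold text.
for cmd in bash awk sed; do
    man "$cmd" | col -b > "$cmd.txt"
done

# Or render one page to PostScript, ready to print and bind.
man -t bash > bash.ps
```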

    • The information presented in it is freely available all over the web ...

      And having things all over instead of in one place is a Good Thing?

      Dunno about you, but it seems to me that reading a few hundred manpages or clicking through a few hundred links is neither practical nor comprehensive, which I believe is the purpose of this book.
    • Sometimes it is easier to have hardcopy at your side instead of flipping between windows to see what the Google results are, and often your printer doesn't work for whatever reason or is nonexistent. A good book can also be curled up with at night before bed. Try doing that with a tower PC.
    • I would buy it. If nothing else, then as a reference for when the internet is down and I cannot quite remember/figure something out. Saving web docs and then viewing or printing them to be viewed elsewhere sometimes doesn't cut it.

      I remember a time not so long ago when I was trying to script a simple "run after this happens" dialog to search the logfile and determine whether it was a simple restart-a-service or something needing more attention, and then email different people based on the response. I was on a producti
    • The information presented in it is freely available all over the web, often with more insightful examples, tutorials, etc. I just don't see the justification in spending x dollars on a book when a simple google search will yield a better result for free.

      How am I supposed to curl up next to the fire with a google search, or go camping with a google search. I need the book so I can read it wherever I want, whenever I want.
    • When you're learning, it is nice to have a thought-out progression of subjects. Searching around and finding good pieces of information here and there isn't a good learning environment for a beginner. I read one of his other books when I was just learning linux and it made life much easier for me.
    • by wuice ( 71668 )
      Please get off your high horse. A book puts stuff together in a way that helps someone with no prior experience of a subject (who doesn't really know what to "google" for), and who would like a comprehensive guide to a subject instead of trolling for a dozen links.

      What I hate about comments like these is that they're almost always bragging in a thinly-veiled way... "*I* figured it out on my own, why can't time-strapped IT professionals do the same?" The answer is the time it would take to compile such a comprehensive
      • Nope, not bragging at all. In fact, nothing to brag about. I haven't used a unix shell since I was taking computer science at uni. So I'm the furthest thing from an expert on this topic.
    • Why would anyone spend the money on this book?

      The information presented in it is freely available all over the web, often with more insightful examples, tutorials, etc. I just don't see the justification in spending x dollars on a book when a simple google search will yield a better result for free.


      There are at least three answers:

      1) The information you get from your "simple google search" may or may not be better than what you get from the book.

      2) A book provides all the information in one place, with a
  • by max born ( 739948 ) on Wednesday October 26, 2005 @02:42PM (#13883477)
    This sounds like a great book. I don't want to come across as being too negative, but having been a Unix sys admin for nearly 10 years and having run Linux on a laptop as my sole OS for the past five years, I think it's important for new folks not to rely too heavily on learning from books.

    The great thing about Linux is that all the definitive documentation (including the source code) comes with the OS. It's good practice to get into the habit of using these docs.

    Then again I appreciate that everyone has a different approach to learning.
    • The great thing about Linux is that all the definitive documentation (including the source code) comes with the OS.

      You know, I'm finding that for a lot of the "beginner" linux distros, this is not true. Yes, the source is available, but it doesn't come with the OS.

      Granted, not everyone wants to fill their hard drives with source tarballs or SRPMs on the off chance they might want to read them, but only a few distros I know come with source, and those are not necessarily for noobs. the main exampl

  • by stanthegoomba ( 805724 ) on Wednesday October 26, 2005 @02:54PM (#13883578)
    'cat' and 'echo' ought to be enough for anybody.
    ^D
  • ... I wish I had the time and energy to learn all this, but I'd rather ask my linux geek (obviously I am a geek too) friends to help me. How do I install the Mplayer plugin to firefox? (Doing that crap always irritated me.) Instead of watching some good pr0n I had to unpack binaries; my balls and brain loathe you, Linux. (Not trying to be a troll, just kiddin')
  • I try to see people's point as to why they prefer books over technical docs, but I miss the point totally. You buy a book to learn about the computer you are using or the OS you are using? So where do you read it? In bed? I guess you could but it's not a story it's a book to teach you how to do something on the computer. So you'd be better off if you were in front of the computer while reading it. That way you can have hands on experience with the subject matter. If I'm sitting at the computer reading a boo
    • It's like a bootstrap mechanism. To learn to use the computer while you're using it is useful once you have some useful skills tied down. Without those skills though, hardcopy outwith the computer helping you gain those skills is entirely useful.

      Personally, I tend to follow your approach, but it's sometimes nice to have one less thing cluttering up your screen, by marking a few useful pages in a book on your desk. I also can't stand reading large amounts of text (perhaps to be found in your PDFs) on-screen.
      • Well, I see that point. I think that in that case, though, I'd just need the first chapter or so. I wouldn't need the whole book. Maybe just a pamphlet to get me started. I mean, according to the review, the first chapter tells the reader how to use "man" and "info". Why can't people write books like that? Or pamphlets even. They could sell them for 2 bucks instead of $40-100. I mean I could sit down and write 1000 pages. Does that mean I need to? No, not really. Everything I need to know is in the first chap
    • Don't knock a book. Tactile and spatial memory is useless for PDFs and man pages.

      For many of us, the physical format actually helps with recall. When I'm learning a new system or tool, being able to recall that the command I need is described close to the middle of the book and has a big bolded red header on the facing page is very handy.

      Sure, if you already know you need a particular command, man pages are great. But if you have no idea what the command is called, how the hell are you ever going to find it
      • I used autocomplete when learning Linux. Type "a", hit tab, list all the commands. Type man "command".
        • Sure, but what if what I want is something like "less" or "nano"?

          Those aren't even remotely intuitive names.

          Using autocomplete is essentially the same as listing /usr/bin, and useful only if you can remember the command name when you see it, or if the command is usefully named.

          Sadly, that is not a common practice with legacy shell commands.
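          To be fair, the discovery problem has tooling too (a hypothetical session; apropos needs a built man-db index to return anything):

```shell
# apropos searches the one-line descriptions of every installed man
# page by keyword -- useful when you don't know a command's name.
apropos pager

# compgen -c (a bash builtin) prints every command name on $PATH,
# which is exactly what tab completion is enumerating.
compgen -c | sort -u
```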
    • 1. A good reference book is not just a regurgitation of the technical docs. Technical docs are generally best for answering the "how" question, but horrid at answering the "why" / "what" / "where" questions. Tech docs show you all the pieces, but rarely show you a really good overview of how it all fits together. (And even with printed material, one of my biggest gripes with authors is when they focus too closely on the "how" rather then answering some of the "why" questions.)

      2. I can jot notes in a bo
  • sounds a lot like my "unix in a nutshell" book -- which has all commands in reference format, a quick look at programs like vi (which is obviously the best editor [uncyclopedia.org]), sed, awk, etc. and a look at a few different shells and scripting with them.


    also, i like the, er, pun in the title.


    mrc

  • Tutorial? (Score:1, Flamebait)

    by kuzb ( 724081 )
    If you're a programmer who needs a tutorial to use an editor, the chances are good that the editor sucks. I love the Vi(m) vs Emacs arguments, because they are both highly unintuitive editors with an oddly steep learning curve where you really shouldn't need one. While I understand that many have some kind of strange love for these editors, I think it's misplaced. These editors were great when they were our only options for a powerful text editor, but they are outmoded by editors which do a far bett
    • Re:Tutorial? (Score:3, Insightful)

      Programmers use these editors almost every day of the year. On some days their very existence is manifested in the context of the editor for the entire day. To say that a shallow learning curve is the most important feature a programming editor can have is flat out insane.
      • At what point did I call learning curve a feature? It's not a feature, it's the overall design. These editors have absolute shit for overall design. They should have hired a real UI expert to help them during the conception of these editors. Yes, it's possible to have a good UI in a console. Your program doesn't have to suck just because it's text. If you have to spend a significant amount of time fighting with your editor to get it to do something, you're wasting your time. Wouldn't you rather use t
        • Re:Tutorial? (Score:3, Insightful)

          by drinkypoo ( 153816 )
          Last time I looked (I try not to, mind you) emacs did have the kitchen sink. Regardless, emacs and vi both have purposes. First of all, emacs is not an editor. If you were expecting an editor, and you ran emacs, you did not get what you wanted at all. Emacs is a common LISP environment. Second, vi is old and fast. More importantly, vi came from ed, and ed is a sort of cross between sed and an editor. The creator of vi has gone on record saying that if he knew vi would become so popular, he would never have writ
  • No Korn? (Score:3, Insightful)

    by hal2814 ( 725639 ) on Wednesday October 26, 2005 @03:51PM (#13884055)
    What kind of namby-pamby shell scripting tutorial goes over BASH and TCSH scripting and leaves out Korn?
    • This has to be the longest slashdot article...ever.
    • Although it's generally not very hard to get your hands on some kind of ksh, for the most part it's the least available shell around. Everything comes with bash and tcsh, not everything comes with a ksh and there's a handful of different ksh standards you could choose to support when writing one. Anyway, what's ksh got that bash doesn't? Bash is the standard now, the korn shell is an also-ran no matter how great it might be.
  • Not a Unix book (Score:3, Informative)

    by Arandir ( 19206 ) on Wednesday October 26, 2005 @04:20PM (#13884269) Homepage Journal
    Warning. This is NOT a Unix book. This is a Linux book. If you're looking for a quick reference on Unix commands, text editors and shell programming, do not make the mistake of getting this book.

    It's only about Linux. It says so right in the title.

    </sarcasm>
  • Presumably this book will tell me what console editor will edit OpenDocument files in a human-friendly way?

    Won't it?

    Vik :v)
