Linux Software

Boiling Down Slackware Linux to the Essentials?

noxious420 asks: "I need to crank out a large number of very basic Slackware based Linux boxes. I am familiar with the "canned" distribution sets that FreeBSD uses and I want to use a set similar to the "Minimal" - "The smallest configuration possible" set. In Slackware's Expert install mode, I have been able to boil it down to the very bare essentials that it needs to run, but there are some unused and leftover directories that I had to manually delete. I don't want to clean up these directories every time. Is there some way to get a true "minimal" Slackware install? "
This discussion has been archived. No new comments can be posted.


  • Well, you could always tar xzvf whatever.tgz, delete what you don't want, then rearchive the resulting tree(s). Slack packages are just standard gzipped tarballs with an install script (install/doinst.sh) that you may need to modify.
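    A minimal sketch of that unpack/trim/repack cycle. The package contents here are a self-contained stand-in (a real Slackware .tgz would also carry its install script, which this toy package omits); with a real package you would start at the tar xzf step:

    ```shell
    set -e
    work=$(mktemp -d) && cd "$work"

    # Stand-in package so the sketch runs anywhere; substitute a real
    # Slackware .tgz in practice.
    mkdir -p pkg/usr/bin pkg/usr/doc
    echo demo > pkg/usr/bin/tool
    echo docs > pkg/usr/doc/README
    ( cd pkg && tar czf ../original.tgz . )

    # Unpack, delete what you don't want, rearchive.
    mkdir unpack
    ( cd unpack && tar xzf ../original.tgz )
    rm -rf unpack/usr/doc
    ( cd unpack && tar czf ../trimmed.tgz . )
    ```

    The trimmed.tgz that comes out is still an ordinary gzipped tarball, so the normal package tools should accept it.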
  • If a bunch of the machines are the same, install one how you like it, then look into making an image (ISO or something) of the hard drive and basically just plastering that onto the other drives instead of actually installing. Also, if they're networked, there should be some way to install over the network to make it a lot easier. Or find the offending directories in the install, make a new install with them removed, and burn that to a CD.

    _joshua_
  • by dlc ( 41988 ) <dlc@NOsPAM.sevenroot.org> on Tuesday March 28, 2000 @08:26AM (#1165591) Homepage
    I've been thinking about doing something similar for a while: create a "standard" personal distribution that can just be dropped into place on a new machine. It seems like the best plan would be to install a new machine exactly how you want it and, on a spare partition, create one bigass tarball. Copy this tarball to a CD. Create a boot disk: minimal kernel with CD support. Boot off the boot disk, create the new partitions, and then untar the image from the CD. Is this an oversimplification? Maybe, but maybe not. There's really no reason why Linux has to be more difficult than this. Of course, this assumes that the machines have similar hardware, which is often the case when you get machines in batches.

    darren
    Cthulhu for President! [cthulhu.org]
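    A sketch of that flow, with plain directories standing in for the golden machine's root, the spare partition, and the target disk (on real hardware, fdisk/mke2fs/mount of the new disk replace the mkdir, and the tarball travels by CD):

    ```shell
    set -e
    top=$(mktemp -d) && cd "$top"

    # Stand-in for the golden machine's installed root filesystem.
    mkdir -p golden/etc golden/bin
    echo 'template' > golden/etc/HOSTNAME
    echo 'binary'   > golden/bin/sh

    # On the golden machine: write one big tarball to the spare partition
    # (this is the file that gets burned to CD).
    tar czf image.tgz -C golden .

    # On each target: boot the floppy, partition and mkfs the disk, mount
    # it (represented here by the newroot directory), then unpack.
    mkdir newroot
    tar xzf image.tgz -C newroot
    ```

    Unlike a raw disk image, the tarball is filesystem- and geometry-independent, so it works across different drive sizes.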
  • I don't recommend this, even though in theory it should work. I found that machines that are supposed to be identical often aren't. There may be a different rev of the chipset and a bunch of other little things. Most probably won't cause a train wreck, but I bet some will, in which case the cause will be far from easy to determine. One time I got a couple of shipments of Dells back-to-back, and sure enough, some of the machines in the 2nd shipment had a slightly different CD-ROM drive. It probably wouldn't have added up to much, but considering that Linux is pickier about hardware than NT, and NT is a LOT pickier than Win98/95/etc., I wouldn't rely on the disk-image trick for "cloning" machines.
  • Are you sure that Linux is pickier?

    What makes win32 pickier is the driver revisions. Linux drivers are generally written to support all versions of a piece of hardware since the drivers aren't shipped with the devices.

    The only major thing with this type of mirror, though, is that you will have to reconfigure the network info (hostname, IP) to prevent conflicts between machines.
  • The main thing is not to use disk images, but instead what the original poster said: one giant tarball. I would love to see how many people could actually fit an image of their hard drive on a CD. Even though this guy sounds like he wants to put just a few megabytes of information on the hard drive, how many drives do you know of that are 650 MB or less now?

    The best way I can think of to do this would be to make a simple tarball like joshua said. Then on initial bootup have init run a shell script instead of the /etc/rc.d stuff. In this shell script you would set the hostname and IP, and that's about it. Once that is completed, write the config to disk, change /etc/inittab to run as normal, and you are installed.
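    A sketch of that first-boot hook, staged into a scratch directory standing in for the image's root. The firstboot.sh name, the sample hostname/IP values, and the rc.inet1.conf path are assumptions for illustration; the script itself is only written here, not executed:

    ```shell
    set -e
    root=$(mktemp -d)
    mkdir -p "$root/etc/rc.d"

    # First-boot script baked into the image: set the per-machine identity,
    # then point inittab back at the normal Slackware rc scripts so every
    # later boot runs as usual.
    cat > "$root/etc/firstboot.sh" <<'EOF'
    #!/bin/sh
    echo "box-01" > /etc/HOSTNAME
    echo "IPADDR=192.168.1.101" >> /etc/rc.d/rc.inet1.conf
    sed 's|/etc/firstboot.sh|/etc/rc.d/rc.S|' /etc/inittab > /tmp/it \
      && mv /tmp/it /etc/inittab
    EOF
    chmod +x "$root/etc/firstboot.sh"

    # In the image's inittab, sysinit runs the first-boot script once
    # instead of rc.S:
    echo 'si:S:sysinit:/etc/firstboot.sh' > "$root/etc/inittab"
    ```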

  • by RGRistroph ( 86936 ) <rgristroph@gmail.com> on Tuesday March 28, 2000 @01:31PM (#1165595) Homepage
    Rather than making a disk image, I think you can use the command dump (see "man dump") to put a giant file on a CD, which you then write back to the disk with the restore command (see "man restore").

    To get a target machine to the point where you can mount a CD and run restore, I suggest checking out some of the Linux floppy distributions.

    There is one other option. There used to be a web site that offered you all the choices from Red Hat's install script and then created a custom install floppy that would install automatically according to those specifications, no human intervention necessary. So you could just go to this site (or its equivalent for Slack or whatever other distribution you want, or download whatever tool the site is running and use it yourself), then make a boot/install floppy and just shove it and the CD-ROM into every machine. I think the boot floppy/dump-from-CD-ROM method will work faster, though.
  • by Zurk ( 37028 )
    Install only the A and N series and select "ask for each package" in the install. I've crammed it down to 30 megs with full networking (lpd, httpd, ftpd, ethernet stuff, PPP, SLIP) and the basics, and I've installed it on multiple 386/486 boxes with 40-meg HDDs and 8 megs of RAM. Works great, too. I recommend Slack 4 or 3.5 instead of the flashy stuff (7.0)...
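    Slackware's setup can also be driven by tagfiles, which record an ADD/SKP decision per package so you never answer the prompts by hand, which is probably the cleanest answer to the original "true minimal install" question. A hedged sketch: the package names and the a1/ disk-set layout below are illustrative of the 3.x/4.x-era media, so copy the stock tagfiles from your own CD and edit the flags rather than trusting these names:

    ```shell
    set -e
    cd "$(mktemp -d)"
    # In practice, copy the stock tagfile from each disk-set directory on
    # the CD (a1/, n1/, ...) and flip flags; here we just write a tiny
    # illustrative one for the A series.
    mkdir -p custom/a1
    cat > custom/a1/tagfile <<'EOF'
    aaa_base: ADD
    bash: ADD
    devs: ADD
    etc: ADD
    util: ADD
    gettext: SKP
    EOF
    ```

    Point setup at the directory holding your custom tagfiles and the same minimal selection installs identically on every box.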
  • This works well. We rolled out a buttload of Linux and Win95/98 machines like this. I've done 'recovery CDs' for Win95/98 with this method, too. (Gee, isn't that misuse of a GPL'd operating system and tools... *grin*) If you can swing it, a bootable CD makes it even handier, albeit if you have to get into the BIOS to make the machine boot off of CD, the time savings over just booting off a floppy is negligible.

    As a more ambitious project, we made a little bootdisk with dhcpcd and support for the NICs in the machines we were using, and used a little program called 'netpipe' to broadcast .tar images out to the PCs' hard disks over the wire. It _rocked_ to see 150+ machines pulling an image simultaneously at 800+ KB/s on garden-variety 10Base-T.
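    The netpipe tool mentioned here isn't widely packaged today, so this sketch shows the same streaming idea with plain tar pipes; the nc lines in the comments are a point-to-point stand-in (not netpipe's one-to-many broadcast), and the hostname and port are made up:

    ```shell
    set -e
    top=$(mktemp -d) && cd "$top"
    mkdir -p image/etc && echo demo > image/etc/motd
    mkdir newroot

    # On a real rollout the two halves of this pipe run on different boxes:
    #   image server:  tar czf - -C /srv/image . | nc -l -p 9000
    #   each target:   nc imageserver 9000 | tar xzf - -C /mnt
    # Locally, the same streaming pipeline collapses to one pipe; nothing
    # ever hits the disk as an intermediate file.
    tar czf - -C image . | tar xzf - -C newroot
    ```

    The appeal over copying a tarball first is that the target unpacks while the bytes arrive, so the install finishes about when the transfer does.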

