From building two or three of these at home myself, here's my practical experience for anyone wanting a monster file server for home, on the cheap. The high and low points:
1. The other poster(s) above are 100% correct about the raid card. To get it all on one card you'll pay as much as 4-5 more HDDs cost, and that's the low end for the card. Decent dedicated PCI-E raid cards are still in the $300+ range for anything with 8 or more ports.
2. Be careful about buying older raid cards. I have two 16-port and two 8-port Adaptec PCI-X SATA raid cards that are useless. Why? They only support raid arrays up to 2TB in size. "Update the firmware," you say. Sure, let me just grab the latest, from 2005; I'm sure that fixes it. Oh wait, my cards already have it, and it doesn't remove the limitation. 8 drives, even 16 drives, and they hard-code a 2TB limit? Lame.
3. I've seen nothing in a home-budget price range that performs as well as Linux software raid. My 1.5-year-old $500 Tyan workstation motherboard (S5397, in another computer) has dedicated SAS raid that can't seem to do better than 10 MB/sec throughput, reading from drives that individually bench at 50-60 MB/sec.
4. Which leads me to: use Linux software raid. It's much more configurable than any hardware raid card, in both supported raid levels and monitoring capabilities. Raid disks/arrays can easily be moved from one machine to another, one controller to another, etc. I've moved most of my disks between machines and controllers at least once; see the sketch just below.
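For the curious, the mdadm end of this is only a couple of commands. A minimal sketch, with made-up device names and a 4-disk raid 5 as the example:

    # create a 4-disk raid 5 from bare drives
    mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sdb /dev/sdc /dev/sdd /dev/sde
    # the array metadata lives on the disks themselves, so after moving
    # them to another machine or controller, reassembly is just:
    mdadm --assemble --scan

That metadata-on-disk design is exactly why the disks don't care what machine or controller they're plugged into.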
5. I've come to believe over time that what you're really looking for is X SATA ports, not a "controller capable of doing raid over X disks". Use SATA "mass storage" cards, or raid cards that let you use pass-through mode so the OS can access the individual disks directly. Here you have to be careful you don't get bitten by #1, 2, or 3 again, since some raid cards don't behave well when not actually doing raid (I'm still looking at you, Adaptec); the quick check below helps. This makes things easier and much cheaper: you can mix and match lower-capacity cards to get 8-20+ SATA ports for raid.
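A quick sanity check that a card is really passing drives through (device names are examples): every disk should show up as its own block device, and SMART queries should reach the actual drive, which cards running in "raid mode" often block:

    # each physical disk should appear individually
    cat /proc/partitions
    # SMART info should come straight from the drive
    smartctl -i /dev/sdb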
5.1 "hw vs sw raid tangent" : what happens on a dedicated raid card when you run out of ports? you usually can't span raid cards, unless you get multiple identical fancy (aka expensive) raid controllers from the same manufacturer. all linux needs is hard drives recognizable by the BIOS.
6. When using software raid, buy a decent CPU. You don't need some quad-core beast, but you don't want to be waiting on the CPU to finish your raid calculations. Any 2-2.5 GHz Core 2 Duo is probably more than adequate; I've drawn the line at anything under 2 GHz.
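If you want to see what the parity math actually costs on your CPU, the kernel benchmarks its checksum routines at boot and logs the winner, and /proc/mdstat shows real throughput while a resync or rebuild is running:

    # the raid driver benchmarks its xor/parity routines at boot
    dmesg | grep -i raid
    # during a resync or rebuild, this shows the actual speed
    cat /proc/mdstat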
7. Kiss backups good-bye. The price of any decent backup system capable of covering this much storage is WAY over the price of this whole setup. Anything I really don't want to lose gets saved in multiple places outside the raid array; otherwise I accept the potential for data loss as a risk of operating this way. Personally, I don't see how you could do otherwise in a setup like this.
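"Multiple places," for me, is mostly just rsync on a cron job to other machines. Something like this, with made-up paths and hostname:

    # nightly mirror of the can't-lose stuff to a second box
    rsync -av /raid/keepers/ otherbox:/backup/keepers/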
8. Be prepared for bottlenecks. You're doing this on a home budget; you probably won't get 300 MB/sec reads off your array, no matter how many drives at whatever raid level. I can only get 10-20 MB/sec across my gigE network going to/from my raid 5 array, probably due to the cheap PCI SATA cards I'm using. I willingly make this trade-off to get the capacity I have for the price I spent.
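If you want to pin down where the bottleneck is, compare a raw sequential read off the array against one off a single member disk (example device names; dd prints MB/sec when it finishes):

    # sequential read from the whole array
    dd if=/dev/md0 of=/dev/null bs=1M count=2048
    # same read from one member disk, for comparison
    dd if=/dev/sdb of=/dev/null bs=1M count=2048

If the array isn't noticeably faster than a single disk, suspect the controller or bus, not the drives.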
If any of these points is an overriding concern for your intended use, then you'd have to re-evaluate the importance of all the other considerations.
For me, stability, capacity, and price are the top three. That leads me to research Linux-stable cheap SATA expansion cards (which is just a nice way of saying I buy and try probably 2x the number of controllers I actually use, to find ones that won't corrupt data, time out on random drive accesses, or simply fail to show the real drives to the OS), and to compromise by waiting a bit longer on network transfers. Usually the waiting consists of starting a big transfer and then multitasking other things in the foreground/elsewhere. Local same-machine use of the raid array is a consideration I've never entertained, but it should be on the list for anyone thinking about building one.
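The burn-in I use on a candidate controller is basically: hammer every attached drive at once and watch the kernel log for timeouts and bus resets, plus a read-only surface scan. Roughly, with example device names:

    # parallel sequential reads stress the controller, not just one drive
    for d in sdb sdc sdd sde; do dd if=/dev/$d of=/dev/null bs=1M & done
    # in another terminal, watch for timeouts, resets, or vanishing devices
    dmesg | tail
    # afterwards: non-destructive, read-only surface scan of each drive
    badblocks -sv /dev/sdb

A controller that survives a few days of this without a single kernel complaint makes the keep pile.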