Now how do you get the signal from those thousands of towers back to something that gives them network access? The most common way is via copper or fiber cabling. Push enough towers out deep into every neighborhood to have minimal contention, good enough signal strength, and 99.9% coverage over the area (including all those old houses with nice thick walls that KILL signal) and you've probably spent as much as or more than hiring a trenching/construction/OSP crew for a few months. And you still probably need 10-20% of those copper/fiber connections back to your headend to reach the network. And you have to provide power at several hundred locations instead of a small handful.
So why don't you just backhaul everything using multi-hop wireless? With a proper design, you're going to have one radio for subscriber use and one for backhaul, so that's 2X the amount of equipment on every pole/tower. And if you cover a large enough area that not every remote tower can link directly back to the main one, each relay tower's backhaul has to carry the traffic of every tower downstream of it in addition to its own.
The amount of backhaul bandwidth you need on the T4->HE link (the hop from tower T4 back to the headend) is the sum of all the bandwidth needed for T1, T2, T3, AND T4.
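That aggregation is easy to see in a quick sketch. Assuming a daisy chain HE <- T4 <- T3 <- T2 <- T1 and hypothetical per-tower demand figures:

```python
# Why multi-hop backhaul aggregates: each link back toward the headend
# (HE) must carry its own tower's subscriber traffic plus everything
# relayed from towers farther out. Per-tower demand figures here are
# hypothetical.
tower_demand_mbps = {"T1": 300, "T2": 300, "T3": 300, "T4": 300}

# Daisy chain: HE <- T4 <- T3 <- T2 <- T1 (T1 is the farthest out).
chain = ["T1", "T2", "T3", "T4"]

backhaul_load = {}
running_total = 0
for tower in chain:
    running_total += tower_demand_mbps[tower]
    backhaul_load[tower] = running_total  # load on this tower's uplink

# T4's uplink to the HE ends up carrying the sum for all four towers.
print(backhaul_load)
```

So even with identical towers, the link nearest the headend needs 4X the capacity of the link at the end of the chain, which is exactly where "just add more hops" stops scaling.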
The idea has been thought of many times. Economics are the biggest reason it fails, not the technology. To get sufficient density, it costs a lot more than just running wires.
To get to work by 8AM, I would have to leave by 6:40AM and walk 3-4 blocks to the bus stop (not so nice when there's 6" of snow and -15F temps, or when it's 90F+ at 7AM). I'd take one bus about 2 miles, then wait 15-20 minutes for another bus to get downtown.
Leaving work at 5PM, a 5 minute walk to the nearest bus depot at work MIGHT catch the 5:05PM bus; otherwise it's a 30 minute wait, then a transfer to another 15-20 minute ride and a 4 block walk uphill to home.
Even with 7:30AM/5PM traffic, I could drive it in about 20-30 minutes, either by Interstate or by a major through-town federal highway. So I can give up an extra 1-1.5 hours a day of my time and walk several blocks in quite likely unpleasant weather, or I can drive my car and pay about the same for a monthly parking pass as a monthly bus pass would cost. And with children, I couldn't give up the vehicle anyway; it would just mean different routes for the car and the bus.
Having visited cities like San Francisco, New York, Houston, and San Diego in the last year, it seems that cities with well-developed urban centers designed with public transit in mind do much better at this than ones that were designed around cars and are trying to retrofit mass transit. The biggest difficulty in getting around NYC was figuring out whether to grab a cab, get on a subway, buy a bus ticket, or board one of the multiple trains. In several cases there were at least 3 different options to get from A to B in roughly the same amount of time, though prices varied quite a bit: the subway was cheap, the trains were pretty cheap, and cabs were reasonable only because of the short distances.
I remember looking at these when they first came out and thinking they would be useful for sysadmins/coders who work in odd places, but the form factor is pretty much useless on a plane/train, inside a rack, or anywhere else you don't have a full desk to set it on. Add that it was an 8.5lb laptop in the days when the competition was getting down into the 5-6lb class, couple that with the high (even for IBM) price tag, and it didn't do so well.
Your goal here is to get stuff out of the box, configured, and back out the door as quickly and efficiently as possible.
Things I'd do to start:
If doing racks, consider shelves so you can slide equipment in and out quickly. Some racks let you mount shelves to the sides rather than taking up 1U per shelf; these may give you more density per rack so you need fewer racks.
If doing shelves, don't stack equipment flat; stand it on end like books, which makes it a lot easier to pull one piece out without moving a bunch of others.
Put plenty of power cords/outlets where you need them, and if not everything takes a C13, make sure you account for that; newer, larger switches are starting to use C15s or C19s. Make sure you have a large enough UPS to handle the startup current of all these devices. Constantly powering equipment up and down is hell on your power feeds, so good clean UPS power is important.
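A rough back-of-the-envelope for that UPS sizing, using entirely made-up wattages, counts, and factors (check your own gear's nameplate ratings):

```python
# Rough UPS sizing sketch with hypothetical numbers: inrush at power-on
# can be a multiple of steady-state draw, so size for everything
# starting at once. All wattages, counts, and factors are assumptions.
steady_watts = {"switch": 150, "router": 100, "server": 400}
counts = {"switch": 4, "router": 2, "server": 6}
INRUSH_FACTOR = 2.0   # assumed startup-current multiplier
POWER_FACTOR = 0.9    # to convert watts to the VA rating a UPS is sold by

total_startup_w = INRUSH_FACTOR * sum(
    steady_watts[dev] * n for dev, n in counts.items()
)
required_va = total_startup_w / POWER_FACTOR
print(f"Size the UPS for at least {required_va:.0f} VA")
```

The point isn't the exact numbers; it's that a burn-in room sized for steady-state load will sag or trip when a whole shelf of boxes powers on at once.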
Patch cables wired in and velcro'ed off to the rack where you need them, and run extras. That way if you have a suspected bad cable or a broken end you aren't worrying about replacing it right away to get the equipment out the door.
Terminal servers are a godsend in an environment like that. Configure them so you know that TS1, port 1 is the top (or bottom) device in the rack. Keep them in order or you'll be tearing your hair out wondering why the wrong device just rebooted.
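That port-to-position convention is also easy to keep in a small script rather than on a sticky note. A minimal sketch, assuming ports are wired top to bottom and using hypothetical device names:

```python
# Keep a terminal-server port map in code, following the convention
# above: TS1 port 1 is the top device in the rack, counting down.
# Device names here are hypothetical.
def port_map(devices_top_to_bottom):
    """Map console ports 1..N to devices listed top to bottom."""
    return {port: dev
            for port, dev in enumerate(devices_top_to_bottom, start=1)}

ts1 = port_map(["core-switch", "router-a", "server-1", "server-2"])
# With the rack wired in order, TS1 port 3 is unambiguously server-1.
print(ts1[3])
```

Regenerate the map whenever the rack layout changes and you always know which box a given console session (and power cycle) is actually hitting.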
As someone else mentioned, a USB barcode scanner is a GREAT tool to have if you need to do any kind of inventory tracking.
Separate but adjacent boxing/unboxing room with a sturdy table. And a sturdy cart to move equipment back and forth between them. You want to keep all the cardboard and styrofoam out of the equipment config area.
Has anybody suggested asking the current political candidates their views on SOPA? If you live in the US, and your Congressperson is listed as a co-sponsor of the bill, or as an opponent of it, have you contacted them to voice your opinion? Votes are all that matters to politicians. A few hundred calls/emails to their office telling them that this is a flawed bill and that it WILL send your vote to their opponent can quickly change their minds about what matters to them.
That's the current list of SOPA co-sponsors.
Top Ten Things Overheard At The ANSI C Draft Committee Meetings: (1) Gee, I wish we hadn't backed down on 'noalias'.