I missed "trouble tickets" - we ended up going with RT from Best Practical and linking into it. No need to reinvent the wheel.
I recently helped a small wireless ISP get started, and one of the first things we did was put together a management application. It's grown to be moderately large, but a lot of the functionality required can be constructed from various free sources. My client is chugging away nicely with a Java-based (server-side) system, although it could have been written in any one of a large number of languages - Java was convenient for the available skill-set in the company [never overlook the value of using an environment with which the customer is already skilled!]. The Seam+Hibernate stack provides a very quick development path for most of the CRUD [Create/Read/Update/Delete] functionality that forms the backbone of any data-driven app - but it could just as easily be Access, or (insert favorite ORM here). We found a few commercial systems that could do all of this, but they typically cost more than putting this together ourselves - however, that's partly a function of who you have available.
The key to getting the project off the ground, in use, and genuinely useful was to identify the various areas that are essential - and automate them as building blocks within a framework. You're already used to Access+Excel+manual legwork, so you don't need to start with an amazing UI (although it is a good idea to come up with one that doesn't make the eyes bleed) - identify areas to automate (starting with the biggest pain points) and gradually reduce the pain as you can afford to do so. Also, don't be afraid to use various Free/Open Source packages to help out with sections!
The important chunks for my client (and probably for other ISPs) were:
* Customer database. This acts as the core for a lot of the rest of the system; database (and UI) for customers, their addresses and billing information, account history, etc. Includes tables linking to inventory to indicate what devices they have activated and the details (including billing plan, etc.)
* Inventory system. Lists CPEs, with status, location, ownership info and history. Linked back to the customer database, to make it easy to say "Johnny has this CPE, on that plan". This ended up growing lots of historical options for reporting, but those aren't really essential.
* Activation. This was the biggie for us. When a customer (possibly a new customer) is "activated" with a CPE/plan/etc. (a wizard helps you pick), it adds the appropriate history items and invokes scripts that set up the account in the RADIUS and LDAP servers. This is the obvious place to include every step you need - but you can start as simple as "email tells you what to do" while you automate the steps.
* Deactivation. Suspending customers (typically for non-payment), handling CPE returns, etc. You can probably live without this immediately, but it is really nice to have.
* Billing. The first few iterations made CSV files for Quickbooks - that should be fine to get started. The most recent handles credit card payments, etc.
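The activation step above really can start as little more than templated text. A minimal sketch of that "email tells you what to do" stage - the helper names, the flat-file RADIUS entry format, and the checklist wording are all my own illustrative assumptions, not the actual system:

```python
# Hypothetical sketch of a bare-bones activation step. A real deployment
# would drive the RADIUS/LDAP servers directly once the manual steps are
# well understood; until then, generating text for a human works fine.

def radius_entry(username, password, plan):
    """Return a users-file style RADIUS entry (format assumed for illustration)."""
    return (f'{username} Cleartext-Password := "{password}"\n'
            f'        Reply-Message = "Plan: {plan}"\n')

def activation_email(customer, cpe_id, plan):
    """Plain-text checklist for the ops inbox until each step is automated."""
    return "\n".join([
        f"Activate customer: {customer}",
        f"CPE: {cpe_id}",
        f"Plan: {plan}",
        "TODO: add RADIUS entry, create LDAP account, update inventory",
    ])

print(radius_entry("jdoe", "s3cret", "10mbit-home"))
print(activation_email("Jane Doe", "CPE-00123", "10mbit-home"))
```

Each of those TODO items then becomes a script of its own as time allows, and the email shrinks until it disappears.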
There's also a lot of management niceties, and these were integrated back into the main portal for convenience:
* Cacti for graphing various bits of the network, notably throughput and latency across the network. Very useful for planning.
* Nagios for monitoring the network and paging us when something isn't responding.
As it grows, you'll doubtless end up with a bunch of esoteric scripts also. We even have one that periodically uses an SSH session to log into various access points and records who is online in each sector, what their signal strength is like, and any de-registration events that have occurred. That highlights the biggest pitfall: it is really easy to get excited and try to program in the kitchen sink. If you don't focus on small/modular at the core, you'll end up with a mess - fast. We try to keep the core small, and then have the core UI link into various other tools as we create them.
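For flavor, the parsing half of a script like the AP-polling one above might look like this - the "who is online" table format shown is an assumption (every vendor's CLI output differs), and the real script would first fetch the text over SSH (e.g. with a library such as paramiko):

```python
# Hypothetical sketch: parse an access point's "who is online" listing into
# records we can store for reporting. The input format is assumed, not any
# particular vendor's actual output.

def parse_ap_clients(raw):
    """Parse lines of 'MAC SECTOR SIGNAL(dBm)' into dicts (format assumed)."""
    clients = []
    for line in raw.strip().splitlines():
        mac, sector, signal = line.split()
        clients.append({"mac": mac, "sector": sector, "signal_dbm": int(signal)})
    return clients

sample = """
00:11:22:33:44:55 north -67
66:77:88:99:aa:bb south -74
"""

for c in parse_ap_clients(sample):
    print(c["mac"], c["sector"], c["signal_dbm"])
```

Keeping the parsing separate from the SSH plumbing also makes these esoteric scripts easy to test without touching the live network.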
I currently have a small (3 towers; 3 more going up in the next few months) WiMAX ISP as my primary client. They already had some appropriate frequencies available; if you don't, you either need to find some (schools are a good bet - many have some old licenses lying around that they don't use) - or go with unlicensed frequency bands. That will severely reduce your range/throughput, but has the advantage of being free.
WiMAX is a good fit for the rural model, but there's a fairly hefty setup cost. Most vendors require that you have an ASN-GW at the core of your network, which is a very large initial cost (both in setup time and actual purchase price). The large ones can easily run to a quarter of a million, with smaller models costing a lot less. My client is on NewNet gear (formerly Nokia-Siemens, formerly Motorola - corporate pass-the-parcel), and the setup was pricey - but it performs very well (they have plenty of customers getting 18-20 Mbit/s down; upstream on WiMAX isn't so good, expect 3-4 Mbit/s on a good day).
You can shave a LOT off the cost by using an open source core to the network (you can't avoid needing RADIUS, DNS, NTP, plus servers for actually running the business), and you could shave more off by going with someone like Alvarion who use a distributed ASN rather than an expensive core (in my experience, performance on Alvarion is decent but not on a par with the NewNet gear). You also need base-stations and antennas per site, but the cost there is quite reasonable in comparison (although "tower monkeys" are expensive to put the stuff up!).
By far the highest long-term cost is backhaul; you need a good connection to each tower (100 Mbit/s for full capacity for a 3-sector, max 768 concurrent users). In many areas, dedicated fiber is really expensive - and you end up paying the telco you are trying to supplant. Microwave is a better option - you pay $10-15k up-front (plus FCC license if you need it), but there are no recurring costs. With fiber prices around here, it pays for itself in well under a year. There will also be the cost of your upstream Internet connection; that's incredibly variable by location.
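The microwave-vs-fiber payback works out quickly as back-of-envelope arithmetic. The monthly fiber figure below is purely illustrative (plug in your local telco's quote); the up-front cost is the midpoint of the range quoted above:

```python
# Back-of-envelope payback calculation for microwave backhaul vs leased
# fiber. The fiber lease price is an assumed illustrative figure.

microwave_upfront = 12_500   # midpoint of the $10-15k quoted above
fiber_monthly = 1_500        # assumed leased-fiber cost per tower per month

payback_months = microwave_upfront / fiber_monthly
print(f"Microwave pays for itself in about {payback_months:.1f} months")
```

At anything like those prices the link pays for itself well inside a year, which matches our experience.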
The next cost is CPEs. Our experience has been that the fixed devices sell far better than the mobile devices (mobility isn't so useful when its only within your small network), and the outdoor CPEs need good installation to perform well. Expect to pay $150+ per unit, which can make for a high setup fee.
Finally on the money-side, there's the human cost. You'll want support, enough engineering muscle to monitor/fix your network, and any sales/business side you need. That can be hard to juggle while you get started: mouths to feed while you get enough customers to hit the magical "break even" point. It's a tough phase, and you have to be very careful to keep your spending within reach of this goal. That means you can expect to be working hard for very little for a while - but that's true of most start-up ventures.
It's also worth considering LTE. It's currently an expensive proposition to get into LTE, but you can cover your butt against the eventual, inevitable transition. All the major WiMAX players are moving towards a dual-stack mode, allowing you to run LTE and WiMAX concurrently (on different frequencies) on the same gear. Most CPEs scheduled for next year are also dual-stack, so you can deploy WiMAX now and LTE later, when you can afford the exorbitant cost of a packet core (or packet cores come down in price). To do WiMAX well, you want 3-4 10 MHz channels; if you can get adjacent frequencies, then when you light up LTE you can start by using one of the 10 MHz channels - and gradually phase out WiMAX, adding bands to the LTE side. It isn't free future-proofing, but it's a lot better than knowing you will have to tear out all your gear in a few years.
I generally stick with maps, but GPS has saved the day on one occasion. My wife and I were visiting a small French city, and couldn't find our hotel - the city's one-way system made no sense at all! Google Maps on my phone used GPS to find us, and then gave a route (including one-way turns!) to the hotel. We were only about a mile away, and it would have been easy on foot - but when the city is designed to take you in spirals AWAY from the downtown area (where we needed to be), some guidance was absolutely invaluable.
Without a GPS, we'd have just asked directions. Fortunately, we know enough French to get by with that - but only just!
"2 lbs of chocolate in cups" is actually a really hard question to answer; it's also a reason why recipes in the US make no sense. The cup is a unit of volume; 1 cup of water makes sense. One cup of molten chocolate might make sense, too. Recipes in the US tend to use cups as a measurement of a solid - for flour, chocolate chips, nuts, etc. Depending upon the size and consistency of your solid, the amount that fits in a cup will vary! A weight is much more sensible for measuring solids - 2 pounds of chocolate weighs (close enough to) two pounds, whether it's bars, chips, squares, liquid, etc. Same for nuts, and even flour (which can vary greatly in volume depending upon air content, how it's packed, and so on).
(The rule of thumb that 1lb = 2 cups works great for most recipes, but it's only even somewhat accurate for items whose density closely resembles water)
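To make the density point concrete, here's a quick sketch - the densities are my own rough assumptions, purely for illustration:

```python
# Why "2 lbs of chocolate in cups" has no single answer: a cup measures
# volume, so the conversion depends on the chocolate's form. The densities
# below are rough illustrative guesses, not authoritative figures.

GRAMS_PER_POUND = 453.59
ML_PER_CUP = 236.6  # US customary cup

densities_g_per_ml = {
    "melted chocolate": 1.25,                     # assumed density
    "chocolate chips (air gaps included)": 0.85,  # assumed bulk density
}

pounds = 2
cups_by_form = {}
for form, density in densities_g_per_ml.items():
    cups = pounds * GRAMS_PER_POUND / density / ML_PER_CUP
    cups_by_form[form] = cups
    print(f"{pounds} lb of {form}: ~{cups:.1f} cups")
```

Same two pounds, roughly three cups melted versus four and a half as chips - which is exactly why weight is the sane way to write a recipe.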
I've been married for about a year, and was in very much the same boat as the OP. I'm a programmer/sysadmin for a small business (I'm co-owner, and we're REALLY small - so I do a bit of everything). My interests are the usual heady mix of RPGs, computer games, painting miniatures, plus guitar playing, cooking and the obligatory taking things apart to see what makes them tick. My wife is an IT trainer at the local university, has an MA in English Lit, and is a semi-pro photographer. We just had our first anniversary, and so far - so good.
The first thing to realize is that marriage advice books are worth reading, but most of them won't apply much beyond the general case (even to the jock/submissive-cheerleader marriages they seem to target!). We've been through quite a few (as well as a "pre-marital class" - which was actually worth it, mostly for the practical advice from people who've already tied the knot), and the commonalities are where the best gems hide: communication, division of responsibility, how to handle disagreements (you WILL have them - it is inevitable unless you are secretly robots), and a lot of the less enjoyable aspects in general.
Before you even get married, make sure the two of you really know one another. I can't stress that enough (and it's a lot easier to walk away crying before you tie the knot, and better for both of you if it wasn't a good idea in the long term). That means being completely honest about your past (and receiving the same); there's usually skeletons in the closet from the past that upset you, patterns of behaviour you need to avoid, things that (not entirely rationally) bother you, key-words that hurt (e.g. "stupid" is an insult I hear a lot in geek circles around here, but it's one of the most upsetting words you can say to my wife due to some bad associations from the past). It's unlikely that you've both managed to never be emotionally scarred, and those scars WILL come back and haunt you both if they aren't dealt with. This is a lot harder than it sounds - you make yourself really vulnerable when you share everything like that. On the other hand, you and your future wife are planning on spending the rest of your lives together, so if you can't be vulnerable to your fiancee now - you may need to wait/rethink (and vice versa).
There's a lot of practical advice that can help with living together, most of it documented. Love isn't enough, you actually have to be able to tolerate each other's lifestyle preferences - and be able to keep up with chores to keep the family home running as well. It really is hard work! Chore distribution (yes, just like in a college flop) is important; in my case, I love cooking, electrical work, keeping the various electronics working - so those were easy. My wife volunteered for laundry (finds it relaxing, oddly enough). We both hate washing up, but I picked that one up in return for not weeding the garden. And so on. We also loosely agreed who the final boss is on various issues. That is to say, just like when running a business, you decide who "owns" various issues. The owner is expected to take reasonable consultations, and try to find a mutually acceptable solution - but ultimately, if there's a disagreement, the "owner" makes a decision - and the team goes with it (even if they don't think it's the right choice!).
Make sure you know how to have an argument without killing one another, or brooding for weeks. That'll kill any marriage FAST. Make sure you both know that it's ok to retreat from an argument when it gets too hot - and discuss it more rationally later (but do NOT let it fester - if you retreat, you do have to have the discussion eventually!). Try to at least kiss and make up before bed; it sucks sleeping next to someone while you ruminate on how wrong they are - and it's a nice idea to keep your associations with the bedroom positive! Avoid the hurtful "key words" mentioned above, and don't even joke about divorce (that one's pretty much impossible to take back).
Schedule time together - at least 5 minutes every day, even if you just each go through the "how was today?" stuff (that's more important than you might think, since it keeps you in touch). Try to spend some quality time together at least once a week, somewhere other than the couch watching TV ("date night" works well for some, whether it's a dinner/movie or just a long walk in the park). We actually keep a family Google Calendar for that (we also keep an issue tracker for what needs to be done around the house - RT from Best Practical!).
It's a good idea to discuss future plans, too. Do you want children? Does she? How many? If money wasn't an issue, where would you (and her) like to be in 20 years? If things go badly, are either of you too proud to eat ramen together?
It's a lot to think about - because it's a big deal. I truly wish you the best, though - marriage is HARD (I've got the scars to prove it), but so far it's absolutely worth it.
I always assumed that it was because Paul was the only apostle whose status practically relied upon a resurrection, and more generally miracles (in fact, he's the only one to write that a material belief in physical miracles is a requirement for Christians).
Unlike the apostles who actually met Jesus in the flesh (and for whom his greatness wouldn't necessarily require post-mortem miracles), Paul was converted in a dream while recovering from his injuries on the road to Damascus (a story that, coincidentally, can also be found in earlier myths - but with a different protagonist and deity). When Paul (former Christian hunter!) showed up to talk to the apostles, he was turned away - and only admitted after some negotiation and the intervention of one of the already-present apostles (whose name temporarily escapes me).
I've read a few articles about "edit wars" between Paulinists and the early Christians, in which Paul was gradually inserted into a prominent role. If you don't believe in a resurrection, then Paul's conversion looks a little more suspect - even if you do believe in the remaining doctrine. (Paul is, interestingly, also the source of most of the "old testament doesn't apply anymore, except for some bits, but we're not going to tell you exactly which!" dogma)
We need a clear, unambiguous policy that nukes are absolutely forbidden for every state with no double standards.
It's a lose-lose situation. Sure, the double-standard argument is true, but that's only half of it. If the US or another world power were to actually disarm completely, how long do you think it would take for some dictator or terrorist group to take advantage of that opportunity?
This demonstrates a truism often repeated in the Defense Studies community: arms control really only works when you don't need it. If you are on good enough terms to sit down and sign agreements, then you probably aren't about to blow each other up anyway... if you ARE that hostile, chances are that negotiations aren't going to work very well.
(That said, I'm not sure I agree with Dr. Gray's famous assertion that a nuclear armed neighbourhood is a more polite neighbourhood - we'll find out, I guess)
There used to be a contest held (in the US military; I have this from second-hand sources so it may not be accurate) to see who could come up with the best weapons from Radio Shack. A couple of scientists once put together a workable nuclear detonator with off-the-shelf parts. They were, admittedly, REALLY good scientists - but now that the technology is reasonably well understood, the primary barrier to building nukes is fissile material. (The big deal used to be timing circuitry for the detonators - timers are cheap/easy now).
This is one cat that definitely won't be re-entering the bag...
(See The Effects of Nuclear Weapons, and also "The Pentomic Army" for sources on this)
The 200-300m quoted distance is the "100% probability of kill" range, I believe. Double that range, and the probability halves.
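That rule of thumb ("double the range, halve the probability") amounts to a 1/distance falloff beyond the full-kill radius. A tiny sketch of just that quoted heuristic - this is an illustration of the comment, not actual weapons-effects data:

```python
# Illustrative model of the quoted rule of thumb: 100% kill probability
# inside the full-kill radius, then falling off as 1/distance (so doubling
# the distance halves the probability). Not real weapons-effects data.

def kill_probability(distance_m, full_kill_radius_m=250):
    """Toy kill-probability curve matching the halve-per-doubling heuristic."""
    if distance_m <= full_kill_radius_m:
        return 1.0
    return full_kill_radius_m / distance_m

print(kill_probability(250))
print(kill_probability(500))
print(kill_probability(1000))
```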
You also have to remember that Hiroshima was almost perfectly designed to be obliterated in a nuclear blast. The topography is that of a bowl, so overpressure actually wraps around rather than just releasing in an outwards pattern. Also, a lot of the buildings were made of very weak materials - residences had a lot of paper and wood, which a) burned really well, and b) did little to absorb the blast overpressure on the way through.
Nagasaki actually fared quite a bit better, as have various test ranges around the world.
In a modern concrete-and-steel city, the reflective/absorptive properties of building materials considerably reduce the spread of blast overpressure on a lateral trajectory. Additionally, few cities are built inside a bowl (New Orleans excepted!) - so most of the time, the overpressure only hits you once.
There really are only four lethal mechanisms that accompany a nuclear blast inside the atmosphere: prompt radiation, fireball, blast overpressure (and sometimes a secondary overpressure from air rushing in to fill the resultant vacuum), and residual radiation.
Prompt radiation travels in a straight line, and is blocked quite effectively by earth, heavy metals and some types of clay. At larger distances, even curtains can help with the flash. If you are in direct line of sight to the flash, within lethal range - you are dead. If not, you're probably ok - and the radiation types released in the flash typically don't stick around.
The fireball is typically not very large, but will incinerate whatever it comes into contact with. Most modern designs try to air-burst, and the fireball often won't ever touch the ground.
Blast overpressure hits just like a conventional explosive: a sphere of rapidly moving blast pressure, reducing in power over distance, and also losing energy as it hits things. The same protections against prompt radiation help here: a good wall of dirt does wonders for stopping overpressure, whether it's from regular artillery or a nuclear explosion. Note that studies have shown that blast overpressure is the primary kill mechanism for regular nuclear bombs, just like any other bomb.
Finally, you get residual radiation. This can be avoided almost completely with a carefully designed airburst - most "fallout" and residual radiation occurs when dirt is sucked into the fireball and irradiated there. Burst high enough to not have the fireball encompass a lot of dirt, and you don't have very much long-term radiation. It's largely unknown what the long-term effects of residual radiation are; the area around Chernobyl didn't behave at all like the models we had!
Then there are different bomb designs to consider. A really small nuclear bomb behaves a lot like a really large conventional charge: you could set it off in a football stadium, and probably not worry too much about damage to buildings a few hundred yards away; man-portable nukes were designed on that assumption, as were things like the horribly-designed Davy Crockett round.
"Neutron bombs", which really should be called "reduced blast bombs" focus on enhancing prompt-radiation release at the expense of a MUCH smaller blast/fireball (and consequently very little residual radiation). Why would you want to do that? a) It greatly reduces long-term contamination of your target area (meaning you might get to go there!), but more importantly b) it's FAR more effective at taking out tanks and similar. Tanks are really, very, very good at withstanding blast overpressure (it's pretty much their primary defensive purpose - survive artillery and shells while they move forward). It's not at all practical to burst enough regular nuclear weapons to reliably take out a distributed, dug-in tank force. However, they are almost entirely made of metal - and prompt radiation does a "wonderful" job of frying everyone inside them. So neutron bombs give you a chance to wipe out tank divisions, probably with less damage to neighbouring towns (which is why they were invented).
(As an aside - duck and cover is more effective than you might think because of the items stated above; when blast-overpressure is your primary kill vehicle, the same techniques you would use against a regular bomb, or a tornado, are really quite effective)
Lastly, you have "dirty bombs" designed to contaminate an area as much as possible (a pretty stupid idea, unless your objective is revenge), and "enhanced blast" bombs which try to maximize blast-overpressure and minimize other elements - for example, nuclear penetrating rounds designed to blast a bunker as hard as possible.
Weirdly enough, nukes aren't even that effective militarily. In Gulf War 1, a study was commissioned to see how many tactical nuclear weapons it would take to effectively destroy Iraq's entrenched, dug-in forces. The answer was in the thousands, simply because the forces were dispersed and the blast-overpressure kill radius against tanks isn't very good. It was actually significantly cheaper to use conventional precision rounds.
A second study demonstrated that a huge fuel-air explosive is actually both more destructive and more contaminating than a small nuke.
Nuclear weapons aren't nice, and do a lot of damage - but there's a great deal of hysteria surrounding them. Yes, they make a mess. No, they won't end the world (tangent: nuclear winter theory, btw, was debunked and withdrawn by its proponents), or even render a country the size of the USA or Russia uninhabitable/in the stone age.
I have a g1 sitting here, running on AT&T. It was unlocked by purchasing an unlock code from a website - and all the market apps I've downloaded (including some for which I paid, such as Touchdown's Exchange client - need it for work) work just fine. I think the problem referred to above isn't with market apps you buy on the newly unlocked phone, but with losing access to already-purchased apps when you unlock; I'm not sure about that since this phone was wiped before I unlocked it.
I recall having a similar discussion with a professor during one of my MS Defense & Strategic Studies classes. He stated that there were no 'recall' codes because it was considered too likely (any likelihood being too likely, in his opinion) that they would fall into enemy hands - the system would have a remote 'off' switch, and would no longer deter an opponent who has (or thinks he has!) the right code.
It's an interesting thought experiment. The big upside to a post-launch deactivation system is that after a launch, you can say "oops" and kill the missile - hoping that the other side's launch detection systems haven't triggered a counterforce/countervalue attack in response. The big downsides are: if the 'enemy' gains your deactivation code, they are no longer deterred (at least on the ICBM/ALCM/SLBM/etc. front; I'd imagine manned planes could still fly) and may even see an opportunity to strike. You also risk lowering the level of caution displayed by your own side - knowing you can say "oops" and stop the strike reduces the mental barriers to launching in the first place.
This really was just another level of escalation/deterrent theory head-games that were so popular in the past. I remain convinced that deterrence is a dangerous hope upon which to rely, but that's another story.
Here in Columbia, MO public transport is spotty - but far better than in most smaller cities in the region. We don't have rail (at least not passenger rail), but there is a bus service that hits most points in the city for $1 per ride (including transfers) or $35/month. I rode the bus for quite a long time; my commute consisted of walking half a mile to the bus stop, riding downtown, changing buses, and exiting the bus outside my office (I'm lucky, there's a bus stop on the edge of our parking lot). It cost me $35/month for a bus pass, and took about an hour each way.
Unfortunately, the buses are pretty wretched! The city managed to make all the buses leak water through the top onto passengers (they installed a CCTV system and now even the brand-new bus leaks). On a rainy or snowy day, I'd be drenched by the time I got home/to work - notwithstanding any wait at the bus stop (buses are rarely on time, and are roughly hourly - so miss one and enjoy an hour in the rain). On the upside, they are nicely air-conditioned.
So, I picked up a 50cc scooter. It cost me about $800, but I get to/from work in 20 minutes every day - and gas costs come to about $2 every two weeks. Throw in oil and basic maintenance, and I probably spend about $15/month on the thing. That's only $20 cheaper than the bus, so it'll take a very long time for me to make up the $800 purchase price (assuming the scooter lives that long), but I've gained 40 minutes each way - so I can leave at a more reasonable time, and be home in time for dinner with my wife everyday. That alone is worth it for me.
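Spelled out, the scooter-vs-bus arithmetic looks like this (all figures are the ones quoted above):

```python
# Scooter vs bus: monthly savings, payback period on the purchase price,
# and commute time reclaimed. All inputs are the figures from the post.

scooter_price = 800
bus_monthly = 35
scooter_monthly = 15          # gas + oil + basic maintenance

monthly_savings = bus_monthly - scooter_monthly      # dollars per month
payback_months = scooter_price / monthly_savings     # months to break even
time_saved_per_day_min = 2 * 40                      # 40 minutes each way

print(f"Saves ${monthly_savings}/month; breaks even in {payback_months:.0f} months")
print(f"Time reclaimed: {time_saved_per_day_min} minutes per commuting day")
```

Forty-odd months to break even on cash alone - so the real return is the time.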
I grew up on the edge of London, and was thoroughly spoiled by the public transit system there. Between rail, bus and subway I could go *anywhere in the city,* in a reasonable amount of time. It's not practical for a city of 100k people to install a system like that, but they could definitely do better than leaky buses once an hour...
I recently set up a client of mine with two Win2k8 64-bit servers (in a larger virtual VMware setup). So far, it's worked out very well. It's fast, stable (uptime is exactly equal to the number of days since we last had to reboot for a patch), and played nice with everything already present. Active Directory and Exchange 2007 migrated from the previous Win2k/Exchange 2k setup without a hitch. In other words: no complaints at all, other than the price (which wasn't too bad, since the client received non-profit pricing - but most of what I set up is Linux or FreeBSD and I greatly prefer that price tag!).
Things I noticed that have improved:
* The group policy editor is a bit easier to use, and less confusing.
* The Vista performance/health monitor is actually pretty good, and provides a really handy ntop-like interface for seeing which service is doing what with the network (not as fine grained as I'd like, but it's a good starting point).
* The old Services-For-Unix services are more tightly integrated, and it was very easy to get NFS up and running.
* Less is installed by default, and adding just the required services was very straightforward.
* The scheduler seems to have improved: processes distribute over CPUs more widely, and throughput/responsiveness "feels" better.
* The new role-based manager for file serving is a bit easier to find, but is really similar.
* A couple of new diagnostic wizards have appeared, including one for Group Policy - it helped me find a couple of problems I hadn't thought about.
Items I wasn't so fond of:
* Activation. It doesn't matter if you have a charity volume license anymore - you still have to activate. That bugs me, because this server has to last for years, and I worry that if I have to restore a backup in 5 years' time the activation wizard may make my life difficult.
* Volume shadow copies are STILL not configured to my liking by default.
* If you want to use some of the new active directory features, you need a pure Win2k8 domain on the server side. It works with "legacy" Win2k/2k3 systems around, but only if they aren't domain controllers.
* The start menu/icons are straight from Vista.
* License management makes less sense, since the license control tools are now hidden away - checking CAL status is a pain.
Overall, for an MS operating system it's pretty good. I don't see a compelling reason to run out and upgrade any Win2k3 systems that are working well - but for new servers, it works great.
I'd imagine that it is made more likely by the topology of the ocean floor itself; there are probably good corridors through which to travel undetected (especially in 'friendly' water where it's unlikely that the enemy have detector arrays). If both sides are using the same ocean floor map, it seems that the odds of a collision go up considerably if there's an obvious corridor to traverse/hide in.