They *could* produce the firmware and updates forever, but how are they going to pay the salaries of the people and the infrastructure to deliver it? Yes, they could build it into the price, but if they built 'everlasting support' into the original purchase price, the product would be too expensive for you to buy in the first place. It's not only up to the SonicWalls of the world, but also the chipmakers (and every other supplier up the line).
The folks that front the money (so you don't have to fork out the purchase price 2 years ahead of time) want some return on their investment. If technology were static, the 'perfect' product could be made: it would be flawless the first time out and never need replacing. But technology isn't static, and there is no such thing as infinite patience.
I own a Phatbox for my VW that I hacked and put a 120GB laptop hard drive in. It has probably been in my car for about 7 years and it is still going strong, so I call BS on a lot of this stuff.
This is an anecdote- it has no statistical basis. Repeat it many, many times under controlled conditions, and then you can extrapolate with good probability as to whether it is BS or not.
I take issue with the 'doesn't support huge numbers of older PICs' claim. Microchip has something on the order of 600-700 different PICs in production (which rarely if ever get obsoleted)- looking through the latest MPLAB 8, I see a few rfPICs and some PIC18s that are not supported by the ICD3. That said, if the part you're using happens to be one of them, that can be a huge issue.
I've seen quite a few issues with the boost circuit, but they haven't traced back to the ICD3 itself- they were due to the resistor on the MCLR pin being too low in value, dragging down VPP. The datasheets pretty universally (now) specify 10K for the MCLR pull-up. The boost circuit on the ICD3 is more controllable, but not as strong as the one on the ICD2; with some of the newer parts, the ICD2 can very easily over-voltage MCLR.
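To put rough numbers on why the pull-up value matters: the MCLR pull-up to VDD and the programmer's VPP drive form a voltage divider, so too low a pull-up drags VPP below what the part needs to enter programming mode. A quick sketch- the 1K programmer source impedance and the 9V/3.3V rails here are assumptions for illustration, not ICD3 datasheet values:

```python
def mclr_voltage(vpp_drive, vdd, r_source, r_pullup):
    """Voltage at the MCLR pin when the programmer drives VPP through
    r_source while r_pullup ties the same pin to VDD.
    Simple node equation: net current into the pin is zero."""
    return (vpp_drive / r_source + vdd / r_pullup) / (1.0 / r_source + 1.0 / r_pullup)

# Assumed numbers, purely illustrative:
# 9 V VPP drive, 3.3 V VDD, ~1K programmer source impedance.
print(mclr_voltage(9.0, 3.3, 1000.0, 10000.0))  # 10K pull-up: ~8.5 V, VPP survives
print(mclr_voltage(9.0, 3.3, 1000.0, 1000.0))   # 1K pull-up: ~6.2 V, well below spec
```

With the (assumed) 1K pull-up, VPP at the pin sags by almost 3 V- which matches the failure mode I described, and why the 10K recommendation exists.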
On top of that, both seem very limited in the number of breakpoints you can set, lack data breakpoints, and so forth.
If they could just sort their hardware out they might have a nice platform there.
Breakpoints and other debugging features are a matter of hardware- they add silicon, and therefore cost. Some PICs come with embedded debugging hardware in every part; some have special debug parts ($25-$50 US)- compared to the development parts of yesteryear, these are super-cheap.
The ICD2 is a *really* old design- it is now obsolete, but unfortunately there are lots still out there. The ICD3 is a far more robust platform, from the drivers on up.
Yep, gigabit to the home would be cool, and I would score massive geek points, but what use would it be to an individual user? A big pipe makes a lot of sense when you're aggregating traffic from a bunch of different sites, but a normal residential customer (torrents aside) is going to be pulling most of their bandwidth from a small number of sites at any one time. Of course, this is for the near term, but I expect we are a pretty long way from putting a two-way gigabit connection to use.
On the other hand, I expect that TWC already has plenty of experience in delivering one-way multi-gigabit bandwidth- digital television.
The temperature dependence is a very strong factor that does seem to be missing from the analysis- to add to what the AC parent said, my experience is that the minimum number of erase cycles is specified at the device's maximum temperature; take it down to room temperature, and the typical number of erase cycles goes up by an order of magnitude. Most computers have an internal temperature of over 40C when run in a normal environment, so the hot-end minimum figures are closer to reality than the room-temperature typicals.
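As a back-of-the-envelope illustration of that effect- the log-linear 'one decade of endurance between max temp and room temp' scaling is just my rule of thumb from above, and the 85C ceiling and 10,000-cycle minimum are assumed numbers, not any vendor's spec:

```python
def erase_cycles(min_cycles_at_tmax, t_max_c, t_c, t_room_c=25.0):
    """Illustrative endurance model: the spec'd minimum applies at t_max_c,
    and endurance grows by one order of magnitude by the time the part is
    down at room temperature, interpolated log-linearly in between."""
    decades = (t_max_c - t_c) / (t_max_c - t_room_c)
    return min_cycles_at_tmax * 10.0 ** decades

# Assumed: 10,000-cycle minimum spec at an 85C max operating temperature.
print(erase_cycles(10_000, 85.0, 85.0))  # at max temp: the bare minimum
print(erase_cycles(10_000, 85.0, 25.0))  # at room temp: an order of magnitude more
print(erase_cycles(10_000, 85.0, 40.0))  # at a typical 40C internal temp: in between
```

The point isn't the exact curve- it's that an endurance analysis done at room temperature can overstate real-world life by a large factor.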
Your drive will fail, SSD or HD. You must be prepared for that.