More than once I've seen the existing functionality of an IDE's automatic (or semi-automatic) compile, run and debug loop sabotaged by some (sometimes mandated) third-party plugin which is supposed to make things "easier." I've watched as people poorly integrate Java Spring into a project and render it impossible to use Eclipse's debug button because of badly constructed project dependencies. ("Oh, if you want to run the project, just drop into the command-line terminal and type 'ant someobscurebuildproduct run'; it's pretty easy, why are you complaining?") I've seen people integrate Maven into a build process in ways which guarantee the project will stop compiling at some unspecified time in the future, by improperly scoping the range of library versions that will be accepted. (And I'm not a fan of Maven or CocoaPods or other external dependency resolution tools anyway, in part because they presume the external libraries we link against--libraries hosted outside of our source kits--will honor their public interface contracts as minor revisions roll out, something which isn't always true.)
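For what it's worth, here's a minimal sketch of the kind of Maven version range I mean--the coordinates are hypothetical, but a dependency declared like this picks up whatever the newest matching release happens to be at build time, so a breaking minor revision published upstream can stop the build cold with no change to your own source tree:

```xml
<!-- Hypothetical dependency: the open-ended range [4.0,5.0) tells Maven
     to accept ANY 4.x release it can resolve. The day the vendor ships
     a 4.x that quietly breaks its public interface, the next clean
     build fails, even though nothing in your own tree changed. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-library</artifactId>
  <version>[4.0,5.0)</version>
</dependency>
```

Pinning an exact version (or checking the library into your source kit) at least makes the failure mode reproducible.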
I've seen refactoring which adds code bloat (rather than simplifying the code). I've seen home-rolled configuration files which render Eclipse's code discovery features useless (do you really need to home-roll your own class loader, and specify Java classes in JSON?). I've seen plenty of 'voodoo stick waving' standing in for good coding practices. I've seen the lava flow anti-pattern obscure a simple Java RTL call in 20 discrete layers of classes, each added by another refactoring that did nothing but make things more obscure.
I've seen weird blends of ant and makefile build processes which turned a product that should have taken perhaps 5 minutes to build into one that took over an hour, and build processes so broken that it was impossible to shoe-horn them into an IDE without rewriting the entire project. ("Real programmers use 'vi.'" *shakes head*)
In fact, I have a working theory about programmers--and it is this: most programmers believe a project should be a certain level of difficulty. And when a project turns out to be easier than they think it should be, they add artificial complexity until it reaches the expected level of difficulty. At some point the project then needs to grow, making things organically more difficult--but the artificial difficulty added earlier by the programmer who thought things were too easy then simply sabotages the project. This is why quite a few projects I've worked on over the past 25 years have failed.
This is why I have no hope for any project that attempts to fix the problem: the problem is cultural, not technological. We've had IDEs with integrated debugging and one-button build + run processes going back to the 1980s--yet somehow we always glom onto the next big thing. Oh, and sorry about breaking your IDE--but this next big thing is SO important it'll totally compensate for the fact that we broke your edit/run/debug cycle.
Meaning I guarantee you the moment someone builds a fool-proof IDE which makes it brain-dead simple to edit, compile, run and debug an application, some damned brain-dead fool will come along, worried that things aren't hard enough, and break the system.
A free market presumes competition, and it presumes regulation against perverse incentives. Neither are the case here, where cable companies are granted de-facto monopolies over geographic regions, and where the majority of traffic being carried on the internet is increasingly in direct competition with the cable company's video offerings.
It's why, while I do have sympathy for a properly functioning free market (with competition and no perverse incentives), I have no sympathy for cable companies trying to argue that it's their hardware, they should be able to do what they want. Yes, it's their hardware--but they've been granted regional monopolies. That strongly implies that they have no leg to stand on when they argue 'free markets' to bypass regulations being imposed on their networks.
It's also worth remembering that, as pilots are the last line of defense when equipment starts malfunctioning, when a crash occurs the NTSB often cites "pilot error" for failing to maintain control during an equipment malfunction. The root cause of the failure may have been the equipment, but unless a wing falls off the aircraft, the pilot is expected to maintain control of the airplane through all equipment malfunctions.
Anytime I hear about someone talking about making a better autopilot or someone making an autopilot for cars, I really wish they would watch the "Children of the Magenta" video that's now posted on Vimeo.
Autopilots often make things more difficult for a pilot because, in some circumstances, the autopilot simply adds a new layer of workload that can interfere with operations. I recently experienced that myself while trying to engage the autopilot as I was taking off out of Raleigh (I have a private pilot license and was flying a small 4 seater); the stupid system started complaining into my ear about altitude settings and being improperly set up just as Raleigh tower called to ask me to turn. I had to shut the stupid system off and call Raleigh tower to repeat the instructions.
Now watch the video and tell me that a more sophisticated autopilot system isn't going to just encourage more pilots to plug themselves (and the passengers they're flying) into the ground at a faster rate.
Shouldn't they be looking at a different solution here?
Well, they could equip aircraft with air-to-surface missiles designed to track laser pointers, and allow pilots to legally fire back. But I think a legal solution involving a few public arrests and some public education may be preferable.
The landing on highways takes time to set up as traffic needs to be stopped.
Obviously you've never flown with any of my CFIs. (Do you have any experience in the left seat?)
You don't stop traffic. You don't call up and ask the highway patrol to kindly put up a road block, coordinate with local officials, clear traffic from the traffic lanes, then coordinate with the pilot to see if it's safe to land.
It's an emergency. You basically land and hope for the best.
Couldn't agree more. Why is it on the test then?
Because the FAA isn't happy until no-one is happy.
From what I hear, at airports in suburban areas the minimum hobbs is more like 5hr/day, though it probably depends greatly on whether you're talking about a weekday or a weekend.
5?!? That's crazy! The highest I've ever seen was a 2 hour minimum for a weekday and 3 hours for a weekend--and that from a flight school which had their planes constantly rented out. The club where I was renting was a 2 hour minimum for a weekend and 1 hour for a weekday.
Just as your getting an IFR rating was the obvious and common sense solution to a near disaster.
To be clear, it wasn't a near disaster (*rolls eyes*), and I had several outs. I wasn't even in violation of the regs; I was more than 500 feet AGL and more than the minimum required distance from people or a populated area (I was out over the ocean). Now had I not trained with an instructor who believed in pushing the envelope--well, someone who has been trained to live in fear of the rules could have been led to believe they had no outs when in fact there were several staring them in the face. Like landing rather than crashing in a box canyon.
In the case of a car or boat, when the operator becomes ill he can pull over and stop. An aircraft is a different matter, in that it could kill many more people, including the operator, if it crashes.
Unless the driver is on a busy freeway, at which point expect a multiple-car accident with a handful of deaths. Or unless the driver is driving through a busy pedestrian mall. Or the driver has a senior moment and makes a wrong turn.
The solution of this problem is trivial and is left as an exercise for the reader.