If we look at jet aircraft, wear depends on the airframe and the engines; airframe wear seems to track the number of pressurize/depressurize cycles as well as the running hours. Engines get swapped out routinely, but when the airframe has accumulated enough stress it's time to retire the aircraft lest it suffer catastrophic failure. Rockets are different in scale (much greater stresses), but we can expect the age-related failure points to be those same two, with the addition of one main rocket-specific failure point: cryogenic tanks.
How long each will be reliable can be established using ground-based environmental testing. Nobody has the numbers for Falcon 9R yet.
Weight vs. reusable life will become a design decision in rocket design.
It used to be that any degree would get your "foot in the door" with HR. Some of the best programmers I worked with over the years had degrees in English, Philosophy, and even History.
University teaches you how to learn new material, how to prioritize it, how to summarize, how to reach the meat in the middle of the chaff. It does not teach you how to program. While there are benefits to knowing computing theory, it's not theory that gets the job done -- experience does that.
I'm surprised you're having such a tough time finding work if you're actually good at programming. Perhaps it's the way you're presenting yourself in your resume, because, as I said, it doesn't really matter what your degree is in for getting your foot in the door.
It's far easier to just have the cops shoot the white people, too. Then they'll know what it's like to be a "person of colour."
And in 1...2...3...
Cue all the math junkies who claim that there is "proof" you can't hear the difference between a 44.1 kHz/16-bit audio stream and higher-quality rates like 192 kHz/24-bit or analogue. Because the math "proves" that thousands upon thousands of people who claim to hear a difference are "delusional liars."
I am neither delusional nor a liar. I hear the difference, and it's as plain as night and day.
I disagree completely. Good science fiction has never been about the technology, but about human and alien personalities and moral questions brought about by the technology. Good science fiction explores interpersonal relationships, character traits, philosophical stances, and other such subject matter.
The science fiction of the mid-to-late '80s made good movies because the directors and scriptwriters were selecting stories with deep connotations, instead of viewing them with an eye towards turning them into CGI action flicks emphasizing trivia like "the technology" instead of the plot.
There is still a tremendous amount of good science fiction, written over the years, that would make terrific movies. But Hollywood won't back those "risks" -- they're too busy investing in action movies pretending to be science fiction. There are exceptions, but for the most part you know it's true: Hollywood doesn't want to discuss morality, philosophy, and personal interactions in a script. They want a nice "safe" piece of pablum that will make audiences go "ooh" and "aah" over the mindless special F/X, not cause them to think for themselves.
The problem has been the same since the PC first came out: users can "do things" with a PC/laptop/smartphone/tablet and think that "doing things" makes them an expert on IT. So when they come up with a "great idea for a new application", they cannot and will not fathom the fact that it can take months or years to implement, is going to cost hundreds of thousands if not millions of dollars, and will be obsolete before it ever hits production due to changing business needs.
There is no cure for the "wisdom" of people who tell you how to do your job, or how their 14-year-old nephew could write the application in a few weeks. They've made up their minds that you're just a lazy SOB trying to milk the company for money and a cushy job, and they will never, ever, ever understand just how much effort goes into security, design, testing, porting, etc. To them, everything is "easy."
The real problem is that companies let such users and managers make business decisions based on "their gut instinct" instead of properly planned and projected schedules. Because heaven forbid you should ever tell the marketing manager that he can't have his shiny Sharepoint solution because it doesn't provide anything useful to the company that can't be accomplished with a properly organized set of folders on a shared drive/server somewhere.
No, they're the ones who sign for the budgets, and they're the ones who like the "shiny", so you're the one who gets stuck trying to make the shiny work with all the line of business systems that are actually important to the operation of the business.
And if you even hint that you can't do it, well, there's a company overseas that's promising to do it in a month as an offshore service, so you're fired.
Which, in a nutshell, is how the bean counters and their ilk get away with their bad business decisions: they constantly hold the threat of offshoring and termination over your head to beat Mr. IT into submission.
"except in response to events and information"
Artificial Intelligence does not imply volition. I know of no reason to expect an early AI to have a will or to come up with results except in response to events and information it's designed to respond to. While some might try to simulate the volition of a live entity, I do not feel it's necessary to include such a component in order to qualify something as an Artificial Intelligence.
Artificial Intelligence just means artificial thought about something. Sufficient understanding of the subject matter to reach conclusions and produce outputs relevant to what is known or implied. Creativity and volition are another kettle of fish entirely.
I can see them losing market share to renewables, but that's not the same as losing money.
There is nothing about legislation anywhere in North America that guarantees the continued success of an obsolete business model. No matter how many congressmen and senators the MPAA and RIAA have bought off.
C started out with high-level "constructs" that were basically the operators of DEC's PDP-11 (and later VAX) processors. While those constructs have mapped well to other processors, virtually every statement in C originally compiled to one instruction on those machines.
To this day, C still gives you the power and flexibility of any low-level language worth its salt, and ample opportunity to hang yourself and your code. Don't forget -- C++ originally targeted C as its output, not machine code. C has similarly provided the "back end" for no small number of special-purpose compilers.
Then there are the operating systems and device drivers that have been written with it, and all the embedded systems logic for all manner of devices.
C will never die any more than assembly or COBOL and FORTRAN will. There will always be those special-purpose high-performance tasks where it is worthwhile to use C instead of a higher level language. Just as there are times where it still makes sense to drop even lower and into assembly.
You go ahead and try to bootstrap an entire C++ implementation so you can write a kernel in it. Good luck with that. Getting a C library framework running on a bootstrapped kernel is hard enough. C++ would be orders of magnitude harder.
Those areas are underserved because the only place you're going to see a computer there is tucked under the arm of a crackhead who stole one and is looking for a fence.
I work on my pet project (http://msscodefactory.sourceforge.net) because it's a fun challenge I set myself many years ago. Whether others use it is irrelevant. Whether I ever make money off it is irrelevant. There is only one thing that matters to me:
Having fun coding.
That's it. Beginning and end of story. I work on it for fun.
Use an IDE to edit? You're kidding, right?
Why in all that's holy would I load up a multi-megabyte behemoth instead of using a text editor for editing code? I use the IDE to fix build errors that result, and to do the debugging.
But with ant handling the build process and a decent debugger, I see absolutely no need for an IDE. In fact, Eclipse crashes about half the time I try to use it, so I can't rely on it as a build manager for projects the size I work on. It pukes on itself far too often, forcing a complete rebuild every time -- and the more code has to be rebuilt, the more likely it is to puke again.
No man. A decent editor like vi or emacs, a build manager, and a debugger are all you need. Loading up a whole IDE is overkill.
But then again, I've never seen any debuggers other than IDEs for Java.