Graphics so real you could almost be there although we can't figure out why you'd *want* to be there, exciting architecture-based gameplay. Defeat enormous boss structures such as gothic cathedrals and terrifying office blocks, advance to higher levels and face ever-more-powerful types of inanimate building...
At least most of these actually got off the ground, and some really don't belong in a list of bad aircraft - the example of the Comet has already been raised, and the MiG-23 wasn't a bad plane by any means - unforgiving of inexperienced pilots, but so was the F-104, and *that* one gets included in a lot of "best planes ever" lists. Total production of the MiG-23 family is over 5,000 - bad planes don't get built in those sorts of numbers.
Throw in planes that were pretty adequate in their time but verging on obsolete when they had their 15 minutes (the Devastator), and those that weren't actually bad but had the misfortune of being the successor to something so successful it wouldn't go away (the Albacore). It's difficult to call the Me 163 a bad plane - it was a desperate measure, and that made it very dangerous, but it's a very significant type. The He 162? Another desperation measure, but one of the more trusted opinions on the merits of aircraft (Eric "Winkle" Brown) found it a downright joy to fly, although it was (again) unforgiving of inexperienced pilots, which perhaps wasn't the best quality for something intended to be flown by pilots with minimal training.
Besides, there are so many things that can ruin otherwise good designs - how many 50s US jets are considered jokes because the DoD decided they were to be powered by the Westinghouse J40? Not bad planes, but a bad engine. Some planes that escaped the J40 and were given alternative powerplants (the F4D, for example) ended up being considered classics.
As if tech hirers really value the fields they hire from that much. They may consider a 'B' in CS better than an 'A+' in English, but maybe the 'A+' in English will be more useful when they declare you obsolete after two or three years, because it's cheaper to get a fresh college grad who was taught the latest and greatest software-development fad at college and can be paid less than to continue paying you.
Yeah, file me under bitter. I'm good at what I do, but at 47 I'm over 20 years past my sell-by date as far as most tech companies are concerned. OK, so it doesn't help that I live in a town where unless you want to be a Java code monkey or a Microsoft sysadmin, you're screwed. Even if there were jobs around, I've been backstabbed and screwed around by almost every employer I've ever had, and frankly don't trust *anyone* now. In fact, after the last experience I had, I don't even trust friends who recommend me for jobs. That was the one that made me completely give up. I'm now resigned to being out of work basically for the rest of my life (while hoping a terminal disease comes along to shorten it). Next step is to sell the computer.
I could have taken either path after high school - I had no trouble getting on a good CS degree course, and came out with a 2.1 (OK, so I'm not an A+ in CS, maybe an A- or a B+), but could just as easily have done an English degree and gotten that A+. I suspect the English degree would have been better in the long run. I steer people away from CS-oriented stuff as a career when I can, unless they're considering something that'll give them dual-purpose skills they can apply in other fields, or they can specialize in fields that seem to be more secure in the long term. For example, a degree in CS with a strong focus on databases - the theory, the implementation, the administration - seems like a very safe option even now, and if they offered degrees in systems administration, I suspect those'd be very safe too. Both do require a CS core, but throwing out a lot of stuff that's of little use in practice in favor of more relevant stuff aimed at the specialization would be very useful.
I used to think highly of the degree course I did, which was broad and deep (British university system - bugger all about being "well-rounded", it's CS all the way through), and while I still think it's better than the US approach, it didn't do a damn bit of good in the end, because so many places couldn't care less about any of it (they didn't even know what half of it was, and when they did it was sometimes worse (*)) - they just want somebody who can do what they're using this week and doesn't mind a career path that places "lay him off and hire somebody cheap straight out of college" ahead of career advancement.
(*) "How do you know functional programming? That's after your time. So you can code in F#?" "No, you stupid bastard, I learned ML before you were born, on a course taught by a very nice man called Robin Milner. You probably haven't heard of him because he wasn't a Silicon Valley billionaire."
I wonder if this is a US-specific problem. Back home in Scotland, there were public links where basically anybody could play, and while I never really saw the point of the game myself, my younger brother went through a phase where he was into it - at quieter times, like on Saturday mornings, he and his friend could go down there and play for free - the place rented old sets of clubs for next to nothing, a cost he could often recoup just from the golf balls they found while playing. The responses about "it used to be middle class" seem to reflect the US - in Scotland (at least 20 years ago; maybe it's gone bad since) it was for absolutely anybody - I'm from a pretty solidly working-class background.
All that said, I agree that making the game easier won't really attract people - the problem is the modern perception (and US reality) of it as a rich man's game. *They* have the time because it's a slightly (very, very slightly) more active variation on a business lunch - wander around hitting a little ball occasionally and talking business. Or so they claim, anyway. I suspect that people who're there to play the game tend to go around faster, since they spend much less time chewing the fat about Big Business Deals or Who To Screw Over Today.
...when I'm in work, there does tend to be overlap. Not 100% overlap - a full mirroring of a work system - but enough that I could throw together anything I might need for work with reasonable certainty that it will work without problems. The major difference has tended to be the system - typically work environments have been Linux while I've been using Mac OS X on my home setup - and that the home system's running a more up-to-date setup than I do at work. It's just a more comfortable environment for me, and a good percentage of the time what I'm working on/learning is as much for my own benefit as for my employer. I may use their specific problems as the target for what I'm learning, but anything I learn that isn't proprietary to my employer has value to me.
Of course, being stuck in the desolate IT wasteland (at least as far as anything interesting goes) that is Milwaukee and being in my mid-40s acts as a big demotivator for learning anything new, because employers seem to think somebody fresh out of college who has just learned the same skill is more desirable as a hire than somebody with that skill plus 20 years prior experience. I'll admit I'm not as cheap to hire, but experience seems to be a liability in the IT market, not a plus, with the people hiring apparently of the belief that all knowledge becomes obsolete the instant something new and shiny comes along. So nowadays I learn not because I'm under any illusion it'll make me more hirable, but because I find it interesting. It means I end up working on more offbeat stuff that's not necessarily of interest to anybody but me (like writing my own language for the hell of it) but I've got to keep my brain busy.
Clearly there was a major flaw in the requirements stage for this - the person who omitted "The project must not enslave humanity." has a lot to answer for, although it's possible it was just down to vague requirements specification, and there was a "The project should not enslave humanity." in there that the developers felt they had to work around in order to achieve the management goals.
Other than that though, it's a wonderful example of project management on a very large systems engineering effort gone right. The system's so robust that it *kills* people who try to introduce bugs or take it down, it integrates with a completely incompatible outside system seamlessly (pre-dating Plug And Play by decades), the resulting combined system scales well (it's too old to be web scale, so shut up), it's not beyond using the threat of nuclear destruction to make sure its deadlines are met (and hopefully to target people who ask if it's web scale) and it not only patches itself, but by the end of the movie it's fully in charge of its own SDLC and is working on writing the next major release without outside intervention. Other than the killing people and enslaving humanity bit, which can be characterized as a feature as well as a bug, who *wouldn't* want to manage a project like that?
Get off their collective asses? What's the urgency? Are the names of these exoplanets going to have any significance to *anybody* other than astronomers anytime soon? For values of "soon" that could be measured in centuries. It's not as if somebody's desperately waiting on this information so they can put out bus timetables.
Apple never claimed they were going to offer that stuff.
However, Apple *has* provided an API that provides iOS 5 apps with a cloud-based key-value store that *applications* can use pretty much however they want to. There'll be a lot of interesting iCloud functionality appearing over time, but don't expect stuff like co-editing because that's not what the service is intended for.
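For the curious, the class in question is NSUbiquitousKeyValueStore, which behaves like a small dictionary that the system syncs across a user's devices. Since the real store only works on Apple platforms, here's a rough, local stand-in (ToyKeyValueStore is purely illustrative) sketching the shape of that API as I understand it - set values under string keys, read them back, and "synchronize" to push pending changes:

```swift
import Foundation

// Toy stand-in for the kind of key-value API iCloud exposes via
// NSUbiquitousKeyValueStore. The real store syncs across a user's devices;
// this sketch just keeps everything in a local dictionary.
final class ToyKeyValueStore {
    private var storage: [String: Any] = [:]

    // Store any value under a string key.
    func set(_ value: Any, forKey key: String) {
        storage[key] = value
    }

    // Typed readers, returning nil / 0 for missing or mistyped keys.
    func string(forKey key: String) -> String? {
        storage[key] as? String
    }

    func longLong(forKey key: String) -> Int64 {
        (storage[key] as? Int64) ?? 0
    }

    // In the real API this hints the system to push pending changes;
    // here it's a no-op.
    @discardableResult
    func synchronize() -> Bool { true }
}

let store = ToyKeyValueStore()
store.set("dark", forKey: "theme")
store.set(Int64(42), forKey: "lastLevel")
store.synchronize()
print(store.string(forKey: "theme") ?? "none")   // dark
print(store.longLong(forKey: "lastLevel"))       // 42
```

An app might stash the user's last-played level or preferences like this and have them follow the user from iPhone to iPad - which is exactly the "small shared state" niche the service targets, as against document co-editing.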
Note that I'm currently less than delighted with iCloud however - for such a big deal, flagship, gosh-wow product, for iCloud mail (both via IMAP and the web-based version) to be dead as a doornail less than 24hrs after launch is pretty poor.
First company I worked for was a great place to work - small but growing, varied work, nice salary increases (yes, this was almost 20 years ago). I left rather regretfully - personal circumstances meant I'd had to move out of the area, but they even helped me out by taking me on as a contractor for the first six months in my new location, while I got on my feet there.
Later, after my next employer had been and gone and I'd been out of work for some time, they came back and asked if I wanted to telecommute. Didn't work particularly well, but they stuck by me even when I went through a burnout and had to go on short-term disability until my brain was less fried.
However, all it took was one incident with miscommunication between two managers, which resulted in one of them losing face, and I went from being prized to getting a lousy review (which I protested as strongly as I could) and put on a probationary period - where I was put to work doing something completely different from what I was really hired for, and generally treated like scum before finally being laid off (and I later heard they were very pleased because they'd found out that the money saved could pay for two outsourced developers).
So - small firms can be good, but the very smallness that makes them sometimes great places to work can turn on you very quickly, and it's much easier to get canned because of a personality clash or an idiot who wants somebody else to take the fall for their mistake. Your company has already demonstrated that they aren't above outsourcing. You might feel bad about doing it at a time when your company sees you as their golden boy, but if the sheen wears off once those two junior developers get up to speed, that perceived loyalty on their part may evaporate. Go with the better offer - while people within companies may be nice or decent, companies themselves basically don't give a damn, and even a CEO who is your best buddy one day can turn on a dime and can you the next. Having a good offer from elsewhere is getting to be a rare thing - don't miss the opportunity. It's good that you're not jaded enough to automatically think that way - I've gotten to the point where I'll work for nobody but myself; I've got complete distrust of employers.
It's nice to see the almost complete absence of any titles reflecting current "flavor of the month" development techniques. About the closest it gets is "Design Patterns", which I've not got the highest opinion of (for reasons I'll explain in a footnote) but which at least codified some common best practices in a way that they could be taught, rather than learned by trial and error.
Always been a bit bemused by "Code Complete". Read it (well, the first edition), enjoyed it, and thought it contained a lot of good stuff, but at the time I was perplexed by it being a Microsoft Press book. Seemed kinda like the Pope penning the definitive case for atheism.
Somewhat comforted to see The Dragon Book in there, even if it is in its newer edition. Think I've still got that one around. That, and the Hennessy and Patterson book "Computer Architecture: A Quantitative Approach". My edition's horribly out of date, but so am I...
(*) The big problem with design patterns is that there's a tendency to take them as Holy Scripture of some sort, and/or to unnecessarily squeeze algorithms into a given design pattern. However, the problem there doesn't really lie with the book, but with the reader (or the teacher of the course, I guess). "Design Patterns" strikes me as something that should be read after a decent level of programming ability has been reached, and not before - there's a level of expertise required to know when using a pattern will make maintenance of the software a joy for all who touch the code, and when to just wing it. Too many people immediately jump in and conclude that they must use a Visitor pattern on each Decorator, except where there's a Mediator involved, in which case it's necessary to employ the Churning Curds and The Knot Of Fame, before finally employing The Clinging Creeper to either a Flyweight object for the Proxy, or an Abstract Singleton. Then they code it, and you end up with an exponential number of classes, several mutually inclusive interfaces, and a system so flexible that you have to embed a dedicated parser to make any use of it beyond instantiating a single object.
My life'd probably be a lot better if I'd managed to get my PhD. In my case the main reason I failed to get it was moving from full-time study to part-time study and full-time employment. Reckon I was about a year of full-time work from defending, but the demands of work (and being on the other side of the Atlantic from my supervisor) meant that from my first draft to defense took over three years. Also, taking the basic direction from my supervisor led me down something of a blind alley - the whole area of processor architecture I was researching was discredited, dead and buried by the time my defense came, largely on the basis of my own supervisor's work. If I'd got the PhD, I'd have had a decent chance of getting into R&D somewhere, or even of just staying in academia. Instead, I got an M.Phil., a "consolation degree" that I had to explain to every potential US employer until I just started listing it like a Masters. If I even describe it as a "Masters via research" it sets off "overqualified" alarms.
Of course, there are a dozen different points in the past 25 years where, had I chosen A instead of B, I think I'd be better off than I am. Hindsight doesn't really do me much good, alas.
Well, comp sci wouldn't hurt, but depending on the syllabus you may well get more relevant experience for working with embedded systems from having EE as a second major, and the combination of the two engineering disciplines is a strong one. Another couple of possible matches, if you can find somewhere that offers a suitable program, would be systems engineering (that is, the real "big picture" cross-domain, full-product-lifecycle discipline) or materials science. The latter may be part of an ME program, depending on where you are, but in the alternative power area developing new materials is going to be central to decreasing the cost and increasing the efficiency of those power sources.
Oh, I know, but one of the options (where I live - as against where I *want* to live) is fixed - my spouse absolutely will not move away from her family, regardless of the fact that this city is something like 4th poorest in the US and a wasteland as far as IT is concerned. In the current economic landscape, even if she were willing to move and an employer was willing to overlook my long-term unemployment, it'd be a difficult move, based on the fact that she's *got* a decent job that she's happy in. Even if my wife doesn't earn what I could pull in were I to find a job I was a good fit for, one of us actually already having a job that we enjoy outweighs us moving elsewhere to a job where how well I'll fit in is an unknown, and that could end up with the situation where my wife can't find work, and I'm bringing in a decent amount and hating every minute of it. At least *she* is happy, whereas moving has a fuzzy solution that could range anywhere from "we're both miserable" to "we're both happy", but that's automatically biased towards the former because of my wife's family ties.
Staying where I am does at least hold out some hope of doing the work I want to do, if I work for myself, although without any sort of income guarantee. However, long-term unemployment has wreaked one of its fairly common financial effects on us, in that we're now in a situation where for the next five years, anything I earn basically vanishes, and the best I could reasonably do is reduce "five years" to "two or three years", if you get my drift. At least, working for myself, I can stick to my strengths - I've just got to hope that the products of it prove to be profitable.
Basically just making the best of a lousy situation.
My own unemployment situation is terminal - but it's a product both of the economy and of my inability to relocate. If I'd been free, three or four years ago, to move to an area where the jobs in my field are, chances are I'd never have become unemployed in the first place. Of course, I've now been unemployed so long that I couldn't even get a job in those areas anymore. Living where I do, there's a major mismatch between what employers seem to want (mainly enterprise Java coders) and where the bulk of my experience lies (systems engineering). And while I have the skill set to work with EJB 3 or Spring, that's just a side effect - in my last job, the work I was hired for never really materialized, so I ended up doing a fair bit of Java before they decided they'd be better off using the money they were paying me to get a couple of dedicated coders, without all the baggage of my experience doing other stuff, straight out of college.
While I've given up looking, I think a lot of the problems lie in the area of HR, whether in-house or through an agency. With the exception of a few particularly specialized tech-oriented agencies, there's a real disconnect between the people who run the departments with the vacancy and the people who do the hiring. That's a problem, since it's difficult to convey what's really needed for the job: cases where having skills A and B is a valid substitute for C, or where you've got experience in D and they don't know that implies your expertise in E and F is off the chart, or where experience in G can get you up and running with H very quickly even if you're not experienced with it. They feed the resume through their buzzword checker and kick it out if it doesn't include C, E, F and H. So somebody who is quite capable of doing the job doesn't even get through the preliminary culling of resumes. A good tech agency can do a lot there - I had one for a while that put me forward for jobs where, even though I didn't look like a good match to HR, they knew from extensive interviews and their own expertise what I could and couldn't do.
In the end, though, I think a bigger contribution to my stopping looking was the way I'd been treated by employers and potential employers over the years. In my last job, my boss was *so* insistent that I get a specific piece of work done by an arbitrary date (arbitrary because it fell between Christmas and New Year, and those who were depending on it weren't going to be back in the office until January 5th) that I had to work over Christmas Day - and *then* he laid me off on January 7th. Then there was the Dream Job, where the hiring manager seemed *super* enthusiastic from the first interview, had me in for second and third interviews on the next couple of days, then told me that while he couldn't say I had the job - he had to get his manager's manager to sign off on it - it was really just a technicality. It then took two or three weeks for them to actually pin down the right people and get them to sign on the dotted line - so long, in fact, that before it was all done the company changed its policy to no longer hire people through agencies, and after keeping me hanging on with "any day now" for close to a month it was "Sorry, we can't hire you, bye." Of course, the agency that had put me forward had me under an agreement whereby the company in question couldn't hire me directly for a year. Even though the agency went out of business about three months later, it was still too late. That one pretty much broke my spirit completely - it was the only job in my field that I've *ever* seen advertised here (excluding one local company that has as a mandatory requirement experience with a particular DoD standard that you can only get in this state by working for *that* company).
So I gave up. In theory I'm having a go at getting going on my own in iOS/OS X development, trying to funnel what I did for fun in my spare time into a job, but that's getting nowhere. I've spent seven of the last eleven years out of work, largely due to being fscked around by companies and family commitments nailing me down to a city (Milwaukee) where nobody wants my skills. Working for myself is really my only hope now - the length of unemployment and the degree of bitterness I've developed towards common tech management and hiring practices make it highly unlikely I'll ever get a job working for somebody else again. So if the self-driven development goes nowhere, I'm basically done. For life. Makes the CS degree and all the time I spent working on my PhD (which I failed to get, I'll admit, but only just) seem like a complete waste. When I've been asked for career advice in recent years, I've done everything I can to steer people away from the path that I took.
While my own situation is perhaps unusual, I'm sure I'm not the only person with engineering skills in the US who is "lost" to companies needing engineers, largely because we've become so disenchanted, disillusioned, and just plain pissed-off with the way companies typically treat us.