Yeah - but a net twerk?
People don't listen to that preflight announcement stuff because they've heard it a hundred times before. Anyone who's flown even a couple of times doesn't need to listen. People who are on their first flight, where it's all new and exciting, are paying attention.
So, no - I know how to wear a seatbelt, that my seat cushion can be used as a flotation device, and to check where the nearest exit row is...yadda yadda yadda. I can stick my nose into my phone and I won't miss anything important.
What's needed is either to make those instructions INTERESTING (like the Southwest Airlines people often do) - or to only give the routine instructions to people who need it. That way, when something truly important comes up, people will pay attention.
I ran Minix for a year or more on my Atari ST - having a UNIX-like operating system on a machine I could have at home was a truly awesome thing. Tanenbaum's work is fascinating and useful, and it will be around for a good while...which is more or less the definition of "successful" in academic circles.
The debates with Linus were interesting - but I always felt that they were arguing at cross-purposes. Linus wanted a quick implementation of something indistinguishable from "real UNIX" - Tanenbaum wanted something beautiful and elegant. Both got what they wanted - there was (and continues to be) no reason why they can't both continue to exist and be useful.
Tanenbaum's statement that the computer would mostly be running one program at a time was clearly unreasonable for a PC - but what about phones or embedded controllers like the BeagleBone and Raspberry Pi? Perhaps Minix is a better solution in those kinds of applications?
I think that to pass the Turing test, you have to tell the judges that the entity they are about to talk to *might* be a computer program. Eliza worked because people had never encountered a computer that even tried to be remotely human - so the assumption was that this was a real person from the outset. Also, Eliza plays a psychologist - so she gets to ask all the questions and steer the conversation into territory she can actually handle. Responses to things she can't parse are things like "So how does that make YOU feel?" - which work in that situation.
In a real Turing test, the questions are completely open and the judge is initially highly sceptical that this is a real human.
Judges in these contests always seem to low-ball the questions. Ask "How would Santa Claus fend off a horde of attacking ninjas?"
Those are insanely difficult questions for an AI to get right without some neutral "I don't feel like answering that right now" kind of response. A 13-year-old kid would leap in and start wondering whether Santa could fly away in his sleigh and drop presents on them...or set the elves loose on them...or ask another question in return, like "Can the reindeer help out?"
Something that requires creativity - not just knowledge (which Watson could pull off) or a decent use of the English language (which Eliza could manage to some degree).
Of course people soon became tired of lugging tons of batteries around with them - and having to stand in line to get them charged up at the end of every work-day. Also, measuring the amount of charge transferred between your battery and that of the supermarket when buying a pound of carrots was always a matter of some dispute. Hence there came to be standard batteries with numerical displays on them to show how much charge remained.

Places called banques sprang up where you could leave your batteries and read out their charge remotely. Exchanges allowed you to discharge your batteries *here* and to use an exactly equal amount of energy to charge up those of someone on the other side of the planet who wished to provide you with some physical goods.

The inconvenience of physically storing all of that electricity made it more efficient for the banques to supply it to people who needed it, in exchange for electricity in return in the future. Over time, nobody was ever sure that the amount of electricity held in the banque was as much as the banque claimed to have stored - or owed to it.
Pretty soon, a shorthand word for "total amount of electricity" was needed - and that quirky unused '$' symbol on everyone's keyboard came to stand for some arbitrary amount of the stuff.
I realized a while ago that it had been a very long time since I last used a dollar bill or a coin - so I looked back through my banking records to see when I last used an ATM (which is a reasonable approximation for the date when I last needed cash for anything). I was surprised to see that it was almost two years ago. I also looked back at my checkbook...same deal. Haven't used that in two years either.
For me at least - electronic money is already here.
Who cares? I mean - really - you can get a fake degree based on "Your life-experience" or any number of junk bits of paper.
The fact is that when you go for a job someplace waving your Ph.D. in Creationism - the people offering the job are going to have a really good laugh at your expense. The only job you're going to be able to get will be working for the Creation Research Center.
Think of this as "educational Darwinism" - those with degrees in junk subjects will be rapidly eliminated from the business gene-pool.
C is (very nearly) a mere subset of C++. So you might as well cross C off the list.
Fortran is hopelessly obsolete - although it's certainly still used in a few niches.
You'd be certifiably crazy not to pick either C++ or Java.
Frankly - if you can get your head around C++, Java is a snap because it feels like little more than a simplified C++ anyway.
The other thing you perhaps don't realise is that a halfway decent programmer can pick up a new language in a weekend - and be 100% comfortable in it in a month. I've totally lost count of how many languages I know...but a quick count says it's at least 30.
C++ or Java - you choose...but forget C and Fortran.
Back when we had a bunch of big SGI graphics machines we decided that they were basically cold heartless bastards with no love of humankind - so we named them after mass-murderers: Hinkley, Lechter, Sutcliffe, etc. This was considered by many to be kinda tasteless - but hey - we're geeks.
When we started to transition over to using Linux PCs for doing our graphics, they seemed like little toys - so we had all sorts of toy names, stuff like Crayola, Etchasketch, etc - but as we learned to network a bunch of them to do the same work, they earned names like Lego, Duplo, Erectorset, etc.
When I named my machines at home, my son was going through a 'Batman' phase - so we had Batcave, Waynemanor, Batmobile (a laptop), Alfred, etc. Later the craze was The Matrix - and we used the names of the hover-craft. The machine I'm using now is still called Gnosis for that reason.
Wait - it's MUCH worse than that:
"...publishes orally or in writing, exhibits, or otherwise makes available anything obscene to any group or individual;"