Hmm, I recall learning the seven-shuffle result when I was in math grad school in the '80s (from Prof. Diaconis himself). Did he not publish it until '92?
Often "the code is trying to..." is just shorthand for "the person(s) who wrote this intended the code to...." It is always risky to speak informally about a formal system, but it is also a risk to be too formal - humans have a much harder time following formal language. The formal language, indeed, *is* the code, and the reason we don't just talk in code is that our brains are not wired that way.
Saw this tweet: "Glad Google's tracking Santa now to faux-compete w/NORAD's tracker. Finally a "choice" bt govt or private omnisurveillence propaganda." https://twitter.com/emptywheel...
Wish I could delete this comment of mine - I conflated two CBS reporters. It was Lara Logan who got horribly played by a fraudster peddling absolute falsehoods.
She got played by bad sources peddling BS stories about Benghazi - and that was for a report that made it on the air. Yet she insists that CBS suppressed other stories of hers. Were they suppressed because they were bad reporting, or for political reasons? Since leaving CBS, she has gotten wilder in her claims. She really needs to have been hacked, to give herself credibility. If the government hacked her computer, it would validate everything she has said. If the government is not out to get her, she's indistinguishable from any other terrible journalist. What's funny is how breathlessly the conservative press is running with this video. They obviously have no knowledge of what an actual computer hack looks like. Pathetic.
Why would that relationship be there? One CPU cycle is more akin to a single internal reaction in a cell. The misconception here is the idea that our brains are *not* doing complex calculations quite a lot. A single CPU cycle is not enough to even begin a useful computation, while the human brain can hit a 100-mph fastball with less than a second of brain computation.

The real difference is that a computer AI could multitask in a way the brain can't. So even if conversations with the human world were slow, they could be processed separately and wouldn't require its "full" attention. (One of the things I would have loved in "Her" is if they had shown us this multitasking. At one point the AI is singing a song and the lead starts up a conversation. She stops singing and talks, but it would have been funny and alien if she had just put her singing into "background music" mode and kept talking while singing. A lack of imagination there.)

We've also got modes of communication that are much slower - consider a book, written years ago. Finally, why ascribe impatience to an AI at all?
From an Onion talk: https://www.youtube.com/watch?...
Well, that was a boring boring answer.
I'm firmly of the opinion that you should learn as many different programming languages as you can early on, so you learn how to think in each language and understand what the strengths and weaknesses of languages are. Honestly, if you learn one language really well, you'll have a niche, but you won't be able to grow nearly as well as if you have loads of experience working in different languages.
svenjick writes: "This is the story of a user's battle to get reimbursed by Hewlett-Packard for unused software. Having had to buy two HP laptops for relatives, but not being able to order them without Windows & co. pre-installed, he contacted HP's support as indicated in the EULA displayed upon first boot. There followed an incredibly long and dirty procedure, during which HP repeatedly showed bad faith. Finally, the only solution was to start legal proceedings against HP Switzerland and some members of its management. A happy ending to a highly complex and unpleasant procedure, which should not even exist in the first place, and in which users' rights are clearly not respected."
Link to Original Source
Use an interpreted language for most of the beginning teaching. Especially if the language has a shell where the user can tinker.
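For instance (a hypothetical first lesson, assuming Python as the teaching language), the same lines a beginner would type at the interactive `>>>` prompt work as a plain script, and every result can be checked immediately with no compile step:

```python
# What a beginner might tinker with in an interpreted language's shell
# (Python shown here) - each line gives immediate feedback.

# Arithmetic evaluates on the spot:
total = 3 * 4 + 2

# So does playing with strings (repetition via *):
greeting = "hi! " * 3

# A tiny function can be defined and tried right away:
def double(n):
    """Return twice the input - easy to verify by hand."""
    return n * 2

print(total)        # prints 14
print(greeting)     # prints "hi! hi! hi! "
print(double(21))   # prints 42
```

In the shell itself the `print` calls aren't even needed - the interpreter echoes each expression's value back, which is exactly the tinkering loop that makes this style of teaching work.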