It's stupid if you're benchmarking relative efficiency -- it's not an efficient implementation (and you'll have no trouble finding explanations for why the Python and Java code they wrote, while simpler, is not efficient).
And this is why we should not teach CS101 in Java or Python. If they'd been forced to use C this whole experiment would have turned out differently.
Not at all. If you wrote your C in-memory string handling as stupidly as they wrote the Python and Java, you would still get worse performance in C (e.g. each iteration malloc a new string, strcpy and strcat into it, and free the old string; compared to buffered file writes you'll lose). It's about failing to understand how to write efficient code, not about which language you chose.
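To make that concrete, here's a rough sketch of both approaches; the piece[] array and the output filename are invented for the example:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        /* Stand-in data; imagine many more, much longer pieces. */
        const char *piece[] = { "one ", "two ", "three " };
        int n = (int)(sizeof piece / sizeof piece[0]);

        /* Naive: malloc a new string, strcpy/strcat into it, free the
           old one -- every iteration. Copying is O(n^2) in output size. */
        char *s = malloc(1);
        s[0] = '\0';
        for (int i = 0; i < n; i++) {
            char *t = malloc(strlen(s) + strlen(piece[i]) + 1);
            strcpy(t, s);
            strcat(t, piece[i]);
            free(s);
            s = t;
        }
        puts(s);
        free(s);

        /* Sane: hand each piece to stdio and let its buffer batch the
           writes. Each byte is copied once; no per-iteration malloc. */
        FILE *out = fopen("out.txt", "w");
        if (!out) return 1;
        for (int i = 0; i < n; i++)
            fputs(piece[i], out);
        fclose(out);
        return 0;
    }

Write that first loop with repeated string concatenation in Python and you get the same quadratic blow-up; the language isn't the culprit.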
Write-only code is easy in ANY syntax.
Write-only code is possible in any language. Some languages make it easier than others.
I'm guessing the reason he doesn't take money from the fossil fuel industry is that he just can't be bothered with such trifling sums. The average salary in the US is more like $350k or $400k, IIRC. $120k is for total losers.
I can only presume you're talking about research grants combined with salary, despite saying "The average salary", because otherwise you are simply flat wrong. The average salary for (full) professors in the US is $98,974.
Should we teach everyone basic first aid and CPR, fundamentals of mechanics, and the basics of how to sew, cook, etc.? Yes, yes we should.
I don't think the "Teach Everyone to Code" movement is about making everyone professional programmers; it's about ensuring that everyone gets exposed to the basics of how programming works, just as they get exposed to the basics of a great many other things in their schooling.
The results were startling. After re-running the election 100 times with a randomly drawn nonpartisan map each time, the average simulated election result was 7 or 8 U.S. House seats for the Democrats and 5 or 6 for Republicans. The maximum number of Republican seats that emerged from any of the simulations was eight. The actual outcome of the election -- four Democratic representatives and nine Republicans -- did not occur in any of the simulations. "If we really want our elections to reflect the will of the people, then I think we have to put in safeguards to protect our democracy so redistrictings don't end up so biased that they essentially fix the elections before they get started," says Mattingly. But North Carolina State Senator Bob Rucho is unimpressed. "I'm saying these maps aren't gerrymandered," says Rucho. "It was a matter of what the candidates actually was able to tell the voters and if the voters agreed with them. Why would you call that uncompetitive?"
Seems to me that Apple is playing catch-up in the phablet arena. Apple was late to the party and lost its toehold because of its tardiness.
No, no, you're looking at this all wrong. Apple stayed out of the Phablet market until they were "cool/hip/trendy". The vast sales Samsung had were merely to unimportant people. Apple, on the other hand, entered the market exactly when phablets became cool, because, by definition, phablets became cool only once Apple had entered the market.
Logic is a binary function. Something is in a logical set - or it is not. Being illogical is not a synonym for being mistaken. Degrees of precision are irrelevant for set inclusion. Fuzzy logic is not logic.
Fuzzy logic is logic. So are linear logic, intuitionistic logic, temporal logic, modal logic, and categorical logic. Just because you only learned Boolean logic doesn't mean there aren't well-developed, consistent logics beyond it. In practice, bivalent logics are the exception.
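For example, taking Zadeh's original definitions, the fuzzy connectives are perfectly well defined on truth values anywhere in the unit interval:

$\neg a = 1 - a$, $a \wedge b = \min(a, b)$, $a \vee b = \max(a, b)$, for $a, b \in [0, 1]$.

Restrict $a$ and $b$ to $\{0, 1\}$ and you recover exactly the Boolean truth tables, so the "binary" logic is just the special case.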
... a lot of people respond to this by saying the criticisms are stupid, that "if you know what you're doing" then you'll understand what's really going on, etc.
Yes; "if you're just willing to get your hands a little dirty and muck in and learn then you can bend the hugely complicated interface to your needs" they'll say; they'll complain that your just not willing to learn things, and thus it is your fault. Such people will inevitably state that they are "power users" who need ultimate configurability and are (unlike you) willing to learn what they need to to get that.
For those of us who learned to "customize our desktop" back in the days of FVWM via scriptable config files calling perl scripts etc., it seems clear that "power users" are really just posers who want to play at being "super-customised". Almost all the modern DEs do have complete customisation available and accessible; some of them just use a richer (scripting) interface to get such things done.
It's not the features that you stare at with no idea what they do that cause a problem. As you say, a quick look at the manual can sort those out (though it does add to the overall cognitive load). It's all the potentially subtle things that you don't even realise are features, and so never look up, and so never realise that, contrary to first inspection, the code is actually doing something subtly different from what you expect.
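A toy C illustration of what I mean (the flag name and values are made up): the first test below reads naturally but parses as something subtly different.

    #include <stdio.h>

    #define FLAG_READY 0x4   /* hypothetical flag bit, for illustration */

    int main(void) {
        int status = 0x5;    /* FLAG_READY happens to be set here */

        /* Looks like "is the flag clear?", but == binds tighter than &,
           so this is status & (FLAG_READY == 0), i.e. status & 0. */
        if (status & FLAG_READY == 0)
            puts("flag clear");
        else
            puts("flag set");    /* always taken, whatever status holds */

        /* What was meant: */
        if ((status & FLAG_READY) == 0)
            puts("flag clear");
        else
            puts("flag set");
        return 0;
    }

You'd never look up "operator precedence of ==" unless you already suspected it was a feature worth looking up; until then, the code just quietly does the wrong thing.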
Math is all about being precise and logical, communicating exactly one concept at a time. Natural languages do neither.
Except math is almost never actually done that way in practice. Euclid was wonderful, but almost all modern math does not work that strictly (and Euclid really should have been more careful with the parallel postulate -- there's "more than one thing at a time" involved there). Yes, proofs are careful and detailed, but so is, say, technical writing in English. Except for a few cases (check out metamath.org, or Homotopy Type Theory) almost no-one actually pedantically lays out all the formal steps introducing "only one concept at a time".
Not every programmer deals with these [mathematical] questions regularly (which is why I don't think math is necessary to be a programmer), but if you want to be a great programmer you had better bet you'll need it.
I don't think you need math even to be a great programmer. I do think a lot of great programmers are people who think in mathematical terms and thus benefit from mathematics. But I also believe you can be a great programmer without being the sort of person who thinks in those terms. I expect the latter is harder, but then I'm a mathematician, so I'm more than ready to accept that I have some bias on this topic.
Math IS sequencing. So is using recipes. That is how math works.
Math is a language. Just because you can frame things in that language doesn't mean that language is necessary. Recipes are often in English. English is sequencing (words are a serial stream, after all). That doesn't mean English is necessary for programming (there seem to be many competent non-English-speaking programmers, as far as I can tell).
Disclaimer: I am a professional research mathematician; I do understand math just fine.
College education wastes countless hours teaching academic stuff that a great majority of programmers will not use on the job, while neglecting critical skills that could be immediately useful in a large
Of course there was a time when college education was supposed to be education and not just vocational training.
I think part of the problem is that "programming" is itself so diverse.
The other part of the problem is that math is so diverse. There's calculus and engineering math with all kinds of techniques for solving this or that PDE; there's set theoretic foundations; there's graph theory and design theory and combinatorics and a slew of other discrete math topics; there's topology and metric spaces and various abstractions for continuity; there's linear algebra and all the finer points of matrices and matrix decompositions and tensors and on into Hilbert spaces and other infinite dimensional things; there's category theory and stacks and topos theory and other esoterica of abstraction. On and on, and all very different, and I can't even pretend to have anything but cursory knowledge of most of them.