Comment Re:Why not "Cooking for All"? (Score 1) 246

I found high-level languages unintuitive until I knew what they were doing behind the scenes. I tried Basic as my first language, but it left me confused and frustrated after a few days of trying. Then I tried ASM and it was a natural fit. Once I understood how ASM worked, I progressed to learning how compilers converted C into "ASM", then C++. I absolutely hate black-box magic. I need to know what is going on, at least in a pseudo-code kind of way. I must have a working mental model of what is going on. I didn't get into programming to code, I got into programming to tell computers what to do. I'd best know what I'm telling the computer.

This is all a matter of how well the language matches the task, though, surely? If you're starting out with tasks that involve manipulating long lists, you really need a language with list support, and given that most situations in which someone not in a designated dev role might want to code something involve performing a series of operations on a moderately long dataset, that's the sort of computing we should be teaching.

I'm training to be a high school ICT/CS teacher, and I'm always looking at the "natural environment" of different languages, in order to avoid "hello world" type introductions. The natural introduction to C is to build extensions for the command line -- that's more or less what it was invented for. I can't imagine introducing Python without lists. And I think lists will always provide better opportunities for meaningful tasks -- playing with text generation, shopping lists, accounting, etc.
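
For illustration, a first list-based Python task might look something like this (a minimal sketch; the items and prices are invented for the example):

    # A beginner-friendly list task: totalling a shopping list.
    shopping = [("bread", 1.20), ("milk", 0.90), ("eggs", 2.50)]

    total = 0
    for item, price in shopping:
        print(f"{item}: {price:.2f}")
        total += price

    print(f"Total: {total:.2f}")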

Comment Re:What a retarded concept (Score 1) 246

Then think of computers as a way of concretising the results of problem solving. Logo taught me a fair amount about geometry and parametrisation in mathematical descriptions. How do you know you've solved the basic word problem correctly? Only because the teacher tells you, and then you're thrown into the big argument about how what you did actually matches what the words say, and it's the teacher's interpretation of the problem that is wrong. Get something wrong in a computer-based task, and the mistake becomes quickly apparent -- if it doesn't draw a square, there's no arguing as to whether you've made a mistake or not.
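
To make the square example concrete, here is roughly what that task looks like in Python's standard turtle module (a sketch of the same kind of exercise, not the original Logo; the side length of 100 is arbitrary):

    import turtle

    t = turtle.Turtle()
    for _ in range(4):
        t.forward(100)  # length of each side
        t.right(90)     # get this angle wrong and the shape visibly isn't a square

    turtle.done()

Get the angle wrong -- say 80 degrees -- and the drawing makes the mistake obvious without anyone having to adjudicate.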

Comment Re:Why not "Cooking for All"? (Score 1) 246

Not everyone is a chef.

You don't have to be a chef to know how to prepare a meal that's edible and nutritious any more than you have to be a programmer to use a computer. Guess which one most people will find more useful in their lives.

Ask that question 100 years from now and the answer will likely be reversed.

Ask that question 100 years from now and you'll probably get a blank look and "How do you cook a meal without programming the Chef-O-Tron?"

Comment Re:Why not "Cooking for All"? (Score 1) 246

I say you've got that back-to-front. If you start with a low-level language, you've got to teach all the unintuitive stuff about how computers work first. Start with a higher-level language, and the abstractions are much more human-friendly. How many lines does it take in C to map an array of ints to an array of their squares? Now try that in Python.
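
For comparison, here is the Python version -- a minimal sketch, assuming a plain list of ints. The equivalent C needs a declared output array, an explicit loop and index bookkeeping before it does anything at all:

    numbers = [1, 2, 3, 4, 5]
    squares = [n * n for n in numbers]  # one expression: each int mapped to its square
    print(squares)                      # [1, 4, 9, 16, 25]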

Computers exist to make complex tasks simple. If we start by taking simple tasks and making them complex, we fail to connect the students to the core idea of what a computer is all about, and we turn them off.

Comment Re:Why not "Cooking for All"? (Score 1) 246

I might agree that a general computer knowledge class is important - but the kids all do that anyway through their normal coursework these days; they all seem to use computers and word processors and so forth. Programming is not for everyone. Basic math is, in ways kids don't understand until they grow up and see how a cashier can't make change when the computerized cash register is down. So are basic science, history, social studies... there's a reason these are all core subjects: even if you learn stuff you never need, you tend to realize how useful a lot of the stuff you learned actually is.

I have to disagree with you here. The business world is awash with people carrying out ad hoc data filtering tasks manually, or at best by dumping data into Excel and using the built-in column filters. Untold millions of days of productivity are lost to office drones carrying out easily automatable tasks, simply because it would take longer to approve programming spend, brief the programmer, test the script and apply it than for an office drone to just do it -- depending on who you work with, this could mean jobs taking anything up to two weeks. I've turned 3-day jobs into 1-day jobs that way, and if the client had been much bigger, that would have turned 2 weeks' worth of manual filtering into 2 days. If everybody knew a little bit about programming and could knock together their own ad hoc scripts, macros and queries, we'd save millions of dollars daily across the globe.
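
As a rough illustration of the kind of throwaway script I mean (the file name and the "status" column are made up for the example; a real job would use whatever columns the export actually contains):

    # Filter a CSV export down to the rows someone would otherwise pick out by hand in Excel.
    import csv

    with open("orders.csv", newline="") as src, open("late_orders.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["status"] == "late":  # hypothetical filter condition
                writer.writerow(row)

A dozen lines like these are often all that separates a three-day manual job from a one-day one.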

Now, before anyone talks about the risk of "enough rope to hang themselves", I'd like to go back to an earlier point: the most common tool for non-programmers to process data is Excel. 9 out of 10 executioners agree that this is the best quality hang-yourself rope on the market.

Comment Re:12k€ not 12€ (Score 5, Interesting) 208

TFA says "12.523 EUR".

Of course the fixation of (continental?) Europe on using decimal points as thousands separators is a bit stupid (saying that as someone from there).

Except that Europe doesn't use the decimal point as a thousands separator; you use the period character. The traditional anglophone decimal separator is a mid-height dot, which I'd demonstrate for you, but it doesn't seem to be available on the default iOS keyboard. What happened is that it was missed off some mechanical typewriters and was left off the keyboards and character sets of most computers. Computer programmers cheated and used a period instead of a decimal point, and as publishing moved to digital formats, it was easier for people to use the period in place of the point. By the time Unicode finally introduced the decimal point into computing, it was already dead.
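
The ambiguity this causes is easy to demonstrate: the same string parses to very different values depending on which convention a program assumes. A sketch using Python's locale module (the named locales have to be installed on the system for it to run):

    import locale

    s = "12.523"

    # Anglophone convention: "." is the decimal separator.
    locale.setlocale(locale.LC_NUMERIC, "en_US.UTF-8")
    print(locale.atof(s))  # 12.523

    # Common continental European convention: "." groups thousands.
    locale.setlocale(locale.LC_NUMERIC, "de_DE.UTF-8")
    print(locale.atof(s))  # 12523.0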
