If you teach a 10 year old to write "code", that won't help them in 8 or 10 years' time when they try to apply for a job. The "code" technology will have moved on in that time, so the stuff they learned a decade ago will be obsolete. The knowledge that a professional programmer has, has a half-life of a few years: maybe as long as 5 years in some areas - possibly as short as 1 or 2 in rapidly developing fields of work.
This seems incorrect. A simple back-of-the-envelope regression between the age of 12 programming languages I have used in school and at work and the number of jobs advertised for each on monster.ca gave an R of about -0.1, so language age and jobs available appear to be essentially uncorrelated. Now you will probably be tempted to drop back and punt, that is, to make your argument far more specific ("Oh, I meant that business knowledge W + language X + IDE Y + framework Z won't be useful in 2025"). But if you had been educated in logic you would realize that this still doesn't show that teaching language X is of little or no value. Knowing Lisp, a language almost as old as computer science itself, has helped me evaluate products and understand problems just in the last five years.
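For anyone who wants to repeat that check, here is a minimal sketch of the calculation. To be clear, the language list, ages, and posting counts below are made-up placeholders, not my actual data; substitute counts you scrape or search up from a job board yourself.

```python
# Rough sketch of the "language age vs. job postings" check described above.
# All figures here are illustrative placeholders only; plug in counts from a
# real job board (e.g. a manual search per language on monster.ca).

from statistics import mean

# (language, approximate age in years, job postings found) -- hypothetical numbers
data = [
    ("Lisp", 65, 40),
    ("C", 50, 900),
    ("C++", 40, 1100),
    ("Java", 28, 1500),
    ("Python", 32, 1400),
    ("JavaScript", 28, 1600),
    ("C#", 22, 1200),
    ("Ruby", 28, 300),
    ("PHP", 28, 500),
    ("Go", 14, 600),
    ("Rust", 12, 250),
    ("Kotlin", 12, 400),
]

ages = [age for _, age, _ in data]
jobs = [count for _, _, count in data]

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

print(f"R between language age and job postings: {pearson_r(ages, jobs):.2f}")
```

An R near zero means language age tells you almost nothing about how many jobs ask for the language, which is the whole point: "old" does not imply "unemployable".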
Since nobody can tell what skills will be needed in the next decade, "learning to code" in a particular coding language is almost certainly teaching children the wrong language.
This is the argument of someone who doesn't understand the need to clarify his premises. If nobody has any information about what will be needed in 2025, then no conclusion should be privileged, including the conclusion that learning current languages is of almost no help. If instead you are asserting that we have just enough information to determine that the languages people use today will be of almost no help, then you are either wrong (see my regression above) or simply begging the question.
It would be far better to teach them basic maths, basic logic and how to think in abstract terms - rather than focusing on tangible, here and now, stuff that will produce children who can blink an LED on a Raspberry Pi today, but will have no clue about how to deal with the "AI on a chip" they might be faced with when they start their professional careers.
The assumption here is pretty ignorant: that learning to program is of almost no value because there will be nothing in common between programming languages now and whatever people use in 10 years. Well, the Church-Turing thesis begs to differ. Unless the "AI on a chip" (*snort* *chortle*) is somehow not Turing-equivalent, any programming language taught today has something to teach them, and would at least be useful in instructing them in the nature of computer science.
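To make the transfer concrete, here is a small illustration of my own (not from the original article): the recursive list-processing style Lisp was teaching in the early 1960s carries over essentially unchanged to a language designed roughly three decades later. It is the concepts, not the syntax, that outlive any particular language.

```python
# The classic Lisp idiom: recurse on the head and tail of a list.
# In Lisp (circa 1960):  (defun len (xs) (if (null xs) 0 (+ 1 (len (cdr xs)))))
# The same idea, expressed in a much younger language:

def length(xs):
    """Recursive list length, in the car/cdr style Lisp taught decades ago."""
    if not xs:                        # (null xs)
        return 0
    return 1 + length(xs[1:])         # (+ 1 (len (cdr xs)))

def my_map(f, xs):
    """Apply f to every element, the way (mapcar f xs) did in early Lisp."""
    return [] if not xs else [f(xs[0])] + my_map(f, xs[1:])

print(length([1, 2, 3]))                    # 3
print(my_map(lambda x: x * x, [1, 2, 3]))   # [1, 4, 9]
```

A child who learns recursion, higher-order functions, or abstraction in any language today has learned something that will still apply to whatever Turing-equivalent machine they program in 2025.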
When I started my first job after graduating, the job description didn't even exist when I started my university course. So what is the chance that teaching 5 or 10 year old children a specific computing skill will be relevant to their career prospects in 10-15 years' time?
Again, a correlation coefficient of -0.1 seems to say "not nearly as bad as a moron like you thinks".