When I was at school, we were taught binary arithmetic. Computers, we were told, couldn't do arithmetic in decimal numbers, only in binary, and if we ever wanted to work with computers we would have to be able to do binary arithmetic. Meantime, many of the girls in the school spent hours every week learning to use mechanical tabulator machines, because, as everyone knows, every business in the world needs an army of girls with mechanical tabulators to keep their accounts in order...
Binary is useful, because that really is how the computer does its calculations. Sure, for your convenience the machine presents its results in base 10, but if you don't understand binary you're clueless about the limitations of the numeric types it uses and how they're represented. A classic example I encountered was a scientific application, enhanced by programmers from a financial background, that exhibited a really, really strange error. Plenty of people looked at the problem, and plenty of them were baffled, but the damn thing wasn't going to be understood - or fixed - until someone actually looked at the underlying numbers and how they were being manipulated. Binary is useful, and it's a cornerstone of computing. If you think it's useless, go back to writing tic-tac-toe programs.
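The post doesn't say what that application's bug actually was, but a common instance of this class of error - and a plausible guess at what bites programmers coming from a financial background - is that decimal fractions like 0.1 have no exact binary representation, so arithmetic that looks exact in base 10 quietly accumulates error. A minimal sketch in Python:

```python
# Decimal 0.1 cannot be represented exactly in binary floating point
# (IEEE 754 doubles), so summing ten of them does not give exactly 1.0.
total = sum(0.1 for _ in range(10))
print(total == 1.0)   # False
print(total)          # 0.9999999999999999

# The underlying binary value becomes visible with more digits:
print(f"{0.1:.20f}")  # 0.10000000000000000555...
```

You only see why this happens by looking at the binary representation - exactly the point being made above: without binary, the behaviour of the machine's numeric types is inexplicable.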
Both of those skills were completely obsolete before we even left school. The same goes for touch-typing. Voice recognition and speech-to-text are now at a level where it's extremely unlikely that keyboards will be more than a vague memory for mainstream users by the time today's schoolchildren are thirty.
Bullcrap. As I commented above, you don't know enough about computers if you think knowledge of binary is useless. Touch-typing? Well, it *is* useful, and nobody in their right mind wants an office full of people trying to use speech recognition software.
For heaven's sake, don't waste people's time in school teaching them to use ephemeral, obsolescent technologies. Teach them to use their brains, and teach them fundamental principles. Teach them to learn. Workplace skills can be taught in the workplace, and will in any case change far too rapidly for schools to keep pace.
The technology is not obsolete, as I and others have explained. As for "skills can be taught in the workplace"? Get real! Most companies do not want to spend any money on training people to do anything. Just trawl the IT jobs market for proof: you'll find adverts laying out skills requirements (in terms of x years' use) for programming languages so new that the experience could only have been gained if you'd been one of the developers writing the damn language.