The same is true of university exams. My undergraduate exams, for example, mostly required you to answer two of three questions per paper. To get a first (for people outside the UK: the highest classification), you needed 70%. Most questions were around 40% knowledge and 60% application of that knowledge. If you could predict the topics the examiner would pick, you could immediately discard a third of the material. To get the top grade, you needed 100% in one question and 40% in the other, which, with each question worth half the marks, works out to exactly 70%. So you could understand a third of the material really well, understand another third just well enough to collect the repetition marks (but not the understanding ones), and still get the top grade. In effect, you could study 50% of the material and still do very well in the exams, as long as you picked the correct 50%. And some of the lecturers were very predictable when setting exams...
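For anyone who wants the arithmetic spelled out, here is a trivial sketch under the weighting described above (two questions counted, each worth half the paper; the variable names are mine):

```python
QUESTION_WEIGHT = 0.5   # two questions count, each worth half the paper's marks

mastered = 1.00         # score on the topic you understood really well
repetition_only = 0.40  # just the knowledge ("repetition") marks on the second topic

total = QUESTION_WEIGHT * mastered + QUESTION_WEIGHT * repetition_only
print(f"{total:.0%}")   # 70% -- exactly the threshold for a first
```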
What jobs do you imagine existing in 10-20 years that don't require some understanding of programming? I thought my stepfather, as head greenskeeper at a golf course, might have had one before he retired, but it turns out that the irrigation system he had to use came with a domain-specific programming language for controlling it. A lot of farm equipment is moving in the same direction. Office jobs generally require either wasting a lot of time or learning a bit of scripting (hint: the employees who opt for the first are not going to be the ones who keep their jobs for long). Jobs that don't require any programming are the ones that are easy to automate.
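For the curious, the kind of scripting I mean is nothing exotic. Here's a minimal sketch (the folder name, column layout, and the assumption that all the files share the same columns are invented for illustration): a dozen lines of Python that merge a folder of monthly CSV reports into one file, a job that otherwise means an afternoon of copy-and-paste.

```python
# A sketch, not production code: merge every CSV in a folder into one file.
# "monthly_reports" and "combined_report.csv" are hypothetical names, and
# all the input files are assumed to share the same columns.
import csv
from pathlib import Path

rows = []
for report in sorted(Path("monthly_reports").glob("*.csv")):
    with report.open(newline="") as f:
        for row in csv.DictReader(f):
            row["source_file"] = report.name  # remember where each row came from
            rows.append(row)

if rows:  # only write output if we actually found any data
    with open("combined_report.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```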
But, of course, we don't need to teach our children to write. After all, they can always hire a scribe if they need to and there really aren't enough jobs for scribes to justify teaching it to everyone.
But if I'd rolled out USB-A sockets in 1995, I don't think I'd object strongly to replacing the faceplates on the sockets with USB-C ones in the next five years, if the wires in the wall could supply the required power.
I have yet to see a USB-C connector, and I am usually an early adopter.
No one you know has the new 12-inch MacBook? Most of the next generation of mobiles are going to have USB-C (Apple and Google are among its bigger backers), so expect to see a lot of them appearing.
And if this is done by lottery, a lucky winner might well sell his ticket if the price is right... and would we even want to try to stop such transactions, the way we prohibit people from selling their own organs now? What if you (at, say, age 35) won the lottery and had to choose between a normal lifespan in sufficient wealth and an extended lifespan spent working, worrying over money, or both? Because your state or private pension scheme is most certainly not going to cover you for 300 years.
If a business guy tells you "I need an FTP server", your answer shouldn't be "no way in hell" but "what is it you really need?". Understand what their business need is, then offer your expertise to set up the right technology to meet it. And it goes further: if you understand their business, you can take the initiative, bring new tech to their attention, and show them how it will help them do things better, faster, or cheaper. Many IT departments don't do that often enough or well enough.
I'd argue that there are still big gains being made with new IT; the need for continued innovation is still there. For starters, replacing traditional inventory and accounting with computer-based solutions hasn't been a big bang where all the benefits were realised in a short time. These things evolved from basic isolated solutions, adding bar codes and inventory tracking, automated warehouses, JIT logistics, ERP, standardisation in integration tech that allows easy outsourcing of payroll and other business processes, and so on. And this process continues.

My current client suffers from the stuff I described above, but that doesn't mean all their projects fail: we've seen significant tangible benefits come out of the use of mobile devices, new ways of learning and providing support, a shift to SaaS, virtualisation, and web-based (thin client) solutions, and they even still develop some bespoke software that gives them a real competitive edge. They go for "commodity tools" in the sense that their strategy is to "buy, not build" where possible, but the stuff they buy is being improved all the time, and even SaaS solutions don't free you from needing at least some knowledge of IT when rolling them out into the organisation.
Keep in mind that innovation doesn't mean operating on the bleeding edge of tech or inventing your own stuff; in most cases it means adjusting your organisation and the way you do business to take advantage of advances in tech that is already available as "boring" commodity software or services.
The report says that given the low levels of digital knowledge and skills outside of IT [..]
When I first started working, IT was more closely interwoven with the business functions. Gradually, IT was separated into its own department, parts of it were outsourced, and the work became more compartmentalized (moving from individual generalists to fully interchangeable specialists). To be sure, this has had positive effects: in my own experience the level of professionalism has gone way up, and there are far fewer ninja projects and hobby departments. But the downside is that IT has lost touch with the business almost completely, and the amount of red tape is staggering.