I'd point to Watson as a good example of why deep learning systems won't make coding go away any time soon. From the Wikipedia entry:
Watson uses IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework. The system was written in various languages, including Java, C++, and Prolog, and runs on the SUSE Linux Enterprise Server 11 operating system using Apache Hadoop framework to provide distributed computing.
Any guesses as to how many lines of code and development hours are behind that stack? How about a guess as to how long it'll be before Watson is able to make useful contributions to a significant part of that software stack? Is it worth thinking about the hardware stack, or the effort put into curating the database?
Watson is, basically, a sophisticated search engine built upon a massive mountain of human effort.
Experience says that the more complex systems become, and the more ubiquitously they're deployed, the more you need people who can build them, expand them, bend them, and glue them into place. Software doesn't seem to follow a curve like agriculture's, where productivity keeps rising while the labour force contracts. It probably will turn that way eventually, but I don't expect to be around for it.