I think he got it wrong about why we got lost.
It's not because we didn't or don't know. It's because software was free back then. Hardware was so bizarrely expensive and rare that no one gave a damn about giving away software and software ideas for free. It's only when software was commercialised that innovation in the field started to slow rapidly. The interweb is where it was 18 years ago simply because ever since then, people have been busy round the clock trying to monetise it rather than ditching bad things and trying new stuff.
Then again, x86 winning as an architecture and Unix winning as a software model probably have a little to do with it as well. We're basically stuck with early-'80s technology.
The simple truth is:
CPU and system development needs its iPhone/iPad moment - where a bold move is made to ditch decades-old concepts to make way for entirely new ones!
Look what has happened since Steve Jobs and his crew redid commodity computing with their touch toys. Imagine that happening with system architecture - that would be awesome. The world would be a totally different place five years from now.
Case in point: we're still using SQL (Apollo-era software technology for secretaries to manually access data - SQL is a fricking END-USER INTERFACE from the '70s!!!) as a manually built and rebuilt access layer to persistence from the app level. That's even more braindead than keeping raw binary in favour of ASM, the example given in the OP's video talk.
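To make the "manually built access layer" point concrete, here's a minimal sketch using only stdlib Python and sqlite3, with a hypothetical users table: the persistence layer is literally SQL strings hand-written in application code, rebuilt by hand every time the schema moves.

```python
import sqlite3

# Hypothetical example: every table and query is spelled out by hand at the app level.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

# The "access layer" is just SQL strings living in application code.
conn.execute("INSERT INTO users (name, email) VALUES (?, ?)", ("ada", "ada@example.org"))
conn.commit()

# Change the schema and you get to rewrite every one of these strings by hand.
for row in conn.execute("SELECT name, email FROM users WHERE name = ?", ("ada",)):
    print(row)
```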
Even ORM to hide SQL is nothing but a silly crutch from 1985 (a quick sketch of what I mean follows below). Java is a crutch to bridge across platforms, because since the mid-'70s people in the industry have been fighting turf wars over their patented platforms and have basically halted innovation (MS, anyone?). The skeuomorphic desktop metaphor is a joke - and always has been. Stacked windowing UIs are a joke and always have been. Our keyboard layout is a stopgap from the steam age, from before the zipper was invented (!!). E-mail - one of the most bizarre things still in widespread use - is from a time when computers weren't even connected yet, with different protocols for every little thing it does, and bizarre, pointless, braindead and arcane concepts like the separation of MUA, editor, and separate protocols for sending and receiving - a human async communication system and protocol so bad it's outclassed by a shoddy commercial social networking site running on web scripts and browser-driven widgets - I mean, WTF??? Etc. I could go on and on...
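Here's a rough sketch of that ORM crutch, assuming nothing beyond stdlib Python and sqlite3; the hypothetical User class just hides the same hand-built SQL strings behind an object.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

class User:
    """Hypothetical 'mapped' object: an ORM-style veneer over a plain row."""
    table = "users"
    fields = ("name", "email")

    def __init__(self, name, email):
        self.name, self.email = name, email

    def save(self, conn):
        # Underneath, the "ORM" is still just assembling the same SQL string.
        cols = ", ".join(self.fields)
        marks = ", ".join("?" for _ in self.fields)
        conn.execute(f"INSERT INTO {self.table} ({cols}) VALUES ({marks})",
                     tuple(getattr(self, f) for f in self.fields))

    @classmethod
    def find_by_name(cls, conn, name):
        row = conn.execute(f"SELECT name, email FROM {cls.table} WHERE name = ?",
                           (name,)).fetchone()
        return cls(*row) if row else None

User("ada", "ada@example.org").save(conn)
conn.commit()
print(User.find_by_name(conn, "ada").email)
```

Whether you hand-roll it like this or pull in a full-blown ORM library, the SQL doesn't go away - it just gets generated for you.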
The only thing that isn't a total heap of shit is *nix as a system, and that's only because everything worth calling Unix today is based on FOSS, where we can still tinker and move forward with baby steps like fast, no-bullshit non-tiling window managers, fully OpenGL-accelerated avant-garde UIs (I'm thinking of Blender here), a workable userland/OS separation, and a matured way to handle text-driven UI, interaction and computer control (zsh & modern bash).
That said, I do believe that if we came up with a new, entirely FOSS hardware architecture in 2013 - a complete redo focused on massively parallel concurrency - and built a logic-and-constraint-driven, touch-based direct-manipulation interface system on top of it - think Squeak.org completely redone today for a modern retina touch display, *without* the crappy desktop - one that does away with the filesystem/persistence split and other ancient dead ends, we'd be able to top and drop *nix in no time.
We wouldn't even miss it...
But building the bazillionth web framework, the next half-assed X.org window manager and/or accompanying Windows clone, or redoing the same audio-player app / file manager / desktop UI toolkit from bottom to top every other year appears to be more fun, I guess.
My 2 cents.