Just because a piece of software is old doesn't mean it suddenly doesn't do its intended function.
It usually does, because the intended function changes over time. This is particularly true for business software (COBOL's niche), where regulatory requirements change over time and as companies grow to cover more jurisdictions, where accounting best practices change, where the company structure changes, and so on. Eventually you get to the point where the software was originally designed to do something so totally different to what it's doing now that it may make more sense to rewrite it than to keep adding hacks.
Could you ever imagine pro video editing (i.e. Adobe Premiere / After Effects) 100% within Chrome
Depends. With WebGL / WebCL, I can imagine previewing effects there quite easily. I can also imagine that it would be nice to be able to do the real rendering runs on a rack somewhere else. The more difficult part is moving the multiple GBs of data between the two. Possibly uploading the raw source data to the server, keeping a local copy, and just syncing the non-destructive editing instructions would work.
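To make the last point concrete, here's a minimal sketch of what "syncing the editing instructions" could look like. The op shapes and `applyEdit` are illustrative, not any real editor's API: edits are small JSON instructions, so a session's worth of them is a few KB regardless of how many GB the source footage is, and a server holding the raw source can replay them.

```javascript
// Sketch: non-destructive editing as a list of small instructions.
// Only these ops cross the network; the raw video never has to.
function applyEdit(timeline, op) {
  switch (op.type) {
    case "addClip":
      return [...timeline, { source: op.source, in: op.in, out: op.out }];
    case "trim":
      return timeline.map((clip, i) =>
        i === op.index ? { ...clip, in: op.in, out: op.out } : clip);
    default:
      throw new Error("unknown op: " + op.type);
  }
}

// A few KB of JSON describes the whole edit, whatever the footage size.
const ops = [
  { type: "addClip", source: "raw/take1.mov", in: 0, out: 1200 },
  { type: "trim", index: 0, in: 90, out: 1100 },
];
const timeline = ops.reduce(applyEdit, []);
console.log(JSON.stringify(timeline));
```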
The "problem of needing offline access" most certainly has not been solved
Note that HTML5 does allow effectively unlimited local data storage (the policy is set by the user) and applications that run completely disconnected. It's possible to write a web app that uses the browser for the UI, but only uses the network for software updates.
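A minimal sketch of that pattern: in a browser this would be backed by the HTML5 Web Storage API (`localStorage`), which persists data on the user's machine with no network involved. Here a plain object stands in for `localStorage` so the sketch runs anywhere; `saveDraft`/`loadDraft` are made-up names for illustration.

```javascript
// In-memory stand-in for localStorage (same setItem/getItem contract).
const storage = {
  data: {},
  setItem(k, v) { this.data[k] = String(v); },
  getItem(k) { return k in this.data ? this.data[k] : null; },
};

// Save and restore work entirely locally; the network is never touched.
function saveDraft(id, text) {
  storage.setItem("draft:" + id, JSON.stringify({ text, savedAt: Date.now() }));
}

function loadDraft(id) {
  const raw = storage.getItem("draft:" + id);
  return raw ? JSON.parse(raw) : null;
}

saveDraft("report", "quarterly numbers");
console.log(loadDraft("report").text); // "quarterly numbers"
```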
You can show me the micro-benchmarks all day long; doesn't change the fact that a complex UI in JavaScript is vastly slower.
You're conflating JavaScript and DOM. With FTL, JavaScriptCore can run C code compiled via Emscripten to JavaScript at around 60% of the speed of the same C code compiled directly. That's not a huge overhead (40% is roughly one CPU generation, or a C compiler from five years earlier). Transitions from JavaScript (or PNaCl compiled code) to the DOM, however, are very expensive. This is why a lot of web apps just grab a canvas or WebGL context and do all of their rendering inside that, rather than manipulating the DOM. Optimising the DOM interactions without sacrificing security is quite a difficult problem.
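The canvas trade-off can be caricatured like this. `domCall` is a made-up stand-in for any JS-to-DOM transition (whose real cost varies by engine), not a real API; the point is just the count of boundary crossings.

```javascript
// Each domCall() represents one expensive JS→DOM transition.
let crossings = 0;
const domCall = () => { crossings++; };

// DOM-style rendering: touch the DOM once per item
// (think appendChild / setAttribute per element).
function renderNaive(items) {
  items.forEach(() => domCall());
}

// Canvas/WebGL-style rendering: do all the work on the JS side,
// then cross the boundary once for a single draw call.
function renderBatched(items) {
  const scene = items.join(""); // build everything in pure JS
  domCall();                    // one transition, however big the scene
}

renderNaive(new Array(1000).fill("x"));
console.log(crossings); // 1000
```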
No, they are again. During the Pentium 4 era, they were behind on pretty much every metric. They only survived because of name recognition and AMD not having the production capacity to take more than about 20% of the market share. At the mid to low end, an Athlon system with the same performance was cheaper than anything Intel sold. At the high end, Opterons were roundly trouncing Xeons in absolute performance and performance per dollar.
The Pentium M was when things started to turn around for Intel - the laptop market started to grow rapidly and AMD was only just competitive on performance per watt, but didn't have the laptop motherboard makers on board. With the Core 2, Intel retook the performance crown.
"Only the hypocrite is really rotten to the core." -- Hannah Arendt.