It's not even about optimizing the code; it's about making design choices that, from the outset, rule out fast code. People like to focus on the overhead of JITs, GCs, hidden object copies, etc., in many "modern" languages, but frankly, while those costs are real, the mindset these languages encourage is the worse problem.
Modern machines can lose a lot of performance to poor memory placement/allocation in a NUMA configuration, cache line ping-ponging, and so on. These are things you simply cannot control if your language cannot even guarantee a consistent location for the data in question.
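To make the cache-line point concrete, here's a minimal C sketch of false sharing, the usual cause of ping-ponging: two threads increment adjacent counters that share one cache line, versus counters padded onto their own lines. The names (ITERS, bump, etc.) are illustrative and the 64-byte line size is an assumption; compile with something like cc -O2 -pthread and time each run separately.

    /* Sketch of cache-line ping-ponging (false sharing), assuming a
     * 64-byte cache line and a POSIX system with pthreads. */
    #include <pthread.h>
    #include <stdint.h>
    #include <stdio.h>

    #define ITERS 100000000UL

    /* Shared layout: both counters land on the same cache line. */
    struct { uint64_t a; uint64_t b; } shared_counters;

    /* Padded layout: each counter gets its own cache line. */
    struct {
        _Alignas(64) uint64_t a;
        _Alignas(64) uint64_t b;
    } padded_counters;

    static void *bump(void *p) {
        volatile uint64_t *c = p;   /* volatile: keep the memory traffic */
        for (uint64_t i = 0; i < ITERS; i++)
            (*c)++;
        return NULL;
    }

    static void run(const char *label, uint64_t *a, uint64_t *b) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, bump, a);
        pthread_create(&t2, NULL, bump, b);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("%s done\n", label);
    }

    int main(void) {
        /* Time each run (e.g. with `time`); the shared layout is
         * typically several times slower on a multicore machine,
         * because the line bounces between the cores' caches. */
        run("shared (false sharing)", &shared_counters.a, &shared_counters.b);
        run("padded (own cache lines)", &padded_counters.a, &padded_counters.b);
        return 0;
    }

Note that this is exactly the kind of fix that's only available when the language guarantees data layout; in a language with a moving GC, you can't even promise the two counters stay put, let alone on separate lines.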
Let's not even talk about the horrors of HTML/JavaScript/CSS/AJAX/etc.
Now, all that said, a huge percentage of applications would be "fast enough" even if they were written in bash, running on an x86 emulator written in JavaScript, in Firefox, on a $50 tablet. That's simply because even the slowest hardware available today has 100x the performance of the machines of 15 years ago, which somehow managed to be useful without storing all their data in the "cloud" for the NSA to peruse.