Several years ago (in 1992) I was involved in the rewrite of a large Fortran program that had been around for decades and had become slow and unmaintainable.
The physicists who had been working with the program were absolutely convinced that no language could match Fortran for speed. I did not believe that: operating systems at the time were written in C, and the only thing that beats C for speed is assembly. I ran several tests with bare scientific calculations in Fortran, C, and C++, and Fortran came out (surprise, surprise!) the slowest.
Then the physicists rewrote the Fortran application in C++, with some help from me on the object-oriented design. Not only was the replacement program faster, more maintainable, and easier to read, but we also discovered the original program's main problem: written in a time when RAM was scarce, and never revised since, it used files instead of in-memory arrays or any more modern data structure (or a database, which would be the first choice nowadays). The application was slow because it was designed under constraints that were no longer realistic. In the end, the replacement was only slightly faster at the number crunching, but immensely faster at the data handling. While the original program needed a day or more of very expensive supercomputer time to complete its work, the new program ran in just a few hours.
The sad truth is that, even now, many Fortran programs are fossils of an era when hardware constraints dictated the choice of data structures and algorithms, and they were never reviewed.