What real-world number-crunching application is the unicorn that couldn't be written in a tenth of the lines of code in Python, using NumPy with a pinch of DAAL or TensorFlow? And why is most AI research, a field not exactly light on number crunching, done in Python?
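To make the "tenth of the lines of code" claim concrete, here is a hypothetical example: fitting a line to a million noisy points. In C this would be explicit loops and accumulators; with NumPy the whole computation is a couple of vectorized lines (the data and the 3x + 1 model are invented for illustration).

```python
import numpy as np

# Synthetic data: a million points along y = 3x + 1 with a little noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = 3.0 * x + 1.0 + 0.1 * rng.standard_normal(1_000_000)

# Least-squares slope and intercept via the normal equations --
# two lines of NumPy instead of hand-written accumulation loops.
slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
intercept = y.mean() - slope * x.mean()
```

Under the hood those two lines dispatch to compiled C loops, which is why the brevity costs essentially nothing in speed.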
There are even games developed in Python (The Sims 4, EVE Online, Civilization IV, Pirates of the Caribbean Online...).
In my career, I have written my share of Assembly and C code for performance reasons, back when a 486 DX4 was the must-have among the shiny machines. But now compilers and 'interpreters' can do a better job than I can, because of the complexity of modern processors and their varying cache sizes. Today you can write highly optimized assembly code that runs at nearly the theoretical speed limit of a processor with a specific cache size, yet runs like molasses on a similar architecture with double the cache. An interpreter can adapt its runtime compilation to the architecture it runs on, and even, for the same algorithm, to the type of data you process.
Now, if your Python program is still too slow, you can rewrite your slowest functions in Cython and compile them, getting the kind of speed boost C would give you without spending the time to develop everything in C, which frees you up sooner for the next project.
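The workflow is the same whether the escape hatch is Cython or NumPy: profile, find the hot function, and rewrite only that one. A minimal sketch with an invented hot spot, a sum-of-squares kernel, shown first as the plain Python loop you'd find in a profile and then as the vectorized replacement:

```python
import numpy as np

def slow_sum_sq(values):
    # The original hot spot: a Python-level loop over every element.
    total = 0.0
    for v in values:
        total += v * v
    return total

def fast_sum_sq(values):
    # Drop-in replacement: one dot product in compiled code.
    a = np.asarray(values, dtype=np.float64)
    return float(a @ a)

data = list(range(1000))
assert slow_sum_sq(data) == fast_sum_sq(data)  # same result, far fewer bytecode ops
```

In a real project you'd locate `slow_sum_sq` with cProfile first, and a Cython version with typed variables would follow the same pattern: the rest of the program never notices the swap.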
At the beginning of my career, I would have used the best language for the job from a toolbelt of Assembly, C/C++, Fortran, Delphi, Lisp and Prolog. Now I use Python almost exclusively for my projects.