While writing out the tedious models and actually proving the run time is a ridiculous waste of time in many circumstances, most good developers (with an actual CS background) can come up with Big O, Theta, and Omega bounds with only a little analysis of the algorithm in question. I personally consider it at all times when writing and modifying code, but I do the complexity in my head for a good ballpark more than anything. Is it always necessary? Probably not, but given the high-availability requirements and tight timing budgets of the systems I work on, I would rather do this up front than crash a damn airport system and potentially delay thousands of passengers and flights.
I went through all that way back in algorithms in college, and I had a professor with a fucking math doctorate whose dissertation and research focus were algorithmic run-time analysis. I have actually written out a full proof maybe twice since then (one of those was just a thought exercise) and used a recurrence relation once. I don't advocate anyone go through that hell on the regular, but one should be familiar enough to come up with at least a rough time (and in some cases space) complexity from a reasonable analysis of the code.
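For anyone rusty on what that back-of-envelope analysis looks like, here's a minimal sketch (my own example, not anything from a real codebase): merge sort, whose recurrence T(n) = 2T(n/2) + O(n) solves to O(n log n). You rarely need to solve the recurrence formally; annotating the code like this usually gets you the ballpark.

```python
def merge_sort(xs):
    # Recurrence: T(n) = 2T(n/2) + O(n)  ->  O(n log n)
    if len(xs) <= 1:               # base case: T(1) = O(1)
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])    # T(n/2)
    right = merge_sort(xs[mid:])   # T(n/2)

    # Merge step: each element is copied exactly once -> O(n)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```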
Now I do agree with many people that layering in 75 lines of code to reduce the run time from O(n log n) to O(n) is probably a waste most of the time, especially considering how unmaintainable that algorithm would likely become.
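A classic instance of that trade-off (again my own illustration, not anything from this thread): picking a median. The sort-based version is one obvious line at O(n log n); the expected-O(n) quickselect alternative is noticeably more code with more ways to get it wrong, for a win that rarely matters.

```python
import random

def median_simple(xs):
    # O(n log n), trivially readable -- usually the right call
    return sorted(xs)[len(xs) // 2]

def quickselect(xs, k):
    # Expected O(n), but more code, recursion, and edge cases to maintain.
    # (This sketch also uses O(n) extra space per level for clarity.)
    pivot = random.choice(xs)
    lo = [x for x in xs if x < pivot]
    eq = [x for x in xs if x == pivot]
    hi = [x for x in xs if x > pivot]
    if k < len(lo):
        return quickselect(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return quickselect(hi, k - len(lo) - len(eq))

def median_fast(xs):
    return quickselect(xs, len(xs) // 2)

print(median_simple([7, 1, 5, 3, 9]))  # 5
print(median_fast([7, 1, 5, 3, 9]))    # 5
```

Unless profiling shows the sort is actually the bottleneck, the one-liner wins on maintainability.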