If the (then) future hadn't gone the way that it did
If there is anything I have learned, it is that if something can go wrong, it WILL go wrong. Don't let it go wrong in the first place. Spend the extra second and think about your code before you vomit it into production. It isn't "premature optimization" until it makes the code some degree harder to understand than the non-optimized version. Most of what people call "premature optimization" is really just writing correct code. Should I use String.EndsWith or String.Contains to check whether a string has a certain suffix? Should I use an explicit cross join and filter in the WHERE clause, or use inner joins? These are examples of the level of boneheadedness at which most people write code. Don't get me started on multi-threading. If you don't find it intuitive, please don't do it.

Most of the time I can make someone's code several factors faster with virtually zero effort while making it easier to read. And when I say "faster" I don't mean micro-benchmarking the code in simplistic situations; I mean production, where the entirety of the code is being called many times on many threads. Sure, your "read the entire file into memory and write it back out" works fine in your simple tests, but throw it into production and the 400MiB files put massive pressure on memory when there are hundreds of jobs running at the same time, all trying to load entire files into memory before writing them out. STREAM the files.

Then there is the issue that good code often runs slower in synthetic benchmarks but faster in production. Everyone focuses on empirical evidence, but then compares apples and oranges by assuming their dev boxes or servers are the same as prod, when those machines are only running their code in tight loops.
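The "stream, don't slurp" point above can be sketched in Python. This is a minimal illustration with hypothetical filenames, assuming a simple copy-style job; the contrast is in peak memory, not the transform itself:

```python
def transform_slurp(src_path, dst_path):
    # Anti-pattern: reads the whole file into memory at once.
    # A 400MiB input costs ~400MiB of RAM per concurrent job.
    with open(src_path, "rb") as src:
        data = src.read()
    with open(dst_path, "wb") as dst:
        dst.write(data)

def transform_streamed(src_path, dst_path, chunk_size=64 * 1024):
    # Streams the file in fixed-size chunks, so peak memory stays
    # near chunk_size no matter how large the file is or how many
    # jobs run at once.
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk)  # per-chunk processing would go here
```

Both produce identical output; only the streamed version keeps working when hundreds of these jobs run on the same box.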
Bad code and bad coders exist. The bell curve suggests that roughly half of all developers, and half of all code, fall on the wrong side of the median. I may be one of them, who knows.
I don't disagree with your assessment. As I've learned in my years of working as a sysadmin and developer, there's precious little code that couldn't use some sort of work for readability's and/or performance's sake. Many of us (myself included) have lost sight of the forest for the trees from time to time. Sometimes you get people doing things they've never had to do before. Sometimes you wind up with a problem out of your league that has to be solved anyway. It happens. Problems still have to get solved and code still has to get written, though.
The other thing I've learned is that "Delivered puts food on the table. Perfect is the enemy of delivered." I always try to push the best code I can out the door, but sometimes I just don't have the experience or knowledge to write the perfect solution to everything I work on. I think everyone runs up against this from time to time. Anyone who claims otherwise is either lying or has stagnated in their career/work.
The best way I know to AVOID pushing crap out the door is to have a peer review done, preferably by a peer who is better than you. Many eyes and all that. For many years I had no peers to review my work and catch the potential problems, since I was working as a one-man code shop.
That said, to address your example: slurping an entire file into RAM is a pretty dumb idea if the file can be processed in a streaming window instead. I'm sure many people have "wow, that's a dumb idea" horror stories they can impart, myself included. Fix the problem, move on. If the dev who wrote the offending code is available, ask them what they were trying to do. Make it a teachable moment. Make the world better. [insert something about sunshine and rainbows here].
Or not. I'm just another user account on Slashdot.