The most recent one was a custom SQL cursor in Oracle EBS. The fix: add an index, refactor some correlated subqueries, and create a cut-down version of the complex view it was using.
Another example, from a few years back, was when I reimplemented an FTP process to retry each individual step instead of reverting to the first step on failure. Given that each step had about a 50% chance of failure on a bad day, and each script had about 20 steps, it meant the whole thing was failing... (runs calc...) 99.9999% of the time. OK, I'm exaggerating a little: maybe it wasn't 50%, and only a few of the jobs had as many as 20 steps. It was about ten years ago and I forget the details, so my ego may be filling in the gaps. But it did mean we no longer needed a guy sitting at a screen hitting "Retry" all day long, and we could get file sets deployed in a couple of minutes instead of taking all day. The FTP was being done by a proprietary tool, so I had to implement my own system to parse its manifest.
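The gain from retrying in place is easy to see with a quick simulation. This is just a sketch, not the original proprietary tool; the step count and failure rate are the rough figures from the story, and the function names are mine:

```python
import random

def run_step(success_prob: float, rng: random.Random) -> bool:
    """Simulate one FTP step that succeeds with the given probability."""
    return rng.random() < success_prob

def attempts_restart_from_scratch(n_steps: int, success_prob: float,
                                  rng: random.Random,
                                  max_attempts: int = 1_000_000) -> int:
    """Old behaviour: any failed step reverts the whole script to step 1.
    Returns the number of full-script attempts before one run succeeds."""
    attempts = 0
    while attempts < max_attempts:
        attempts += 1
        if all(run_step(success_prob, rng) for _ in range(n_steps)):
            return attempts
    return attempts  # gave up; with 20 steps at 50% this is the usual outcome

def steps_with_per_step_retry(n_steps: int, success_prob: float,
                              rng: random.Random) -> int:
    """New behaviour: retry only the failing step, never start over.
    Returns the total number of step executions needed to finish."""
    total_tries = 0
    for _ in range(n_steps):
        while True:
            total_tries += 1
            if run_step(success_prob, rng):
                break
    return total_tries
```

With a 50% per-step failure rate, a 20-step script completes end-to-end with probability 0.5^20, about one in a million, which is where the 99.9999% figure comes from. Retrying each step in place needs on average only two tries per step, so the whole run takes around 40 step executions instead of effectively never finishing.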
And then there was the Excel spreadsheet that was massively bigger than it should have been. Everyone's system ground to a halt every time they opened it. No-one could figure out why it was so big; I spent an afternoon trying the obvious things and gave up. Then inspiration hit, and I wrote some VBA to count the "shape" objects in each sheet. There were millions of them. Someone had put boxes around a bunch of cells, and those boxes had somehow been shrunk down to one pixel and replicated thousands of times. A quick VBA procedure to delete all box shapes, and bingo, some people could do their jobs again.
A lot of people don't realise that computers don't behave the way we expect: we have an intuitive grasp of the laws of physics, but information is not physical and does not obey the same laws. There are infinities and paradoxes and undecidable problems that are hard to grasp. Minds that can intuitively navigate this space are few and far between.