Nearly all code is "bad" to one degree or another. There are some very good reasons for this.
1. As Steve Jobs said, "real artists ship". If you're in a research lab it may be a different matter, but in the real world your code is never perfect, yet sooner or later you need to let real people use it, and either make money or discover that you're trying to solve a problem that doesn't exist. If the former happens, you should go back and improve your code. If the latter happens, be glad you didn't design the perfect solution to a non-existent problem, and go solve real problems instead.
2. The human race moves forward. Hardware gets faster, cheaper, and smaller, and uses less energy. Operating systems, browsers, and languages gain new features that make them easier to develop for, and become more efficient through hardware acceleration, better optimizations, and so on. You may have written amazing code 10 years ago, but it is "bad" code today because there are now better ways to do things that didn't exist at the time. Over time you should update your code to take advantage of this, though you may be limited by the need to support legacy platforms.
3. You aren't the same person you were 10 years ago. We are continuously learning new things, fixing old mistakes, and making new ones. When you look back at your old code, you should find it "bad" in at least some ways. If you don't, it means you haven't learned anything since you wrote it, which is a shame.
4. You don't know everything, and you don't have to. That's why you hire smart people: so you can focus on your strengths while outsourcing the rest, and so you can learn how others approach problems differently than you do.
So instead of trying to defend your territory, as others have mentioned, I'd ask him for specific ways in which the code is "bad" and could be improved, and then include him in triaging those tasks against the rest of the team's workload.