I've been doing this for a few years, and the one gap I see more and more doesn't actually have anything to do with programming techniques, "design patterns," or anything else that's hugely technical. All of those things are well-known and widely accepted, and you can always find someone pushing one of them as the be-all and end-all of programming.
The one gap you're likely to have as a self-taught programmer is in the _history_ of computer science. A lot has happened, and people keep rediscovering and re-solving the same problems, never realizing those problems were encountered and solved decades ago. (An example that's particularly relevant to me at the moment: extent-based file systems. ext4 has extents, and so do a number of newer file systems. Great idea, right, particularly for large file systems? Thing is, extent-based file systems have been in use at least since the 1970s in mainframe operating systems. Odd that it took 40 years to get them into Unix.)
But don't feel bad that your self-teaching skipped the history of computing. Most university computer science programs neglect that bit of background as well, in favor of jumping straight into C++ or Java.
Maybe I'm an old fart, but the half-semester of computing history I took back in 1981 made a small but significant improvement in my ability as a software engineer.