Memory management isn't taught so much these days because: what in the hell are you going to write that's going to use 16GB of RAM?
In all seriousness though, memory management mattered most when software and hardware were on par with one another. At this point, (IMO) hardware has outpaced software. I defy you to find a program my gaming PC can't run whilst I simultaneously play Crysis and browse Slashdot. On top of that, most high-level languages are incredibly forgiving about bad memory management, thanks to good built-in garbage collection. Sure, it helps to know the grittier, low-level, less-abstracted stuff; understanding what your computer is doing at that level would make you exceptionally efficient. But if you're an experienced programmer who understands best practices and can competently scour an API for the most efficient classes and methods for your object-oriented code, I see no reason why you shouldn't be seen as a "real" programmer. Anyone who says otherwise needs to get with the times...
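To put a finer point on what "forgiving" means here: in a garbage-collected language you just drop the reference and the runtime cleans up, while in C the bookkeeping is entirely on you. A minimal sketch of that difference (the function and names are mine, purely for illustration):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* In a GC'd language (Java, C#, Python...) you'd just let the copy go
 * out of scope and the runtime would reclaim it. In C, forgetting the
 * free() below leaks the allocation on every call. */
static char *duplicate_name(const char *name)
{
    char *copy = malloc(strlen(name) + 1);
    if (copy == NULL)
        return NULL;        /* allocation can fail; GC'd languages usually throw instead */
    strcpy(copy, name);
    return copy;            /* caller now owns this memory */
}

int main(void)
{
    char *player = duplicate_name("gordon");
    if (player != NULL) {
        printf("hello, %s\n", player);
        free(player);       /* the step a garbage collector does for you */
    }
    return 0;
}
```

Forget that one free() inside a loop and you've got a leak that a GC'd language simply wouldn't have for this kind of object.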
If you only program on Windows then you are right. Welcome, however, to the real world, where Windows runs on only a small fraction of the computing devices in existence, and the vast majority of them do in fact require manual memory management to function. Bad advice is, in the end, bad advice.
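For anyone who has only seen the desktop side: on a lot of embedded targets you don't even get a heap you can trust, so memory is managed out of fixed, compile-time buffers. A rough sketch of the kind of static pool you end up writing (sizes and names invented):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical firmware message buffers: all memory is reserved at
 * compile time, so total usage is known exactly and can never fragment. */
#define MSG_SLOTS 8
#define MSG_SIZE  64

static uint8_t msg_pool[MSG_SLOTS][MSG_SIZE];
static uint8_t msg_used[MSG_SLOTS];   /* 1 = slot in use */

/* Grab a free slot, or NULL if the pool is exhausted. */
static uint8_t *msg_alloc(void)
{
    for (size_t i = 0; i < MSG_SLOTS; i++) {
        if (!msg_used[i]) {
            msg_used[i] = 1;
            memset(msg_pool[i], 0, MSG_SIZE);
            return msg_pool[i];
        }
    }
    return NULL;   /* on a device, *you* decide what running out means */
}

/* Return a slot to the pool. */
static void msg_free(uint8_t *msg)
{
    for (size_t i = 0; i < MSG_SLOTS; i++) {
        if (msg == msg_pool[i]) {
            msg_used[i] = 0;
            return;
        }
    }
}

int main(void)
{
    uint8_t *m = msg_alloc();
    if (m != NULL) {
        m[0] = 0x42;    /* use the buffer */
        msg_free(m);
    }
    return 0;
}
```

No garbage collector is coming to save you there; if msg_free never runs, the pool just silently fills up and msg_alloc starts failing.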
Funny thing about games, too: it turns out that if you were writing games, even on the Xbox, your employer would fire you for following that advice. You should keep it to yourself.
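To back that up: one big reason console programmers care is that allocating mid-frame, or letting a garbage collector pause the world, can blow a ~16 ms frame budget, so engines pre-allocate pools and recycle slots instead. A toy version of that pattern (all names invented):

```c
#include <stdio.h>

/* Toy particle pool: everything is allocated once at startup and reused
 * every frame. No malloc/free (and no GC pause) inside the frame loop. */
#define MAX_PARTICLES 1024

typedef struct {
    float x, y, vx, vy;
    int   alive;
} Particle;

static Particle particles[MAX_PARTICLES];

/* Recycle a dead slot instead of allocating a new particle. */
static Particle *spawn_particle(float x, float y)
{
    for (int i = 0; i < MAX_PARTICLES; i++) {
        if (!particles[i].alive) {
            particles[i] = (Particle){ x, y, 0.0f, -1.0f, 1 };
            return &particles[i];
        }
    }
    return NULL;   /* pool full: drop the effect rather than stall the frame */
}

static void update_particles(float dt)
{
    for (int i = 0; i < MAX_PARTICLES; i++) {
        if (particles[i].alive) {
            particles[i].x += particles[i].vx * dt;
            particles[i].y += particles[i].vy * dt;
            if (particles[i].y < 0.0f)
                particles[i].alive = 0;   /* slot goes back to the pool */
        }
    }
}

int main(void)
{
    spawn_particle(0.0f, 5.0f);
    for (int frame = 0; frame < 10; frame++)
        update_particles(1.0f / 60.0f);
    printf("done\n");
    return 0;
}
```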