In ancient times when I learned programming, memory was scarce, allocation was static and linear, and there was no bounds checking, so understanding how memory was used was the key concept for a new programmer.
Learning assembler and reading "core" dumps to see what was going on in memory at the low level really helped in debugging Fortran programs. That awareness is still useful for applications where using a lot of memory is not feasible or efficient and for languages such as C that require deliberate memory management. Judging by the number of malware exploits and memory overflow fixes that are announced each month for popular systems, it's still a major issue.
So I agree that some exposure to assembler or C is useful for any serious programmer; after that, go ahead and take advantage of languages that clean up after you and trap memory errors, in suitable applications (for some definition of suitable!).