Back in the day, I/O was dreadfully slow. Think about 5 1/4" and 3 1/2" floppy disks and slow hard disks, and how long it could take to save a document. I can still hear the clunking and whirring in my head as the little activity LED blinks and the operating system grinds to a halt.
Now, with faster HDDs and even better SSDs, making "save" a separate, user-triggered operation doesn't make much sense. And with a jillion cores, you can easily offload the work of saving to another thread so the UI isn't interrupted. Look at iOS - how many apps have a "save" button at all? It's expressly discouraged in the Human Interface Guidelines, and iOS users have been happily plugging along without it for years.
I think the real shocker is that applications still use a 3 1/2" floppy disk as the save icon. It's just an anachronism now.
"Oh no, a virus has replaced all my Fourier transforms with Laplace transforms!"
You'd be surprised. These types of viruses are spreading at higher frequencies. It's only a matter of time.
It appears that PL/I (and its dialects) is, or will be, the most widely used higher level language for systems programming. -- J. Sammet