Hardware aside, software reliability has shot up over the past few decades.
Even Windows is now very stable and secure; over the past decade I have actually seen more kernel panics from Linux than BSODs. We can keep servers running for months or years without a reboot. Our desktops, laptops, and even mobile devices now run without crashing all the time, and we work without feeling the need to save to the hard drive and then back up to a floppy or removable media every few minutes.
So what changes have happened since then at the software level?
1. Server-grade OSes on desktop and mobile devices. Windows, from XP on, uses the NT kernel; Macs and iOS use a Unix derivative; Android and GNU/Linux are Linux-based systems. All of these OSes were designed for server use, with proper memory management, real multi-tasking, and SMP support, making a lot of those silly crashes of yesterday a thing of the past.
2. Understanding and prevention of buffer overflows. We have known about buffer overflows for a long time, but they were only widely recognized as a security issue in the late 1990s. Newer languages and updates to existing compilers are now designed to prevent them, and the OS randomizes memory layout (address space layout randomization) to help reduce the risk.
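A rough sketch of the language-level half of this, with Python standing in for any memory-safe language (the function and buffer here are made up for illustration): a bounds-checked language refuses an out-of-range write instead of silently corrupting whatever sits next to the buffer in memory.

```python
# In C, writing past the end of a fixed-size buffer silently corrupts
# adjacent memory -- the classic buffer overflow. Memory-safe languages
# bounds-check every access and raise an error instead.

def write_at(buf, index, value):
    """Write one byte into buf; an out-of-range index is refused, not overflowed."""
    try:
        buf[index] = value
        return True
    except IndexError:
        return False

buffer = bytearray(8)                  # a fixed 8-byte buffer
assert write_at(buffer, 7, 0xFF)       # last valid slot: fine
assert not write_at(buffer, 8, 0xFF)   # one past the end: refused outright
```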
3. The "Try" and "Catch/Except" constructs. It is nearly impossible to break a complex program in every possible way during testing. Many old-schoolers claim the try/catch idea in modern languages leads to sloppy code, but it deals with the fact that the world isn't perfect and people make mistakes, and it gives you a clean way to exit your program or procedure on error conditions. Your program keeps running when things are not perfect, and when it does quit, your data is left in a clean state, preventing further corruption.
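A minimal sketch of that pattern in Python (the function and data are invented for illustration): the risky step is wrapped in try/except so one bad input gets recorded rather than crashing the whole run or half-corrupting the results.

```python
def parse_prices(raw_values):
    """Convert user-typed strings to floats; bad entries are logged, not fatal."""
    good, bad = [], []
    for value in raw_values:
        try:
            good.append(float(value))
        except ValueError:
            bad.append(value)   # the world isn't perfect: note it and keep going
    return good, bad

good, bad = parse_prices(["19.99", "oops", "5"])
assert good == [19.99, 5.0]   # valid data survives intact
assert bad == ["oops"]        # the mistake is captured cleanly
```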
4. The rise of server-based programming. We go through cycles of who should do the work: the server or the end-user device. How often do you actually need to download a program any more? I bet most of you don't remember software stores back in the 1990s, where you had to buy a program for everything. They were cheap $10 programs, or you could get a collection of shareware, but in general everything you wanted to do on your computer needed a program. If you wanted an electronic encyclopedia, you needed to buy one and store it on your computer. If you wanted a program with some forms and some calculations, you needed a program... This left you with a lot of programs on your computer, with conflicting versions of DLLs and shared libraries. Today a lot of these small programs are done over the Web, on the server, with JavaScript to make the UI clean. That means there is less stuff running on your desktop that could conflict.
5. The rise of virtualized systems. If you had a server, you used to put all your stuff on it, on the same OS, with a complex set of settings that made mistakes more likely. Virtualization lets you run a bunch of small systems, each with custom settings designed to do one job and do it well, versus one server trying to do many things.
6. The rise of interpreted and virtual-machine languages. Most code written today isn't compiled straight to machine code; it compiles to virtual-machine bytecode (Java/.NET) or is interpreted at run time (Python/Ruby/PHP). That gives the developer separation from the hardware. Yes, it slows things down, but it also prevents a lot of the oddities that happen when you build a program on a 32-bit vs. a 64-bit OS, or even the minor differences between a Core i5 and a Core i7.
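One concrete (made-up) example of that hardware separation: Python integers are arbitrary precision, so the same arithmetic gives the same answer whether the interpreter runs on a 32-bit or 64-bit OS, while a 32-bit C `int` would silently wrap around.

```python
n = 2_000_000_000 + 2_000_000_000    # past the 32-bit signed limit (2**31 - 1)
assert n == 4_000_000_000            # same answer on any OS the interpreter runs on

# What a 32-bit C int would have done: wrap around modulo 2**32.
wrapped = (n + 2**31) % 2**32 - 2**31
assert wrapped == -294_967_296       # the silently-overflowed C result
```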
7. The fall of easy-to-use languages. Back in the old days there were a lot of languages, like FoxPro and Visual Basic (not VB.NET), designed to be used by non-programmers. These languages are not used much any more, which means programming is now done by people who know how to program, not by people who just know how to use a computer. Programs written in the harder languages (hard meaning there is some design and thought behind them) are designed better and not as cobbled together.
8. Inexpensive databases. Back in the old days, a relational database cost thousands of dollars or more, too much for most people's uses. Now, with MySQL, Microsoft SQL Server, PostgreSQL, and the others, putting a relational database behind your program isn't going to break the bank, and you get a tool that allows for better collection, organization, and retrieval of data.
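SQLite isn't on the list above, but since it ships free inside Python's standard library it makes the cheapest possible demonstration of the point; any of the databases named would work the same way through their own drivers. The table and rows here are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")   # an in-memory database, just for the example
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO customers (name) VALUES (?)",
                 [("Alice",), ("Bob",)])

# Collection, organization, and retrieval -- one declarative query.
rows = conn.execute("SELECT name FROM customers ORDER BY name").fetchall()
assert rows == [("Alice",), ("Bob",)]
conn.close()
```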
9. Google search and widely available broadband internet. Back in the old days of coding, you had to use your reference book or the help built into the software. But most programmers write programs that do something different from what the compiler makers had in mind, so you had to figure out a workaround. Today, if you hit something that seems overly complex, you do a Google search and can usually find a better way to do it. The old way, you just did it the bad way; it worked, so you kept it.
10. Fast processors, lots of RAM. In the old days everything needed to be optimized. That added more time to your programming and opened your code to a lot of bugs, because you stripped out as many checks as you could get away with. You also had to keep an eye on the RAM you were using and write methods to store data and retrieve it over and over again. Today you wouldn't think twice about holding a few megs of data in RAM, processing it, and then just saving the output to disk. Easier code, fewer errors.
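A small sketch of that mindset shift (the file name and sizes are invented for illustration): read the whole dataset into RAM, process it, write the result once, with none of the old load/flush bookkeeping.

```python
import os, tempfile

# Build a sample data file of 100,000 numbers -- a few hundred KB, which
# would have been a careful juggling act in the old days.
path = os.path.join(tempfile.mkdtemp(), "data.txt")
with open(path, "w") as f:
    f.write("\n".join(str(i) for i in range(100_000)))

# Today: slurp it all into RAM at once and just process it.
with open(path) as f:
    numbers = [int(line) for line in f]

total = sum(numbers)
assert total == 4_999_950_000        # sum of 0..99,999
with open(path + ".out", "w") as f:  # save only the final output to disk
    f.write(str(total))
```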