Maybe I mostly remember the slings and arrows -- those so-called BASIC program listings that were about eight lines of actual readable (and thus re-writeable) BASIC code, with the rest of the page or pages being DATA statements full of numbers. Then the PCs came, and we could, if sufficiently masochistic, type in similar listings to use with DEBUG.EXE. Later, as software grew larger, there soon came the need to faff about with config.sys and autoexec.bat so that available memory was maximized. From the late 1980s onwards there was the expanded-memory nonsense too, and more and more options and things in config.sys. Then there were jumper settings for DMA channels, port addresses and interrupt lines on the various plug-in cards in the PCs. This continued well into the 1990s, until it got replaced by something called Plug-and-Play, which sometimes worked and sometimes didn't, so everyone called it "Plug-and-Pray". And all on the original 640K plus whatever high memory had been put into place. I do not miss any of this. TFS mentions the dreariness of business computing. They are absolutely right!
But I might not be typical -- I started by learning FORTRAN, after which BASIC seemed primitive (no functions? and thus no data hiding? I have to make sure I don't re-use any of the variable names anywhere else? and only one letter? at least FORTRAN allowed me six! bah). But the PC compatibles had Turbo Pascal, and there was also the assembler and, later, Turbo C, so that became a nice set-up, with direct control of the pins on the parallel and serial ports, and even some DIY card with A-D converters! Yay!
Then there were the wonderful Unix systems, HP-UX and AIX, back around the mid-1980s, where you could actually do more than one thing at a time without the machine crashing. And even if your program decided to hang, or accessed some memory out of bounds, it would say "bus error" or "segmentation fault" and stop, but the rest of the system, including other programs, would continue happily along as if nothing had happened. These even had networking, so we could have programs on one machine talk with programs on another machine.
Of course this didn't last. Those Unix systems were way too expensive. Instead, Windows NT happened, bringing a form of multitasking and eventually even a useful networking system (TCP/IP is useful; all the other weird and wonderful variants turned out not to be). Direct access to the parallel port vanished, while support for the serial ports became increasingly wobbly. ISA, EISA, Micro Channel, and MS-DOS became dinosaurs soon after; parallel and serial ports followed, branded "legacy". And like the dinosaurs, some of their descendants are still around now: RS-232 serial ports never really went away completely. USB came, but turned out not to be as hacker-friendly as those serial ports -- there is a reason everyone today runs (RS-232 style) serial via USB, using a pl2303 or FTDI or similar chip, to talk and listen to the UART in their SBC or microcontroller board.
There was a sort of dark age, of PCs running clunky MS-DOS or slightly less clunky Windows, until the latter half of the 1990s, when Linux distros became easily available, and so good that they actually worked on whatever reasonably random PC hardware was around, and all the good old Unix ways of doing things finally became economically feasible, initially on PCs, many of them second-hand. Around the middle of the 2000s the first single-board computers started showing up, and some of these are now becoming as understandable and well documented as those old 8088 PCs with their MS-DOS once were.
To some extent we are in a golden age right now.