I mean, classic Mac OS, for example, didn't have any kind of memory separation. Applications were given statically assigned memory partitions, but nothing stopped them from writing into each other's memory, so one misbehaving application could corrupt or freeze the entire system. That's one of the reasons Mac OS was nearly unusable for any web browsing around System 6 and 7. On top of that, it was emulating 68k code on the PowerPC platform back then.
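To make concrete what the lack of protection meant, here's a minimal C sketch (the target address is arbitrary, picked purely for illustration): on a protected-memory OS the wild write below is trapped by the MMU and only this one process dies, whereas classic Mac OS had no such trap, so the same store would silently land in whatever happened to own that address.

    #include <stdio.h>

    int main(void) {
        /* An address this process does not own -- arbitrary,
           chosen for illustration only. */
        long *wild = (long *)0x00ABCDEF;

        printf("writing through a wild pointer...\n");

        /* Protected-memory OS: the MMU traps this store and only
           this process is killed (SIGSEGV). Classic Mac OS: no
           trap, the store silently corrupts another application's
           (or the system's) memory. */
        *wild = 42;

        printf("never reached on a protected-memory OS\n");
        return 0;
    }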
Then came Mac OS X, which took an ancient version of some BSD and replaced all the good bits with proprietary stuff. Even Mac OS X 10.3 was hardly usable: it worked for a while, but after a week of uptime it became increasingly sluggish.
Software quality was never particularly good at Apple. They always just competed with Microsoft, not against any meaningful quality standard.
The same goes for hardware. Logic board failures were common during "evil Steve's" reign, and Macs became much more fragile than the industry standard. Batteries were glued in. Hard disks were really hard to replace. Even products like the Apple AirPort had design flaws that led to mass breakdowns.
I guess the reason this now looks like a sudden decrease in quality is that the "reality distortion field" is gone. Apple is no longer the underdog that invested a significant share of its money in engineering; particularly since "evil Steve", it has been a marketing-driven company.