I'll bet most people today are interpreting this as simply connecting computer peripherals.
Quite right. Back in the day, it was idiot-simple -- my 10-year-old daughter could take a stray motherboard, any old case, a power supply, a hard drive, a CD-ROM drive, and any random SVGA card, plug them all together, and have a working computer in less than an hour. Every part was interchangeable with every other part, so we could cannibalize old computers and reuse parts as needed.
Now, not so much. First there was the pain of going from AT to ATX: I couldn't afford to replace all my components at once, but anything I bought new was ATX and wouldn't work in my AT box. Then you couldn't count on every type of SIMM fitting every style of motherboard, as DIMMs took over. And processors... wow. Time was, I could put any x86 chip into the socket on the motherboard and boot right up. The last time I tried that, CPUs had just started running dangerously hot.

Up to then, my assembly sequence included plugging in the monitor and keyboard and going into CMOS setup to make sure the memory and CPU were being recognized. This time I had myself a new motherboard and a new CPU (bought separately, from different companies) and went through my usual routine -- popped the CPU into the socket, inserted a couple of SIMMs and the SVGA card, attached monitor and keyboard, and powered on. I got into CMOS all right... for about 10 seconds, and then smoke and sparks started shooting out of the motherboard. The $150 CPU had totally melted and took the $65 motherboard with it.
For the past 5+ years, I've gone straight barebones: get the CPU, RAM, mobo, and power supply all bundled together and pretested, then separately buy a video adapter (if onboard isn't fast enough for me), drives, and whatever other trinkets I need to pimp out the hardware. No more explosions, and rarely any more blood sacrifices. I miss the days of DIY, but I'm getting too old for that kind of excitement.