My main desktop has 24GB; my home-away-from-home machine has 12GB, and it's painful.
I don't think you realize how demanding certain tasks are - my Photoshop CS5 process usually idles at 15GB. I actually put some composited art pieces on the back burner because they simply weren't possible, even when working with collapsed layer copies - the free transforms necessary for certain stitches (often freehand, because I often value human perception over "properly" corrected edge convergence) caused CS5 x64 to die in seconds on my i7/6GB setup. 24GB - and give me 240, I'll use it - was the difference between having something in my head and having something complete.
And that was merely 10,500x3,700/16-bit, but it was made of 15 (20+MP/16-bit) images selectively mixed, plus an hour or two of clone painting. True story: when I installed Adobe Lightroom 3 (an image collection manager), I realized it had switched the "Open in Photoshop" link to the 32-bit version... after I tried, four or five times, to open a flattened preprint image and failed with RAM errors. I had just moved up from 6GB and was still used to seeing those errors, so I sat there reclicking like an idiot... before it clicked.
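To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The layer counts, undo depth, and the assumption of uncompressed full-canvas buffers are mine, not Photoshop's actual internals (which tile and compress), but the order of magnitude shows why 6GB wasn't enough:

    # Rough memory math for a layered 16-bit composite.
    # Assumptions (not Photoshop's real internals): uncompressed RGB,
    # 2 bytes per channel, every layer stored at full canvas size, and
    # full-size undo/working buffers held during a free transform.

    BYTES_PER_CHANNEL = 2   # 16-bit
    CHANNELS = 3            # RGB

    def buffer_gb(width, height, count=1):
        return width * height * CHANNELS * BYTES_PER_CHANNEL * count / 1024**3

    canvas_w, canvas_h = 10_500, 3_700

    layers  = buffer_gb(canvas_w, canvas_h, count=15)  # 15 source layers
    undo    = buffer_gb(canvas_w, canvas_h, count=10)  # assumed history states
    scratch = buffer_gb(canvas_w, canvas_h, count=4)   # assumed transform copies

    print(f"layers:  {layers:.1f} GB")                    # ~3.3 GB
    print(f"undo:    {undo:.1f} GB")                      # ~2.2 GB
    print(f"scratch: {scratch:.1f} GB")                   # ~0.9 GB
    print(f"total:   {layers + undo + scratch:.1f} GB")   # ~6.3 GB - already over a 6GB box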
An 80MP back only costs $30,000 - as a capital cost for fashion and product photography, that's nothing. 165MB a shot. Some people put these on large format cameras and slide them across the frame, taking 20+ images to recombine later.
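The per-shot figure is easy to sanity-check (assuming a 16-bit raw with one sample per photosite; real vendor raws add metadata and may compress, so the exact number varies):

    # Sanity check on medium-format file sizes.
    # Assumption: raw = one 16-bit sample per photosite, before demosaic.

    MEGAPIXEL = 1_000_000

    raw_mb = 80 * MEGAPIXEL * 2 / 1024**2   # ~153 MB, in line with "165MB a shot"
    print(f"80MP raw:        {raw_mb:.0f} MB")

    # Demosaiced to 16-bit RGB for editing, it triples:
    rgb_mb = 80 * MEGAPIXEL * 3 * 2 / 1024**2
    print(f"80MP 16-bit RGB: {rgb_mb:.0f} MB")              # ~458 MB per open image

    # A 20-shot slide-across mosaic, before overlap is trimmed:
    print(f"20-shot mosaic:  {20 * rgb_mb / 1024:.1f} GB")  # ~8.9 GB of raw pixels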
Film scans are huge, especially when oversampled for archival purposes. Even black-and-white film isn't scanned in black and white - scans are RGB, and may carry a 4th infrared channel (RGB+IR, typically used for dust and scratch detection).
Migrant Mother and the rest of Lange's WPA work? 4x5 (inches). Most of Ansel Adams? 8x10. At a recent camera industry show, one custom large format camera maker described a recent commission to build a 20x24 vacuum back (the vacuum keeps the film from buckling into an irregularly curved imaging plane). When I asked why emulsion-coated glass plates weren't used instead (as they were in astrophotography for decades after they were "obsolete," to maintain critically even field sharpness), he answered: they don't scan well. I can only imagine that guy's headaches.
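For scale, the same arithmetic applied to scans of those formats (the dpi figure is my assumption; archival oversampling often goes higher still):

    # File sizes for 16-bit film scans at an assumed 2400dpi, uncompressed.

    def scan_gb(width_in, height_in, dpi, channels):
        pixels = (width_in * dpi) * (height_in * dpi)
        return pixels * channels * 2 / 1024**3   # 2 bytes per 16-bit channel

    print(f"4x5   @ 2400dpi, RGB:    {scan_gb(4, 5, 2400, 3):.1f} GB")    # ~0.6 GB
    print(f"8x10  @ 2400dpi, RGB+IR: {scan_gb(8, 10, 2400, 4):.1f} GB")   # ~3.4 GB
    print(f"20x24 @ 2400dpi, RGB+IR: {scan_gb(20, 24, 2400, 4):.1f} GB")  # ~20.6 GB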
Imaging is everywhere, and the nominal file sizes are just the start - transforms and layers start multiplying sizes very quickly. Why shouldn't a doctor be able to pull up a 10,000-slice axial scan and compare it not only to that patient's previous 20 scans (say, every three months for 5 years) but to the historical scans of 50 other patients, perhaps scaled and superimposed? It might not be rigorous enough for a study, but back-of-the-envelope experiments lead to the sometimes pivotal "huh, that's funny" moments. Sometimes you need a big envelope.
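How big an envelope? A rough count (the slice dimensions and bit depth are my assumptions - 512x512 at 16 bits is common for CT, but protocols vary):

    # Rough working set for the comparison described above.
    # Assumptions: 512x512 slices, 16-bit, uncompressed, all resident at once.

    SLICE_MB = 512 * 512 * 2 / 1024**2   # 0.5 MB per slice

    scan_gb = 10_000 * SLICE_MB / 1024   # one 10,000-slice study
    history = 20 * scan_gb               # the patient's prior scans
    cohort  = 50 * scan_gb               # one scan each from 50 other patients

    print(f"one study:       {scan_gb:.1f} GB")                     # ~4.9 GB
    print(f"patient history: {history:.0f} GB")                     # ~98 GB
    print(f"cohort:          {cohort:.0f} GB")                      # ~244 GB
    print(f"total:           {scan_gb + history + cohort:.0f} GB")  # ~347 GB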
In turn, imaging is just a specialized case of sampling, and that is present in just about every industry job that requires book learning. Of all people, shouldn't computer industry people (dev and admin alike) be able to think in terms of use cases other than: mom, secretary = 2GB; server = 16GB+? The list above (and I don't mean to pick on you; rather, industry myopia) boils down to the age and size of the data-consuming device, a nod to "high end workstations," and servers. That reads like PC Magazine, c. 1990.

Workstations outstripped the needs of most servers long ago - servers get virtualized onto a reduced number of machines/CPUs, while workstations, once they run out of room for more CPUs/GPUs/ASICs, start getting expanded into clusters, then render farms. I understand how this can be nitpicked, but my point is that intensive end users - those in information creation and manipulation - are not just a massive GFLOPS long tail; for these users, quantity can become a qualitative change that leads to new modes of thinking. It's one thing to run up against a memory limit; it's another to add extra dimensionality to your techniques because "your brain is now bigger." Thing is, when the tool user isn't the tool maker, these unseen limits may not be perceived as such - they don't realize what they need.