Honestly, I'm not sure if you're a troll, or just someone who strongly believes that if you don't do it your way, you're wrong.
I work at a research institution with limited grant funding. We do X-ray research with detectors that output data on the order of 30GB a run, and there can be more than one run a day. Once generated, this data needs to be accessible by compute nodes without hitting the acquisition disk. There isn't reliable downtime between acquisitions, so rsyncs are hard to schedule. We also need to schedule backups, which is easier on central storage, since the acquisition machines move around and aren't always up.
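To spell out why scheduled rsyncs are awkward here: without predictable downtime, you end up writing something event-driven that pushes each run as it finishes instead. A rough sketch of that pattern is below; the DONE-marker convention, the paths, and the stamp file are all made up for illustration, not our actual setup.

```python
#!/usr/bin/env python3
"""Sketch: push completed acquisition runs to central storage as they
finish, rather than trying to find an rsync window. The DONE marker
(written by the detector software when a run completes) and the
.synced stamp are hypothetical conventions."""
import subprocess
from pathlib import Path


def pending_runs(acq_root: Path):
    """Run directories that have finished (DONE marker present) but
    haven't been pushed yet (no .synced stamp)."""
    return sorted(
        d for d in acq_root.iterdir()
        if d.is_dir()
        and (d / "DONE").exists()
        and not (d / ".synced").exists()
    )


def push(run_dir: Path, dest: str):
    """One-shot rsync of a finished run; stamp it only on success."""
    subprocess.run(
        ["rsync", "-a", str(run_dir) + "/", f"{dest}/{run_dir.name}/"],
        check=True,
    )
    (run_dir / ".synced").touch()
```

Loop that from cron every few minutes and completed runs drain off the acquisition disk on their own; but you're still double-handling 30GB per run, which is the argument for central storage in the first place.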
Laptops have trouble carrying around 30TB for analysis, and desktops aren't cost-effective at that storage load. I could also go into the issue of data walking out the door, which may be prohibited or desired, depending on the situation.
On top of the binary research data, there's all the program source, program binaries, infrastructure data, standard office documents, etc.
I'm not sure about a content management system - we have a wiki, which is great; SVN, which is great; and Vault for Inventor source control, which is also great. For office documents, the closest thing I'm aware of is SharePoint, which doesn't seem like anything I want to touch with a 10 ft pole. What else should I be looking at?
And how does it work for users who barely understand "save to this network folder"?