The issue isn't equipment; it's storage and bandwidth.
If you collect a small city's worth of data, you'll have quite a lot of images. Maybe only a terabyte if you're lucky, but probably several terabytes. Now extend that to an entire state/province, or a small country. You'll quickly be racking up terabytes and petabytes of data.
"No problem, storage is cheap," you might be thinking, but storage gets expensive as you increase the demands placed on it. All of this data needs to be available immediately, so it can't sit on cheaper nearline or archival storage. And it must be stored redundantly to survive hardware failure, which means disk-level (e.g. RAID) or system-level data replication.
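To make that concrete, here's a rough back-of-envelope sketch. The replication factor and per-gigabyte rate below are illustrative assumptions, not quotes from any real provider:

```python
def monthly_storage_cost(raw_tb, replication_factor=3, usd_per_gb_month=0.02):
    """Rough monthly cost of keeping raw_tb terabytes 'hot' with n-way replication.

    Assumes an illustrative $0.02/GB-month rate for immediately
    available storage and 3 copies of every byte for redundancy.
    """
    total_gb = raw_tb * 1024 * replication_factor
    return total_gb * usd_per_gb_month

# One petabyte of imagery (1024 TB), triple-replicated:
print(round(monthly_storage_cost(1024), 2))  # roughly $62,914.56 per month
```

At these assumed rates, a single petabyte runs into the mid five figures per month, i.e. well into the hundreds of thousands per year, before you've served a single byte to a user.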
And now that you've stored the data, you need to serve it to users. Pushing out a small amount of data to one user isn't a problem; 2 cents a gigabyte seems cheap. But if you need to serve a whole country's worth of data to tens or hundreds of thousands of users, you hit bandwidth issues: bandwidth caps and overage costs. Getting a larger pipe to the user costs more money, and deals that seemed reasonable start to become very expensive very quickly.
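The egress side scales the same way. Taking the 2-cents-a-gigabyte figure above at face value, and assuming a hypothetical usage pattern (the user count and per-user traffic are made up for illustration):

```python
def monthly_egress_cost(users, gb_per_user_month, usd_per_gb=0.02):
    """Bandwidth cost at a flat per-gigabyte egress rate.

    The $0.02/GB rate is the figure from the text; user counts and
    per-user traffic are illustrative assumptions.
    """
    return users * gb_per_user_month * usd_per_gb

# 100,000 users each pulling 50 GB of imagery tiles a month:
print(round(monthly_egress_cost(100_000, 50), 2))  # roughly $100,000 per month
```

The "cheap" per-gigabyte rate multiplies out to six figures a month once the audience is large, which is exactly where the overage and peering negotiations start.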
You'd quickly start talking about needing to spend hundreds of thousands of dollars just to store the data, and then hundreds of thousands (or more) to serve it out.
Commercial organizations aren't going to put money toward something like this when they don't have to, and passing those costs on to users would make the burden on them incredibly high.