Remote Data Access Solutions?
magoldfish asks: "Our company has several terabytes of data, typically chunked into 2-6 GB files, that we need to share among users at sites around the US. Our users range in technical skill from novice to guru, data access needs to be secure, and we require client access from multiple platforms (Linux, Mac, and Windows). These files will likely be accessed infrequently, and cached on local hard drives by our power users. We've been running WebDAV, primarily to accommodate non-savvy users and guarantee access from firewalled environments, but the users are really complaining about download speed — maybe 50 KB/sec serving from a Mac. Any suggestions for better alternatives, or comments on the pros and cons of alternative access techniques?"
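To put the poster's numbers in perspective, a quick back-of-envelope calculation (assuming 1 GB = 1024^2 KB, and that the reported ~50 KB/s is sustained) shows why users are complaining:

```python
# Rough transfer times for the poster's 2-6 GB files at ~50 KB/s.
# Assumes 1 GB = 1024**2 KB; real-world throughput will vary.
rate_kb_per_s = 50

for size_gb in (2, 6):
    seconds = size_gb * 1024**2 / rate_kb_per_s
    print(f"{size_gb} GB at {rate_kb_per_s} KB/s: {seconds / 3600:.1f} hours")
```

Roughly half a day per 2 GB file, and a day and a half for a 6 GB one, before any retries or stalls.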
Infrequent access = Send out dvds (Score:3, Insightful)
How will the data be used? (Score:4, Insightful)
Depends on the situation (Score:3, Insightful)
Or, if you need it spread out for some reason, iFolder or rsync seem the best choices. However, you could also look at AFS.
Basically, you have to get the long-haul data transfers down somehow, or else get faster connections.
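The reason rsync comes up here is its delta-transfer idea: hash fixed-size blocks of both copies and send only the blocks that differ. Here's a minimal illustration of that principle; the block size and file contents are made up, and real rsync uses rolling checksums and much larger blocks:

```python
# Toy illustration of delta transfer: hash fixed-size blocks of the old
# and new copies, and only the blocks whose hashes differ need to be sent.
import hashlib

BLOCK = 4  # bytes, for demonstration; real tools use far larger blocks

def block_hashes(data: bytes):
    """Hash each fixed-size block of the data."""
    return [hashlib.sha1(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

old = b"aaaabbbbccccdddd"
new = b"aaaaXXXXccccdddd"  # only the second block changed

changed = [i for i, (a, b) in enumerate(zip(block_hashes(old),
                                            block_hashes(new))) if a != b]
print(changed)  # only block 1 has to cross the wire
```

For multi-gigabyte files that change incrementally, this can cut a transfer from hours to minutes.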
More of a question (Score:3, Insightful)
But how often do these files need to be updated? Is the end user in a read-only situation? How infrequent is infrequent? How many users are you talking about, and what's the density of these users? Even though the access is "infrequent," is this access something that modifies data that would have to be shared across your entire user base?
Your scenario needs some gaps filled in as far as the requirements go. I see a lot of people suggesting large-capacity media being shipped to the users, but if this data is updated frequently, that is not a solution. If you have a large number of users and the data is not updated often, you would still have to weigh the cost of sending updates to X number of users against the cost of keeping the data centralized and upgrading the infrastructure to meet those users' needs. If changes to this data need to be shared from the user end, then physical media via postal services is going to cause problems in coordinating which version of these files the other end users should be working from. And God forbid several users need to update this data in a short timeframe: you'll have disks being mailed in by users who don't know what changes the others have made. You'd doubtless run into some kind of data-integrity problem, and even if you're talking about users updating their data only every few months, you're still going to need to hire someone to coordinate these efforts and ensure that all the end users get every update in a timely fashion.
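That mail-vs-host trade-off can be sanity-checked with a quick break-even sketch. Every figure below is a placeholder assumption for illustration, not real pricing; plug in your own numbers:

```python
# Rough break-even sketch: mailing media to users vs. serving updates
# centrally. ALL numbers are assumptions made up for illustration.
users = 50
updates_per_year = 12
cost_per_mailing = 5.00          # disc + postage per user, assumed
coordinator_salary = 40_000.00   # someone to track versions, assumed

mailing_cost = users * updates_per_year * cost_per_mailing + coordinator_salary
hosting_cost = 12 * 500.00       # assumed monthly bandwidth/server bill

print(f"mailing: ${mailing_cost:,.0f}/yr vs hosting: ${hosting_cost:,.0f}/yr")
```

With these made-up inputs the coordination overhead dominates, which is the point: the media themselves are cheap, the human in the loop isn't.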
Without more information it's hard to suggest something and still be confident that we're not leading you toward a solution that is completely inappropriate.
Your chunking appears to be a problem... (Score:3, Insightful)
Figure out how the data can be broken into smaller chunks and managed; that will probably indicate what sort of technology will work for you.
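If the 2-6 GB files can be split mechanically, even something as simple as chunking them into fixed-size pieces makes resumable, parallel, or partial downloads possible. A minimal sketch, where the chunk size and naming scheme are arbitrary choices of mine:

```python
# Minimal sketch of splitting one large file into smaller, independently
# transferable pieces. Chunk size and part naming are arbitrary.
from pathlib import Path

CHUNK = 1024 * 1024  # 1 MB per piece; tune to your network conditions

def split_file(src: Path, out_dir: Path) -> None:
    """Write src as numbered CHUNK-sized part files in out_dir."""
    out_dir.mkdir(exist_ok=True)
    with src.open("rb") as f:
        for n, chunk in enumerate(iter(lambda: f.read(CHUNK), b"")):
            (out_dir / f"{src.name}.{n:04d}").write_bytes(chunk)
```

Reassembly is just concatenating the parts in order, so even novice users behind firewalls can fetch pieces over plain HTTP/WebDAV and retry only what failed.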
A really good point! (Score:3, Insightful)
While the answer may reside with any of the main themes recommended by responders (improving transfer speed, reducing the amount of data to transfer, and eliminating the need to transfer via remote-desktop solutions such as Citrix, MS Terminal Services, and VNC), the questioner really needs to define his needs. Does the data really need to be local at each site? If so, does each site really need such large chunks? What OS platforms are in use? How important are privacy, data integrity, and "chain of custody"? Answering about ten basic questions can shed enough light on the problem to determine which of the recommended solutions (if any) make sense.
It's tough to get a good answer without asking good questions. On the other hand, when you ask the right question, the answer is generally obvious...