Sounds like a fairly simple case for a Hadoop cluster - a smallish one at that. We're currently deploying clusters at 1PB/rack density, which means you could deploy a rack or two easily enough. You'd get compute, a single flat filesystem, and redundancy, all built in. Our biggest cluster is now up to 16PB, all one big compute/storage beast, chugging away all day.
I'd suggest starting with the Hortonworks Sandbox VM - grab it, fire it up, play with it. Add some files, poke around, see if it meets your needs. Learn about MapReduce, or maybe your data can be loaded into Hive for analysis.
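If you want to poke at it from the command line once the Sandbox is up, the basic flow looks something like this - paths, the filename, and the `logs` table name here are just placeholders, not anything specific to your setup:

```shell
# Copy a local file into HDFS - the single flat filesystem mentioned above
hdfs dfs -mkdir -p /user/me/data
hdfs dfs -put ./somefile.csv /user/me/data/

# Confirm it landed; replication is handled for you behind the scenes
hdfs dfs -ls /user/me/data

# Then point an external Hive table at that directory and query it.
# EXTERNAL means Hive reads the files in place without taking ownership.
hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS logs (line STRING) LOCATION '/user/me/data';
         SELECT COUNT(*) FROM logs;"
```

That last query fans out as a job across the cluster, which is the whole point - same commands whether it's one Sandbox VM or a rack of machines.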
The nice thing is that you can use hardware you may already have to get things going. Hortonworks is pretty much at the point of a 'next next finish' installer, so you really only need to dedicate a few hours to getting something up to test. There's a lot of tuning and craziness involved in running a bigger cluster, but a POC is simple.
Anyhow, I'm biased, because all I do is Hadoop clusters all day, but this seems like an easy win for ya.