CrashPlan could help you a lot. First, CrashPlan is a backup system: it makes and manages a copy of your data, including every version of every file. CrashPlan addresses the bitrot problem on their side by running their own checksums on the stored files: if they detect an issue with a stored file, they replace it with the original version, still held on their servers. If some files get corrupted on your computer, you can restore them from CrashPlan, but you will need something on your side to tell you that something went wrong. Even if you only realize a file is corrupted years after it happened, you can still recover the previous, non-corrupted version from CrashPlan.
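For that "something on your side", one common approach is to keep your own checksum manifest and re-verify it periodically; any mismatch on a file you haven't deliberately changed is a sign of silent corruption. Here's a minimal sketch (the function names and the JSON manifest format are my own, not anything CrashPlan provides):

```python
import hashlib
import json
import os

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root, manifest_path):
    """Record a checksum for every file under root (run once, after a known-good state)."""
    manifest = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            manifest[os.path.relpath(path, root)] = sha256_of(path)
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2)

def verify_manifest(root, manifest_path):
    """Return files whose current checksum no longer matches the recorded one."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    changed = []
    for rel_path, expected in manifest.items():
        path = os.path.join(root, rel_path)
        if not os.path.exists(path) or sha256_of(path) != expected:
            changed.append(rel_path)
    return sorted(changed)
```

Keep the manifest outside the tree it describes (otherwise it flags itself), and when `verify_manifest` reports a file you haven't touched, that's your cue to pull the older version back from CrashPlan.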
Now, 2TB is a bit much to store in CrashPlan's cloud: unless you have a very fast connection (at least 100 Mbit/s), uploading your data is going to take a while. The solution is to run your own CrashPlan PRO Enterprise server on-site (with periodic off-site backups, of course). Don't be fooled by the name: it's pretty easy to set up and administer, and the licenses are fairly affordable ($75/user/year).
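If you want to sanity-check that for your own connection, the arithmetic is just data size over effective throughput. A back-of-the-envelope sketch (the 80% efficiency factor is my own assumption for protocol overhead and a link that isn't idle):

```python
def upload_days(data_tb, link_mbit_per_s, efficiency=0.8):
    """Rough upload-time estimate in days: data size over effective link throughput."""
    bits = data_tb * 1e12 * 8                       # decimal TB -> bits
    bits_per_s = link_mbit_per_s * 1e6 * efficiency  # usable throughput
    return bits / bits_per_s / 86400                 # seconds -> days

# 2 TB over a 100 Mbit/s uplink at ~80% efficiency: a bit over 2 days.
# The same 2 TB over a typical 10 Mbit/s uplink: more than 3 weeks.
```

That's why the initial seed of a multi-terabyte backup is painful over a residential uplink, and why an on-site server makes the first backup (and any large restore) so much faster.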
I've been supporting CrashPlan PRO Enterprise in my company for 3 years, with 25 clients and about 1TB of data. While I'm not super happy with the way the Code42 people run their CrashPlan business, the tech is solid. I suspect other backup systems work in similar ways.
Now, I hope you'll excuse me for asking, but what kind of crappy file systems and hard drives are you using that generate significant levels of "bitrot" in files that are basically just sitting there?