Does anyone know if it aims to encrypt all your files at once, or spreads the encryption over a longer period to increase the chance of poisoning backups?
If the former, one mitigation might be to check file types on the backup side. Assuming you back up to a different platform, such as Linux, check that each file really is what it claims to be - is a JPEG really a JPEG? Can you still read plain text files? As soon as you find one you can't, flag it for investigation. Perhaps keep a number of canary files: pull those first on each run and compare them to known-good copies stored on a non-shared filesystem on the backup machine, halting the backup if a canary has changed in any way. It'd be a pain to set up, but once scripted it would all be automatic.
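A rough sketch of what that script might look like - the function names, the canary layout, and the idea of using SHA-256 plus a JPEG magic-byte check are all my assumptions, not a known tool:

```python
import hashlib
import os

def sha256(path):
    """Hash a file so canaries can be compared against known-good copies."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def looks_like_jpeg(path):
    """Cheap sanity check: a real JPEG starts with the bytes FF D8 FF.
    An encrypted blob renamed .jpg will almost certainly fail this."""
    with open(path, "rb") as f:
        return f.read(3) == b"\xff\xd8\xff"

def canaries_ok(canary_paths, known_good_dir):
    """Compare each pulled canary to its known-good copy (stored on a
    non-shared filesystem). Return False to halt the backup if any differ."""
    for p in canary_paths:
        good = os.path.join(known_good_dir, os.path.basename(p))
        if sha256(p) != sha256(good):
            return False
    return True
```

You'd run `canaries_ok` first on each backup run and only proceed if it returns True, then spot-check file types on the rest.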
Question for cryptography gurus - does having a known good copy of a file increase the feasibility of decrypting? I.e. file A is encrypted, and you have an unencrypted copy of it on read-only media. Does that increase the chance of finding the keys used to encrypt A, and thus enable you to decrypt other files for which you don't have good copies? Probably not, but thought I'd ask. Apologies if it's a stupid question before I get the piss ripped out of me ;)