People have been trying to detect gravitational waves (GWs) for roughly half a century, gradually making instruments ever more sensitive as nothing was found. Was the waves' magnitude so uncertain that nobody knew how sensitive a detector had to be to detect them?
If detection was nearly a guaranteed result, as you implied, then why the huge uncertainty over the sensitivity needed? Or did the early attempts merely hope the models were wrong, trying for a detection even though the technology of the day couldn't reach the (faint) magnitudes the models suggested?
For example, why build a detector that is only sensitive to waves of 100 units or larger if the models say the actual waves should be only 2 units in size? You wouldn't build the 100-unit detector unless you had a decent reason to believe the models could be wrong. But I've never seen that assumption stated in the write-ups published over the years these detectors have been built.