Man, I haven't even mentioned numerical computing. It is an exception, because that's where programming serves the math, not the other way around.
In numerical computing, ironically, there is very little overlap between the math methods used in the development and the math methods that are the goal of the development. Programs there often look more like a math formula. Developers simply skip "computer science" as a whole and use computers (with the help of specialized libraries) as almost pure calculators.
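To illustrate what I mean by "the program looks like the formula": here's a minimal sketch (plain Python, no libraries, function names are my own) of the trapezoidal rule, where the code is basically a transliteration of the textbook approximation:

```python
# Trapezoidal rule: integral of f over [a, b] with n panels,
#   h * (f(a)/2 + f(a+h) + ... + f(b-h) + f(b)/2),  where h = (b - a) / n
def trapezoid(f, a, b, n):
    h = (b - a) / n
    total = (f(a) + f(b)) / 2          # endpoint terms, each weighted 1/2
    for i in range(1, n):
        total += f(a + i * h)          # interior points, weighted 1
    return total * h

# Integrate x^2 over [0, 1]; the exact answer is 1/3.
approx = trapezoid(lambda x: x * x, 0.0, 1.0, 1000)
```

No data structures, no architecture, no "computer science" to speak of: the machine is used as a calculator that happens to loop.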
As an exception, it simply obscures the subject of the discussion.
The talk here is about what precisely from math is used in general software development. (IOW, math serves the programming.) My personal experience, having majored in applied math 15 years ago, is that by studying math one learns methods for approaching real-world problems. "Learns" is a weak word. The methods are implanted, grafted (or even branded) onto the brain. A normal person's brain freezes when faced with thousands of pages of specification. A person with a math background has already switched into "divide and conquer" mode, and has probably already dismissed >90% of it as trivial, incremental, and derivative. (Some people learn this on their own. But studying math is definitely a nice shortcut to get there faster and earlier.) But the math itself, whether discrete/algebra or analysis or numerical, is very, very rarely needed.