Ugh.
Look, in mathematics, dividing by zero makes no sense, and that's fine. But I'm not working in the realm of pure math; I'm doing actual work in a little universe of my own construction here. I don't need the value to be some sort of infinity. The value is undefined on our computers because we say so.
I'm a games programmer, and there are plenty of situations where we can accidentally end up with 0 in the denominator, and treating the result as zero would have no bad effects at all. In most cases, zero is actually the expected result. That is, I end up writing ternaries like:
float foo = x != 0 ? y / x : 0;
foo will be zero if x is zero; otherwise it's some fraction. Why is x zero? I don't know, maybe it was a countdown timer, or I'm just trying to find some fractional interpolation along some curve or something. Infinity (or NaN, when y is also zero) is a MEANINGLESS answer. Zero is the only answer I want out of this expression when x is zero.
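Since I write that ternary constantly, it usually ends up pulled into a helper. A minimal sketch of what I mean (the name `safe_div` is just my own, not anything standard):

```c
/* Returns y / x, except that a zero denominator yields 0
 * instead of infinity or NaN. The exact pattern from the
 * ternary above, just given a name. */
float safe_div(float y, float x) {
    return x != 0.0f ? y / x : 0.0f;
}
```

Then `safe_div(timeLeft, totalTime)` just reads as "the fraction, or zero", with no branch cluttering the call site.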
Your understanding of math is correct for our normal everyday situation, but in broad strokes, you're wrong. I can easily define a new mathematical system where division by 0 is both allowed and defined. Algebra is very flexible this way. That's why when you add 25 hours to 2pm, you end up at 3pm the next day, and not at 27pm. We've defined math in this clock context to wrap around; there's no such thing as 27 o'clock. In this algebraic system, 2 + 25 != 27.
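That clock algebra is just modular arithmetic, and it's trivially expressible in code. A quick sketch (the `clock_add` name is mine, working in 0..23 hours):

```c
/* Hours live in Z mod 24, so adding hours wraps around
 * instead of running off to "27 o'clock". The double-mod
 * keeps the result in 0..23 even for negative deltas. */
int clock_add(int hour, int delta) {
    return ((hour + delta) % 24 + 24) % 24;
}
```

With this, `clock_add(14, 25)` gives 15, i.e. 2pm plus 25 hours is 3pm the next day, exactly the wraparound described above.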
So what this programmer is asking isn't whether dividing by zero is correct in the purest mathematical sense; they're asking whether it makes sense to have an alternate system where division by zero just gives you zero, since zero is almost always the answer that they (and that I) want.