The problem is not that you have a wrong idea of programming, but that you have a wrong idea of mathematics. Most people are only ever educated in a tiny, tiny, tiny fraction of mathematics and subconsciously think that something like the quadratic equation is the height of mathematical expression complexity.
However, nothing stops a mathematician from examining the consequences of much larger systems, such as might look like a program, and in fact mathematicians do. There's research into large numbers, research into large proof systems that make most programs look like small beans (go read the Principia Mathematica... or rather, go skim the Principia Mathematica and run screaming), research into all sorts of things that are definitely math but look a lot more like a program than you'd think if you've only been educated with conventional primary-school mathematics, or even if you've gotten a bachelor's degree in computer science, which does not generally cover the Curry-Howard isomorphism. (I didn't even get it in my master's program; I had to learn it myself.)
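For the curious: the Curry-Howard isomorphism says that types are propositions and (total) programs are proofs. A minimal sketch in Python type hints, where the function name and encoding are my own illustration, not any standard API:

```python
from typing import Tuple, TypeVar

A = TypeVar("A")
B = TypeVar("B")

# Under Curry-Howard, a pair type (A, B) corresponds to the conjunction
# "A and B", and a function type X -> Y corresponds to "X implies Y".
# Any total function with this signature is therefore a *proof* of the
# proposition "A and B implies B and A".
def and_commutes(pair: Tuple[A, B]) -> Tuple[B, A]:
    a, b = pair
    return (b, a)
```

Proof assistants like Agda or Coq take this literally: type-checking the program *is* checking the proof.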
There is no feasible way of drawing a distinction between mathematics and programs. You might be able to draw a legal one, but as always happens when you try to introduce colors to bits, the coloring just won't stand up to the sort of sandblaster scrutiny that will be applied by the plaintiffs and defendants.
You might observe that few programmers appear to be thinking mathematically when they program, to which I'd follow up with an observation that no, no they don't, and boy does it show! But note carefully that's a characteristic of the programmer, not the program. The programmer may not understand math, and may crank out a mathematical system of breathtaking worthlessness, with few or none of the properties the programmer would have found desirable (like "actually doing what it's supposed to do"), but it is not written into the definition of mathematics that something is only mathematics if it is "useful" or "good". It's still all just a term-rewriting system when you get down to it, and any attempt to draw a barrier between "term-rewriting system" and "real-world program" is simply doomed to failure.
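To make "it's all a term-rewriting system" concrete, here's a toy rewriter for Peano-style addition, using the two classic rules add(0, y) → y and add(s(x), y) → s(add(x, y)). The tuple encoding is my own sketch:

```python
# Terms: ("0",) is zero, ("s", t) is the successor of t, and
# ("add", t1, t2) is addition. Repeatedly applying the two rewrite
# rules normalizes any addition term down to a plain numeral --
# the entire "computation" is nothing but rewriting.

def normalize(t):
    if t[0] == "add":
        x, y = normalize(t[1]), normalize(t[2])
        if x == ("0",):        # rule: add(0, y) -> y
            return y
        if x[0] == "s":        # rule: add(s(x), y) -> s(add(x, y))
            return ("s", normalize(("add", x[1], y)))
    if t[0] == "s":
        return ("s", normalize(t[1]))
    return t

two = ("s", ("s", ("0",)))
four = ("s", ("s", ("s", ("s", ("0",)))))
```

Whether you call `normalize` an evaluator or a rewriting relation is exactly the distinction that won't hold up.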
It's all just assembler, and all assembler can do is basic arithmetic, moving numbers around in memory in a manner very easy to characterize mathematically, and some very mathematical conditionals. You might be able to fool yourself into thinking you've somehow transcended these primitives into a "non-math" domain... but you haven't.
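That claim is easy to cash out: here is a toy "assembler" whose execution is nothing but arithmetic on a (registers, program counter) state. The instruction set and program are my own invented sketch, not any real ISA:

```python
def run(program):
    """Interpret a tiny assembler. The machine state is just
    (registers, program counter); every instruction is plain
    arithmetic on that state, nothing more."""
    regs, pc = {}, 0
    while program[pc][0] != "halt":
        op = program[pc]
        if op[0] == "mov":        # mov r, imm  : regs[r] = imm
            regs[op[1]] = op[2]
            pc += 1
        elif op[0] == "add":      # add r, s    : regs[r] += regs[s]
            regs[op[1]] += regs[op[2]]
            pc += 1
        elif op[0] == "addi":     # addi r, imm : regs[r] += imm
            regs[op[1]] += op[2]
            pc += 1
        elif op[0] == "jnz":      # jnz r, t    : jump to t if regs[r] != 0
            pc = op[2] if regs[op[1]] != 0 else pc + 1
    return regs

# 5 * 3 by repeated addition -- the "very mathematical conditional" is jnz.
program = [
    ("mov", "acc", 0),       # 0
    ("mov", "i", 3),         # 1
    ("mov", "five", 5),      # 2
    ("add", "acc", "five"),  # 3
    ("addi", "i", -1),       # 4
    ("jnz", "i", 3),         # 5
    ("halt",),               # 6
]
```

Each step is a function from state to state; composing those functions is the whole program, and composition of functions is about as mathematical as it gets.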