So "sin^2 x*y" means "(sin(x*y))^2"; that's a pretty strong convention. If you meant "(sin(x^2))*y", you'd write it as "y sin x^2". If you meant "sin(sin(x))*y", you'd write it as "y sin(sin(x))". If you meant "sin(sin(x*y))", you'd write it as exactly that, or as "sin(sin x*y)". Sure, it's a bit odd to people unfamiliar with mathematics, but those people are never going to do mathematics anyway.
But could you code this convention into a compiler, e.g. by formally specifying precedence for trigonometric operators? It would have to account for the most common interpretations:
sin x*y = sin(x*y)
sin x * sin y = sin(x)*sin(y)
sin^2 x = sin(x)^2
sin^(-1) x = arcsin(x)
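Just to make the pain concrete, here is a toy evaluator (a sketch for illustration, not any real compiler; `tokenize`, `evaluate`, and the single-letter-variable environment are all invented here) that implements exactly the four rules above. Note the special-case lookahead needed to keep "sin x * sin y" from parsing as "sin(x * sin y)":

```python
import math

def tokenize(s):
    """Split e.g. 'sin^(-1) x * y' into tokens: 'sin', variables, '*^()', signed ints."""
    tokens, i = [], 0
    while i < len(s):
        c = s[i]
        if c.isspace():
            i += 1
        elif s.startswith('sin', i):
            tokens.append('sin')
            i += 3
        elif c.isalpha():              # single-letter variable
            tokens.append(c)
            i += 1
        elif c in '*^()':
            tokens.append(c)
            i += 1
        elif c == '-' or c.isdigit():  # signed integer literal (for exponents)
            j = i + 1
            while j < len(s) and s[j].isdigit():
                j += 1
            tokens.append(s[i:j])
            i = j
        else:
            raise SyntaxError('unexpected character: %r' % c)
    return tokens

def evaluate(src, env):
    toks = tokenize(src)
    pos = 0

    def peek(k=0):
        return toks[pos + k] if pos + k < len(toks) else None

    def eat():
        nonlocal pos
        pos += 1
        return toks[pos - 1]

    def product():                 # expr := item ('*' item)*
        v = item()
        while peek() == '*':
            eat()
            v *= item()
        return v

    def item():
        return sin_call() if peek() == 'sin' else atom()

    def atom():                    # variable, or parenthesized expression
        tok = eat()
        if tok == '(':
            v = product()
            eat()                  # closing ')'
            return v
        return env[tok]

    def exponent():                # '2', or parenthesized like '(-1)'
        if peek() == '(':
            eat()
            n = int(eat())
            eat()
            return n
        return int(eat())

    def sin_call():
        eat()                      # 'sin'
        exp = None
        if peek() == '^':
            eat()
            exp = exponent()
        arg = atom()
        # The hack: keep swallowing '* factor' into sin's argument,
        # but stop as soon as the next factor is another sin.
        while peek() == '*' and peek(1) != 'sin':
            eat()
            arg *= atom()
        if exp == -1:              # sin^(-1) means arcsin, not 1/sin
            return math.asin(arg)
        v = math.sin(arg)
        return v ** exp if exp is not None else v

    return product()
```

Even these four rules demand a one-token peek-ahead, and "^(-1)" has to be special-cased so it isn't read as a reciprocal; every further operator would pile on more such exceptions.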
Now continue this for log, lim, summation notation, and all the other clever ways mathematicians have devised to avoid writing parentheses. It quickly becomes clear that this is an arduous, error-prone task, and that the time would be better spent using proper notation from the beginning.
The people who drive a language's evolution are the people who use it.
Yes, and the OP said that this language is ill-suited for writing computer programs, and hence predicted that it will see declining use in the computing world.