Yes, but. I used to declare variables overly large as a kludge when error-trapping was eating up too much time and I knew the compiler wasn't good with overflows. I'd do input error checking up to the point where it started to take too long, then declare a variable larger than any reasonable input would be, and then trap and reject anything whose length fell between reasonable input and the declared variable size. Declaring a variable just larger than the input buffer was one specific way to address attempts to force overflows through buffer overruns. Yes, it was a horrible kludge and wouldn't survive any sort of dedicated attack, but it did deter casual probes looking for exploitable boundary-condition errors.
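Something along these lines, as a minimal C sketch. The limits (64 "reasonable" chars, 256-byte buffer) and the function name are just placeholders, not what I actually shipped:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical limits: "reasonable" input is at most 64 chars, but the
 * buffer is deliberately declared much larger as slack against overruns. */
#define REASONABLE_LEN 64
#define BUF_LEN        256   /* oversized on purpose */

int read_field(char *out, size_t out_size)
{
    char buf[BUF_LEN];

    /* fgets bounds the read to the oversized buffer, so an overlong
     * line can't run past it. */
    if (fgets(buf, sizeof buf, stdin) == NULL)
        return -1;

    size_t len = strcspn(buf, "\n");
    buf[len] = '\0';

    /* Anything longer than "reasonable" (but still inside the oversized
     * buffer) is treated as a probe, not legitimate input: reject it. */
    if (len > REASONABLE_LEN || len + 1 > out_size)
        return -1;

    memcpy(out, buf, len + 1);
    return 0;
}

int main(void)
{
    char field[REASONABLE_LEN + 1];

    if (read_field(field, sizeof field) == 0)
        printf("accepted: %s\n", field);
    else
        printf("rejected\n");
    return 0;
}
```

The whole trick is just the gap between REASONABLE_LEN and BUF_LEN: legitimate input never lands there, so anything that does gets thrown out before it can reach the real processing code.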
Of course the better answer is to not use an OS and compiler that suck so badly that the basic I/O buffers and basic overflows are exploitable, but sometimes you gotta use what you have.