If you keep increasing the voltage, you can likely hit higher frequencies, but dynamic power scales with the square of the voltage and linearly with frequency (P ≈ C·V²·f), so power climbs quickly. However, in today's advanced process nodes the interconnect, rather than the transistors themselves, is increasingly the limiting factor in frequency scaling, in which case raising the voltage only helps up to a certain point.
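A quick sketch of why power "goes up pretty quickly": the V²·f relation means a modest bump in both voltage and frequency compounds. The capacitance and operating-point numbers below are made-up round figures for illustration, not data from any real chip.

```python
# Dynamic switching power: P = C_eff * V^2 * f.
# C_EFF and the operating points are assumed, illustrative values.

def dynamic_power(c_eff, voltage, freq_hz):
    """Return dynamic power in watts given effective capacitance (F),
    supply voltage (V), and switching frequency (Hz)."""
    return c_eff * voltage**2 * freq_hz

C_EFF = 1e-9                                # assumed effective capacitance
base = dynamic_power(C_EFF, 1.0, 3e9)       # 1.0 V at 3 GHz
boosted = dynamic_power(C_EFF, 1.2, 3.6e9)  # +20% voltage, +20% frequency

# 1.2^2 * 1.2 = 1.728: roughly 73% more power for 20% more frequency.
print(f"{boosted / base:.3f}")  # 1.728
```

The cubic-ish growth (V typically has to rise along with f) is why overclocking headroom runs out so fast on the power and cooling side.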
The trade-off the company selling the CPUs makes is between the cost of cooling, the reliability and lifetime of the device (higher voltage wears the transistors out more quickly, and high temperatures accelerate this process), and yield.
"The fundamental principle of science, the definition almost, is this: the sole test of the validity of any idea is experiment." -- Richard P. Feynman