Bullshit. VLSI code is almost always verified by finite models, and many processors are verified down to the level of mathematical axiom.
Bullshit on your bullshit: no it's not, and no they're not, not even close. Hardware companies have a fetish for formally verifying floating-point stacks because, 20 years ago, the fickle and vacuous mainstream press decided one particular erratum in one particular processor -- the Pentium FDIV bug from 1994 -- was important for some reason, even though every processor ever made and used has errata. AMD took advantage of Intel's bad publicity by formally verifying their own FDIV instruction -- JUST the FDIV instruction, mind you -- and after that, formally verifying floating-point stacks became something of a thing. There's nothing more going on than that.
Take a look here, in the section "Errata": http://download.intel.com/desi...
Doesn't look like they "proved" that VLSI very well to me, although they doubtless subjected it to a fuckton of simulation hours. Which is what they should be doing: theorem-proving software or silicon is usually a ton of effort for little gain, and simulation hours cost much less than developer time. Our processors would likely be at 486 level today if the designers had to prove everything correct. If that.
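For the curious, here's a minimal sketch in Python of what "verify by brute-force simulation" even means at toy scale: exhaustively run a hardware-style restoring divider over every possible input at a small bit width and compare it against Python's built-in division. The divider model and all the names in it are mine, purely for illustration -- real VLSI simulation hammers the actual RTL, not a Python model.

# Toy "verification by exhaustive simulation": brute-force a restoring
# divider over every input pair at a small bit width and check it against
# Python's divmod. Everything here is illustrative, not a real flow.

WIDTH = 8  # small enough that exhaustive simulation finishes instantly

def restoring_divide(dividend: int, divisor: int, width: int = WIDTH):
    """Bit-serial restoring division, the classic hardware algorithm."""
    remainder = 0
    quotient = 0
    for i in reversed(range(width)):
        # Shift the next dividend bit into the partial remainder.
        remainder = (remainder << 1) | ((dividend >> i) & 1)
        # Trial subtraction: keep it only if the remainder stays non-negative.
        if remainder >= divisor:
            remainder -= divisor
            quotient |= 1 << i
    return quotient, remainder

# Exhaustive "simulation": every dividend against every nonzero divisor.
for dividend in range(2 ** WIDTH):
    for divisor in range(1, 2 ** WIDTH):
        q, r = restoring_divide(dividend, divisor)
        assert (q, r) == divmod(dividend, divisor), (dividend, divisor)

print(f"all {2 ** WIDTH * (2 ** WIDTH - 1)} cases pass")

At 8 bits that's 65,280 cases. A real 64-bit FP divider has on the order of 2^128 input pairs, which is exactly why nobody exhausts the space, why real verification is sampled simulation hours, and why FDIV after 1994 got the actual-mechanized-proof treatment instead.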
Provably correct software exists in small amounts, and its emergence is inevitable.
Said the formal verification researchers, for 30 years or so now.