It's being rendered at around that rate as well.
The VR software already includes some ability to shift an already-rendered frame based on late head-tracking data; the same approach could probably be used to compensate for eye motion. I'm not sure how far an eye really moves in 1/60th of a second, though, and eye motion also has a micro-stutter that is probably fairly unpredictable. Gross motor movement, by contrast, takes a while to start and stop, so the viewport for the next frame can generally be predicted with reasonable accuracy.
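The frame-shift idea can be sketched very roughly: treat the rendered frame as an image and translate it by the small orientation change measured between render time and display time. This is a crude 2-D approximation of what VR "timewarp" reprojection does (the real thing rewarps in 3-D on the GPU); the function name and the pixels-per-degree figure here are made up for illustration.

```python
import numpy as np

def reproject(frame, yaw_delta_deg, pitch_delta_deg, px_per_deg=20.0):
    """Shift an already-rendered frame to compensate for a small
    orientation change measured after rendering.

    frame: H x W (or H x W x C) numpy array.
    yaw_delta_deg / pitch_delta_deg: degrees the view rotated since
        the frame was rendered.
    px_per_deg: display density in pixels per degree (hypothetical).
    """
    dx = int(round(yaw_delta_deg * px_per_deg))    # horizontal shift in pixels
    dy = int(round(pitch_delta_deg * px_per_deg))  # vertical shift in pixels
    shifted = np.roll(frame, shift=(dy, dx), axis=(0, 1))
    # np.roll wraps pixels around from the opposite edge; those pixels
    # are invalid for reprojection, so blank them out.
    if dy > 0:
        shifted[:dy] = 0
    elif dy < 0:
        shifted[dy:] = 0
    if dx > 0:
        shifted[:, :dx] = 0
    elif dx < 0:
        shifted[:, dx:] = 0
    return shifted
```

For small head (or gaze) deltas the shift is a few pixels and the blanked border is invisible inside the lens distortion region; for large deltas this breaks down, which is why predictability of the motion matters.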