As an example, suppose you have the same game running on two machines, one at 60Hz and the other at 1Hz. In both cases you press the button at exactly the same time. The game update processes that button press and starts a muzzle flash, which takes some period of time that we will assume is equal for both machines (i.e. I won't make the slow-rendering machine also have a slow game update, even if typically the two are tightly coupled). So 1/nth of a second after the button press both machines are ready to show the muzzle flash. On the 60Hz machine you will see the muzzle flash 16.6 milliseconds later. On the 1Hz machine the muzzle flash will appear 1 second later.
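The arithmetic behind that comparison can be sketched in a few lines; this is a minimal illustration of the idea (the function name and the assumption of one full frame interval of wait are mine, not from the original):

```python
def display_latency_ms(refresh_hz, frames_until_visible=1):
    """Worst-case delay between the game being ready to show a result
    and that result appearing on screen: the display must wait for the
    next refresh, i.e. one full frame interval at the given rate."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * frames_until_visible

print(display_latency_ms(60))  # roughly 16.7 ms per frame at 60Hz
print(display_latency_ms(1))   # a full 1000 ms at 1Hz
```

In practice pipelined renderers often buffer more than one frame, which is what the `frames_until_visible` parameter stands in for; the gap between the two machines only widens as that number grows.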
Now, my example is a bit extreme (to make it obvious that there is a difference). Do not think that this is irrelevant in real-world cases. I worked on one of the first FPS games to win awards for jump puzzles that were not atrocious. Early on we spent a lot of time testing the game at 30Hz and at 60Hz. If we ran the game at 30Hz we could effectively double the quality of the graphics, which the art team obviously wanted so that they could do even greater stuff. But after blind testing we found that everyone noticed "something" was better about the game that ran at 60Hz. Reducing the latency between the button press and the jump allowed players to gauge the jump point more accurately. Reducing the latency of the joystick movements allowed the player to guide their landings more accurately.
One final note: maintaining a consistent frame rate is even more critical, because players have to know that when they press "x" they will get the same result.
Apple doesn't make specific mention of any additional safeguards. However, as we discovered through some research and testing of our own, it seems as though Apple went to the trouble of pairing every individual Touch ID sensor cable to each individual phone as well. That's an incredible feat, and it immediately raises the question — why?
While this will make replacing the button an Apple-exclusive (read: expensive) job, it seems that Apple wants to make it difficult for people to hack the system.
Link to Original Source
Here's one example, of many, that came up in my search:
Focusing on the Glass display is actually quite easy, and doesn't result in everything else becoming visually out of focus. This is because the focal point of the projected image is not on the surface of the Glass prism, but rather about 8 feet out from your current position. I'm nearsighted, and without my contacts the Glass display is unreadable, as my eyesight at 8 feet is all but a blur. However, even though your eye's focus may not change, the display doesn't create an image that fits within 3D space. It's still a 2D plane floating in 3D space, which results in you having to make a conscious decision about which plane to mentally focus on. Watching someone use Glass almost looks as if they're daydreaming; their eyes are locked to the position of the Glass display and it is tough to break their concentration.