"100% Accuracy" implies a positional error of zero meters (to infinite decimal places), which is obviously not what they're talking about.
I caught that, too. But really, "percent" doesn't even make sense as a unit of accuracy, does it? Unless it's meant as a fractional error, in which case I'd take it to mean that if you ask for a relative move of x, you'll get something in the range (0, 2x), or maybe (0.5x, 1.5x)? I mean, on the nano scale that's still kind of remarkable, but as you've pointed out, it's just not what they mean.
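To make that arithmetic concrete (a sketch of this reading, assuming "percent accuracy" means a fractional error bound \epsilon on the commanded move, which is my interpretation and not anything the vendor states):

    x' = x(1 \pm \epsilon), \quad x' \in [\, x(1 - \epsilon),\ x(1 + \epsilon) \,]

With \epsilon = 1 (a "100%" error bound) that interval is (0, 2x); with \epsilon = 0.5 (a "50%" bound) it's (0.5x, 1.5x).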
Having pissed-off employees who feel chained to their workstations (and consequently horribly unmotivated) can also be a pretty big cost.

IT is part of a business. Making IT's job harder in that business costs money. The article is making the point that there are some pretty serious cons to using laptops, and these need to be considered as part of their cost.
They are called computers simply because computation is the only significant job that has so far been given to them.