An anonymous reader writes "Ever see those Windows vs. Linux uptime graphs? Netcraft has some well-known ones comparing Apache and IIS web servers. Well, they're about the most inaccurate measurement of uptime available to mankind and should be ignored entirely when comparing the uptime you can expect from a particular operating system. The reason? Microsoft forces you to reboot your server after applying security updates. So if Microsoft releases a critical update four times a year, the uptime of a conscientiously patched Windows server is capped at roughly three months. Quick and easy math; no graphs needed. What would a proper measurement of uptime look like? Is Microsoft guilty of falsely advertising its products as reliable? Could that be grounds for misrepresentation?"
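The submitter's arithmetic is easy to sanity-check. Here's a minimal sketch of the back-of-the-envelope calculation; note the four-updates-a-year cadence is the submitter's assumption, not measured Microsoft release data, and real patch schedules vary:

    # Upper bound on continuous uptime when every critical update
    # forces a reboot. The patch cadence below is the hypothetical
    # figure from the submission, not actual release statistics.
    critical_updates_per_year = 4

    max_uptime_months = 12 / critical_updates_per_year
    print(f"Uptime between forced reboots caps out at {max_uptime_months:.0f} months")

Under that assumption the ceiling is three months of continuous uptime, which is the submitter's point: a Netcraft-style uptime graph then measures patch discipline as much as OS reliability.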