This article just recounts the collective fantasy of some TV executives.
I dropped Firefox because it is built on the carcass of an ancient browser
And Chrome sprang fully-formed from the brow of its creator when they spake the word?
Chrome is based on a rendering engine that was originally created by Apple in 2001. WebKit was a fork of KHTML, which was at the time a small and cleanly written open source project. When WebKit began to impose crufty legacy problems on Chromium, a new fork (Blink) was created with the intention of excising the problem code. The Mozilla Foundation got too comfortable with Firefox, and it is losing relevance quickly.
If you'd ever bothered to actually pay attention to Mozilla's bug tracker or Firefox release notes then you'd understand how full of shit you really are. But who needs reality to get in the way of their fantasies?
My opinion is formed basically entirely on asinine bugtracker comments from core developers. One of my biggest peeves is the reluctance and downright refusal to consider moving forward from NPAPI, even though it is one of the biggest security risks for web browsing.
I can trust my Okular software to view it. Can *you* trust your software? No? Then why are you still using it?
Sounds like hubris.
things = 47;
What is that supposed to do?
A non-lazy programmer shouldn't subtract two timestamps from each other to get a duration, but should use a (self-written) function that can handle overflows.
I am not sure who you are even talking to. My response was to a smart-ass comment made by a user named fisted, who basically said that someone was a moron for suggesting that counters which will run for orders of magnitude longer (i.e. tens of thousands of millennia) are a pretty OK idea.
Nobody mentioned calculating duration besides you (in a perfectly sensible way, I might add). This is a smart answer to the question that it is an answer to, but a really kind of silly answer to a question that it is not an answer to.
And you have to change a lot of variables to long: every temp var that holds a timestamp. If you miss a single one, you're screwed.
Yes, the program would have to be implemented without error in order to not have an error; that is a tautology. Pragmatically: use a statically typed language, don't change anything later, and use the correct type when implementing the program the first time.
What would a non-lazy programmer use instead? An arbitrary precision int or something? Can you think of any downsides to that approach?
63 bits for a nanosecond counter gives 292 years.
My post was not about nanoseconds, it was about milliseconds.
If you did the math, you don't need excess space. If you need excess space, you're just shifting the day of failure into the future. Yes, perhaps far enough, but still.
What math would you do to determine exactly how high a counter should count?
Would using a 64-bit long on a millisecond counter be lazy programming?