when the tulips start to bloom again.
College ain't supposed to be easy.
In CS, the killer is usually electromagnetism or calculus-level probability.
In Physics, it is usually diff-eq's.
In math, it is usually partial diff-eq's.
Yes, the exceptions to the "rules" in org-chem are maddening...but if they weren't, prescriptions and pharmaceuticals would be easy. Instead, they are rife with mistakes, side effects, false positives, and a lot worse, and if you don't have the background to understand at least to a degree why, then I'll be damned before I let you write me a prescription for anything.
But seriously, this is college leading to one of the toughest post-grad programs our society has to offer. It is supposed to be hard. Deal with it or get out.
You really don't get it: money spent on a product that adds no value to that product is money lost.
If you don't get that simple *fact*, I'm not going to bother saying anything more.
Which was my point, as I thought I made clear from the very beginning. Keeping up with software updates (which theoretically aren't supposed to change anything unless they make it better, even if that means an API difference to code against) is one thing. Generally, one can hold back a patch and coordinate it with some other release cycle.
Having to adapt to an arbitrary, non-technical mandate from a clueless government is something else, a very EXPENSIVE something else, since, as I noted, it adds no value to the product but costs as much as a large new feature would. And in this case, the deadline wasn't set by us either, but by the same government - we couldn't hold the patches back. Quite the opposite: we needed to get them in and tested ASAP, as customers using our app to project into the future needed that date information to be accurate.
My response (see the title this thread has had all along) was to the continual suggestions for getting rid of DST that happen twice a year, every year, right on schedule, by clueless dolts who think it is "easy" to do and wouldn't have any impact except some alleged positive one based on some analyst's arbitrary definition of "productivity". While getting rid of DST might eventually have a long-term benefit, it will still have an expensive short-term cost that needs to be carefully considered based on the experiences of 2005-2007.
Just having some database isn't enough in and of itself. One needs to be able to process it and interpret it correctly so that it "makes sense" to the business model one is using.
In addition, code that looks backwards in time (reporting engines, for example, or event history viewers) needs to apply the old rules as it looks back, while applying the current (and potentially new) rules looking forward, and has to know at what year/date some country made the decision to cut DST. This is one reason the TZ folder on Linux boxes (for example) is as large as it is.
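A minimal sketch of that back-versus-forward lookup, using Python's zoneinfo (which reads the same IANA tz database those Linux folders hold): the same mid-March wall-clock time gets a different UTC offset depending on which year's rules applied, because the 2007 US change moved the DST start from April to March.

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # reads the system/IANA tz database

ny = ZoneInfo("America/New_York")

# Same wall-clock date one year apart, straddling the 2007 US rule change:
before = datetime(2006, 3, 15, 12, 0, tzinfo=ny)  # old rule: DST began in April
after = datetime(2007, 3, 15, 12, 0, tzinfo=ny)   # new rule: DST began March 11

print(before.utcoffset() == timedelta(hours=-5))  # True - still standard time
print(after.utcoffset() == timedelta(hours=-4))   # True - already on DST
```

The library can only answer correctly because the tz database records every historical transition, which is exactly why it never gets smaller.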
No, we're not mission-critical like medical stuff, but that didn't stop the customer base from griping when the DST-handling code was broken, before I was assigned to fix it.
And this DST issue is somewhat different from what I was originally ranting about, which was the amount of work I was going through building test plans and cases to verify that the software I was working on at the time handled the DST change correctly, coordinating and tracking all of the OS and software patches (there were 10 updates to Java's JDK/JVM alone in a matter of 5 weeks, ALL related to this issue), and ensuring our stuff would work.
It isn't the best at all; I inherited this API. I'm trying to see if we can take the pressure off by having the API send a date-time string back instead of making the UI responsible for the epoch conversion, so the server (with much better resources) becomes responsible for it. As it is a published API that our customers also use, that is a pretty big change to make.
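The shape of that change is simple even if the API migration isn't. A sketch (the names and the sample epoch value here are mine, not our actual API): the server does the epoch conversion where the tz data is actually patched, and hands the client a fully resolved ISO-8601 string.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical values - not the real API, just the idea.
epoch_ms = 1172361600000                  # what the old API handed the UI raw
user_zone = ZoneInfo("America/New_York")  # assumed per-user preference

# Server side: resolve the instant against patched tz data, then render an
# ISO-8601 string with its offset so the UI never touches the conversion.
dt = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc).astimezone(user_zone)
print(dt.isoformat())  # 2007-02-24T19:00:00-05:00
```

The design win is that only one tier (the server) has to be kept in sync with tz patches, instead of every client installation.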
That said, I've been rather proud of what I've managed to achieve so far with the limited resources I've got.
THANK YOU! somebody else actually 'got it'.
Currently I am having to track and get our system up to date for several other countries' timezones that have changed their DST rules in the last 5 years, like Iraq and Egypt (both dropped DST entirely), plus trying to figure out, without actually coding a lunar calendar into my system, when Brazil postpones their DST end date by a week to avoid closing Carnaval early (yeah, lots of people don't know they do that...). (Oh, and you folks who think this stuff is easy: go ahead and, from scratch, calculate how many seconds past the UTC epoch (Jan 1970) it is to the "third Sunday of March".)
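Since I posed the challenge, here's roughly what the arithmetic looks like (a Python sketch; the helper name is mine, and note it deliberately dodges the real killer by working in UTC instead of a local zone with its own DST rules):

```python
from datetime import date, datetime, timezone

def third_sunday_of_march_epoch(year: int) -> int:
    """Seconds past the UTC epoch at 00:00 UTC on the third Sunday of March."""
    march1 = date(year, 3, 1)
    # weekday(): Monday=0 ... Sunday=6 -> days from March 1 to the first Sunday
    first_sunday = 1 + (6 - march1.weekday()) % 7
    third_sunday = first_sunday + 14
    return int(datetime(year, 3, third_sunday, tzinfo=timezone.utc).timestamp())

print(third_sunday_of_march_epoch(2024))  # 1710633600 -> 2024-03-17 00:00 UTC
```

And that's with library support for calendars and leap years already done for you; now imagine doing it against a rule set that a legislature can rewrite next session.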
I only brought up the damn 2038 thing as an aside. For crying out loud, the comment about it was in parentheses. Sheesh...
I have been talking about DST, the DST change, what it takes for a software firm to deal with it, and why each firm has to deal with it even if, for the most part, they get their DST information from patches from systems software vendors. And my primary damn point: when companies have to test and track their software against systems software updates that do (or don't) correctly give them the DST information they need for the timezones they care about, that is money spent, and when the government arbitrarily changed the rules with no idea how much work it really took for people like me to figure out the impacts on our systems, that was money wasted.
DST and TZ data is closely interrelated - when we changed, others didn't.
I know what the hell I've been talking about here, thank you.
You have utterly failed to see the point. You have absolutely no idea - I was talking about macro-econ, not micro. At the micro-econ level, it is just money spent.
At the macro-econ level, it is money spent that *provided no value*. Everybody ended up $5 billion poorer than before, but the products were, in effect, merely the same as they were before all this effort. Nothing was gained. Intellectual and economic power was wasted.
There is a difference, regardless of whether you care to see it or not (so far, you don't).
The point is that from the perspective of the company having to pay members of their IT department to work on the DST change, that money was going into an effort that would not, directly, lead to increased sales, increased profits, or increased interest from a marketing standpoint.
It is wasted money.
In the case of the major vendors, they had to do this effectively for every client they had even if they didn't have a support contract to do so. Their support teams "lost" money doing this work, relative to the number of customers that benefitted from the change.
This is in contrast to spending money making a new feature, or fixing REAL bugs, things that actually can increase sales. Billions were spent just to keep working at all, because of the ignorant decisions of lawmakers who have no idea how computers work.
You can try to look at it in the sense of, "well, those IT guys like you spent that income anyways", but that misses the point: when I create something for my company, I am in effect *inventing* money - I increase the value of my company's product, which in turn increases the value of the companies that purchase it by leading them to cost savings and increased profits, which get invested, etc. etc. This is the economic power of intellectual invention; it is why the whole system works (and why IT companies do so well in the stock market - it isn't a limited resource like how much coal there might happen to be under some mountain somewhere).
When that money is spent just to keep something functioning at all, there is no value added, only money lost. The stock market reflected that loss at the time with flatlined growth for almost four months 'til it could recover again. Nobody had any profits to feed into the market after that.
It wasn't $5b per year - it was $5b as a one-time expense. There's nothing more to do now that it is done (and admittedly most platforms made their systems more flexible so they could adapt to other changes, like Russia's DST-365, with much less effort on implementation and testing).
It is unrelated to the supposed claim of $2b per year in "lost productivity", whatever that means (translation: any discussion of 'productivity' requires defining what you mean by it, and nobody can agree on that when it comes to IT work).
never said they weren't. I just said that if you ask the corporate IT world to go through this hell of trying to update systems to reflect a large-scale American change all over again, they'll laugh at you...and give their 527 PAC money to your opponents in an instant, ensuring you'll be kicked out of office.
I never said 2038 has anything to do with DST. I said that it is a Y2K-like problem where very old data storage systems will run into a limit - a signed 32-bit count of seconds maxes out on January 19, 2038 - and be unable to figure out whether the next moment is in 2038 or back in January 1970.
And every computer still has configuration files for timezones and DST info. Just because a computer CAN sync up with a central server doesn't mean it always does, and the issue is not knowing what time it is NOW; it is software being able to properly schedule for time in the future or accurately reflect time in the past. I can't just send a "what time is it really" query to some central generic server when writing code for an event to take place on November 5th, 2013 (on the other side of the American switch, but long since past the European switch, and yet not quite into the South American switch... oh, and Brazil nicely changes theirs every year so that it never ends before Carnaval is over, did you know that?).
I need to be absolutely accurate to the time the user expects it to happen, or else when that day comes up, the event they wanted is an hour off. I have to write the code to do this and trust that the libraries I rely on are accurate in their DST information.
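The usual defensive pattern - and this is generic Python, not our actual codebase - is to store a future event as wall-clock time plus a named zone, and only resolve it to an absolute instant at the last moment, so a later tz-data patch corrects the scheduled instant for free:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Store the event the way the user stated it: wall clock + named zone.
stored = ("2013-11-05 09:00", "America/Sao_Paulo")  # hypothetical event

# Resolve to an absolute instant only when actually scheduling/firing,
# using whatever tz data is current at that moment.
wall = datetime.strptime(stored[0], "%Y-%m-%d %H:%M").replace(
    tzinfo=ZoneInfo(stored[1]))

print(wall.utcoffset() == timedelta(hours=-2))  # True - Brazil on DST by then
print(int(wall.timestamp()))                    # the epoch instant to fire at
```

Had I instead stored the epoch seconds up front and Brazil then moved its switch date, the event would fire an hour off, exactly the failure mode described above.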
And I have been writing software like this, in several languages, for much of the 20 years of my career so far.
How about you?
The disruption is all in IT. Computers don't just magically know what time it is. They have a counter that tells them the number of seconds since some arbitrary date (which happens to be Jan 1, 1970, meaning a signed 32-bit int is gonna run out in 2038 - the next "Y2K" problem).
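To make the 2038 bit concrete, a sketch that simulates what a signed 32-bit seconds counter does when it runs out:

```python
import struct
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit seconds counter can hold

last = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last)  # 2038-01-19 03:14:07+00:00 - the last representable second

# One tick later the counter overflows; reinterpret the wrapped bits as signed:
wrapped = struct.unpack("<i", struct.pack("<I", (INT32_MAX + 1) & 0xFFFFFFFF))[0]
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))  # 1901-12-13 20:45:52+00:00
```

Whether a given old system lands in 1901 or 1970 depends on how it mishandles the wrap, but either way it can no longer tell what day it is.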
Everything else they need to be programmed to know. Every DB has its own implementation for deciding this. Every VM (including the JVM) has its own. Every OS kernel has its own. Every custom DB environment (like, say, airline reservation and tracking systems) all have their own. Every power grid system has its own.
When the rules change, these ALL need to be updated to keep up, and (this is the important bit) they more or less need to be updated in sync with each other, because each layer may be asking the layer below what it thinks. If someone doesn't update, say, Windows XP ('cause MS won't push any patches for it), well, Java still needs to know and get it right, and so do Oracle and MySQL and PHP and any number of other things someone might be running on the XP box they can't afford to upgrade or replace (or why should they, it works just fine for what it is being used for, right?).

In the airline industry, not only do the airlines need to be correct within their own systems, they need to be assured they are correct with regard to the TSA's systems, and the FAA's systems, and every *separate and independent* airport's flight control tower (oh, and those all need to be correct with regard to each other as well). And that's before we even get into airline ticket purchase exchange systems, the means by which 3rd-party sites like Orbitz and Priceline get their tickets - they need to be sure they are getting the right information from every supplier of flights to their systems, or have their own means of helping the customer correct their flight reservations if, say, US Airways didn't get a patch right.
So huge IT suppliers like Oracle and Microsoft and HP all need to work to keep their software correct with the changes, while not breaking the current dates before the change is actually meant to take place. Huge and small IT-dependent businesses all need to keep up with all of those patches and test their products and services on them in order to be sure they are right.
Getting it wrong could be disastrous. We got it right in 2006...but we could also have just gotten lucky. Going through that again will be painful.
When the Bush-era change happened, I supervised the change in my company, having to track the dozens of updates to Windows, Java, and Oracle (often because each one had to incorporate a patch to detect whether one of the other two had not actually been patched). This amounted to basically $50,000 of my company's dollars wasted for no actual benefit - $50,000 just to say we still worked.
And the worst thing about it all was that even after all that money on our part, and on the part of Microsoft, Sun, and Oracle (who saw even less money relative to the efforts it took), nobody would be able to say 100% that it was "right". There still could have been one stupid little detail that would have gotten it wrong on the day of the switch or projecting forward to the switch-back.
Current estimates are that the DST change of 2005 cost the economy $5 billion in expenses *just to keep working at all* - that's $5 billion that wasn't spent on improvements, or new features, or anything actually giving new value to customers. It simply ceased to exist, for the illusion of savings in other markets (energy and retail) that never materialized.
And I still saw most of my local trick-or-treaters after dark, so the claimed extra hour of light for Halloween was also a pointless exercise.
One good suit is worth a thousand resumes.