1985 Usenet About Y2k

Anonymouse Cow writes "Here's a trip down memory lane (for some of you "oldsters"). Google's newsgroup archive has the first Usenet mention of the Y2K bug... in 1985! Quote: "I have a friend that raised an interesting question that I immediately tried to prove wrong. He is a programmer and has this notion that when we reach the year 2000, computers will not accept the new date." Check out the replies!"
  • Lisa: Well, look at the wonders of the computer age now.
    Homer: Wonders, Lisa, or blunders?
    Lisa: I think that was implied by what I said.
    Homer: Implied, Lisa, or implode?
  • Sssshhh... (Score:4, Insightful)

    by jukal ( 523582 ) on Friday August 02, 2002 @05:18PM (#4001680) Journal
    Yeah, the developers already back then knew that they planted a ...krrrhmm... a few little easter eggs, but we don't want to be unemployed... do we?
  • by delta407 ( 518868 ) <slashdot AT lerfjhax DOT com> on Friday August 02, 2002 @05:19PM (#4001682) Homepage
    Remember, right after January 1? The world didn't explode (it didn't even implode!), so a handful of people in the media started saying the whole thing was a hoax to drive cash into the technology sector.

    They have the nerve to say that even though I have a fax machine that says it's 8/2/19102.
    • Odd thing : Searching for numbers on Google

      19099 : 12,300 matches
      19100 : 531,000 matches
      19101 : 537,000 matches
      19102 : 518,000 matches
      19103 : 71,900 matches

      There's a massive number of systems out there still showing April 24th, 19102 at the top of the page. That's 2 1/2 years after the bug.

      Yeah, it was all a hoax and never affected any machine.
  • And now Y2038 (Score:5, Insightful)

    by shoppa ( 464619 ) on Friday August 02, 2002 @05:22PM (#4001707)
    Many of today's programmers are curiously nonchalant about Y2038, when Unix and other OS's that store the date in number of seconds since 1970 in a 32-bit signed quantity overflow and the date goes negative. The vast majority lump it into the somebody else's problem category, for one of several reasons:
    • They won't be around.
    • Surely the date field will expand to 64 bits by then.
    • They plan on making a lot of money 36 years from now.

    Almost all of these were uttered in that Google thread from 1985 about Y2K :-)

    Strangely, though, few seem to care that there are many file formats where the "automatic" kernel 64-bit date expansion they expect will be a problem. If the application expects that the date will always fit in that 32-bit field, and there's no obvious way to extend that field, then you have a lot of files which may no longer be useful...

    • by dananderson ( 1880 ) on Friday August 02, 2002 @05:39PM (#4001807) Homepage
      Fortunately, some people have thought it through. There's a proposed POSIX standard, xtime, to create a new time type, and new functions, to handle a 64 bit time type (in a 32 bit world!).

      The xtime struct contains:
      int_fast64_t sec;
      int_fast32_t nsec;

      In the 64-bit world, it's no problem--time_t is defined as a long long (64 bits).

      • by ford42 ( 90100 ) on Friday August 02, 2002 @06:11PM (#4001960)
        Yeah, but that just pushes the problem off, doesn't it? Instead of worrying about 2038, we would then have to worry about 584554531360! What are we going to do 584 billion years from now when 64-bit time runs out?

        Instead of following hare-brained schemes like this, I think we should look seriously at implementing RFC 2550 [].
      • Oh great, and what about the year 292279027178 problem?

        Short sighted idiots...

      • Using 64 bits for time_t is wasteful. It would be better to just use 33 bits for time_t. This would save space and push the Y2038 problem out a few more decades. ;)
    • It's definitely a concern for some people! At a previous job I wrote wrappers for all the time-related C library calls our code used, and made the application code use 64-bit time everywhere. Nowadays I mostly code in Java, which uses 64-bit time from the get-go (Java time is in milliseconds instead of seconds, but that's still a lot of headroom.)

      That said, I agree with the parent that there seems to be much less concern about the problem than there ought to be. The crazy thing is how long it's taking OS vendors to supply low-level 64-bit time system calls. If I could have used 64-bit time in my stat() calls and so forth, I would have started doing it years ago. But short of not looking at the clock or writing wrappers like I did, it's impossible to code a Y2038-proof application under some OSes even today, and on the OSes where it is possible, it usually takes some hunting to figure out how. Most vendors have tweaked their system calls to allow 64-bit file sizes, but for some incomprehensible reason they didn't move to larger time values while they were breaking the APIs anyway.

      Time representation is one place I think Microsoft got it right, actually. One of the Windows time formats is a floating-point value, the number of days since Jan. 1, 1900 if I recall correctly. This is great since it gives you sub-microsecond precision for the immediate future while allowing dates way off in the past or future.

      • I think Microsoft got it right, actually. One of the Windows time formats is a floating-point value, the number of days since Jan. 1, 1900 if I recall correctly.

        Yeah, that's a stunningly good idea. Make every date manipulation have to rely on floating point arithmetic, making things far slower than they need to be. How much did Intel pay them for that?

        Are you sure you don't mean fixed point? Even that would be over the top... If you're going to be using 8 or 10 bytes to store the date (as floating point uses on Intel), then you could store microseconds with 12 bits, milliseconds with 12 bits, seconds with 6 bits, days with 5, months with 4. That's only 27 bits, leaving 25 or 41 bits for the year - slightly more than we'll ever need (even if they're signed dates with -2^79 representing millions of years BC to microsecond resolution!)

        But generally, most people want the difference in dates/times, so subtracting them is usually best, so I'd say that just using a 64 bit integer is more than adequate.

        I think the worst example of using bitfields is Microsoft's date format in MS-DOS. The bitfield format makes doing any calculation far more complicated than is really necessary, and only provides 2-second resolution. It does allow dates up to 2107, though fortunately I'm going to live safe in the knowledge that no-one in the civilised world will still be using DOS then (unless 640k really is enough for someone).

    • Re:And now Y2038 (Score:4, Informative)

      by dananderson ( 1880 ) on Friday August 02, 2002 @05:53PM (#4001883) Homepage
      Interesting essay on the Y2038 problem, and probably human nature, at Roger Wilcox's Y2038 page, []
    • If you've thought about Y2.38K, then you might like JR Stockton's Critical and Significant Dates page []. I found it while rummaging through Google looking for info related to Steltor's CorporateTime UNIAPI_TIME time value from their API. (UNIAPI_TIME was a "weird" number, which turned out to minutes since their epoch -- 1/1/90. I couldn't find any info about it, so I "decoded" it myself with a tiny Perl script. In case anyone cares.)

      Anyway, Stockton's page had me occupied for a few good hours. It's quite a read. It has great stuff on it, like the base filedate for Windows "Last Modified" calculation, when 16-bit BSDs die, when NTFS fails, etc. LOTS of good dates there.

      I even submitted my newly-discovered UNIAPI_TIME epoch value. It was much more exciting that submitting my transmeta-based Gateway/AOL Webpad's BogoMips value to the BogoMips mini-HOWTO [].


  • by aengblom ( 123492 ) on Friday August 02, 2002 @05:23PM (#4001715) Homepage
    I don't know whether to gaze into the beauty of the formatted and edited messages or make prank calls to the phone numbers listed beneath them.

    Ahh the conflicted mind ;-)
  • Brilliant!...... (Score:4, Insightful)

    by Dr_Marvin_Monroe ( 550052 ) on Friday August 02, 2002 @05:24PM (#4001716)
    I've always suspected that people in 1979 were smarter than today, and NOW I have proof!

    Bug fix strategy for date roll-over...quoth message...

    "First, I modified the daily demand deposit program with code that checked for the date and about mid-1979 started printed warnings on the console of what would happen come new year. Then the systems analyst and I got new jobs. This is known as stepwise interactive development."

    It's funny to see that this problem was known at least 30 years before the Y2K hysteria....I hope that this is a lesson to all of you young programmers....

    "run away! away!..." Holy Grail...

  • Old news! (Score:3, Funny)

    by DaphunK ( 565928 ) on Friday August 02, 2002 @05:25PM (#4001723) Homepage
    Yeah. I think we've heard this one before...
  • by Jafa ( 75430 ) <jafa@mark a n t e> on Friday August 02, 2002 @05:26PM (#4001730) Homepage
    Man, I love reading these old threads. It's always a cool bit of memory lane, seeing the old email addresses (UUCP, ARPA), and the old but still familiar sigs. And the coolest thing is the lack of flames. When the one person in the thread who was an astronomer made a mistake on leap years, no one jumped at his throat. One person even says "So, he made a mistake. Who doesn't?" That would never happen that nicely today.

    Just some ramblings...

  • fools (Score:3, Funny)

    by natefaerber ( 143261 ) on Friday August 02, 2002 @05:28PM (#4001740)
    How naive. Little did they know that this would lead to total global chaos...Coke machines killing kids, toasters strangling people, and people using rusty bicycles as currency. You know...dogs and cats living together...the destruction of civilization as we know it.

    Oh wait, that didn't happen...I gotta go find that money I buried.
  • Things like Y2K won't be much of a problem in the future because (if you follow the BBC) we're bound to be destroyed by an asteroid in the next 50 years or so.

    FYI just announced today...Cool NERD clothing!!! []

  • Old news (Score:5, Interesting)

    by awptic ( 211411 ) < minus berry> on Friday August 02, 2002 @05:31PM (#4001762)
    This link is from Google's list of historically significant usenet posts; the complete list is at e_20.html []

    There's some really great ones in there, including Linus announcing Linux, Microsoft soliciting for new 'wizards', a thread about the chernobyl accident, and so on.
    • How about this [] little gem, from the first post to mention Revenge of the Jedi?

      I wish Lucas & Co. would get the thing going a little faster. I can't really imagine waiting until 1997 to see all nine parts of the Star Wars series.

      Heheh.. The other funny thing is that the post is by Randal Schwartz of Llama and Camel book fame. Hang in there Randal, you've almost made it to Episode 6! :)

  • Are you suggesting that people pull their money out of the banks on Dec 31, 1999? If so, then maybe you should suggest that people avoid the rush and grab it Dec 30, or maybe Dec 29, ....
    ...asks Landon C. Noll, nearly fifteen years before the US Treasury announces they will be printing more bills. Followed up by Bruce Adler:
    I seriously plan on closing my checking account several months before the end of the centuary and hiding all my cash under my mattress until all the smoke clears.
    So how many people actually did that, anyway?
  • These guys obviously had a grasp of the problem and understood how to avoid date problems in the future. They also understood the devastation that could ensue if dates were to go awry in software. But, as is human nature, did any of them do anything about the problems? I guess not, since 15 years later everyone was in a panic about Y2k. One guy even quit his job rather than fix a serious pending date problem in his system.

    Human nature: ignore problems until you can't.
    My nature: fix problems now, you'll be happier in the long run.
    My fate: get treated as a doomsayer/whiner.

    There is a cost to being proactive...
    • "These guys" were engineers.

      Business decisions are not made by engineers; they are made by the people who employ engineers.

      Business people with short term profit motives should not be confused with engineers having made or not made a decision to deal with the Y2K problem.

      UNIX currently faces a Y2038 problem with 32 bit signed seconds since the epoch, yet I don't see anyone paying people to proactively deal with that problem; do you?

      -- Terry
  • Imagine if Slashdot were looked back upon as one of the earliest mentions of the Y10k problem. None of those stupid programmers took into account 5 digits!!

    Oh well, I'm looking forward to dealing with 2038 myself. What is it? About mid January when it dies?

    • I have had the pleasure of working with software which took into account 5 digit years and failed to pass Y2K testing.

      The software was for an archaeological database and stored the year photos were taken as 2 digits, while other data was stored in a 5 digit year field representing BC, AD or BP. BP relates to carbon dating and is the number of years before 1950. 1950 is 0 BP.

      It really was an odd piece of software.
    • Well, by then robots will have overtaken the human race, so let's leave that particular bug in to get them back.
  • 15 years and... (Score:2, Informative)

    by DaphunK ( 565928 )
    We all still waited until the LAST minute to fix the bugs :) I know that the accounting software company that I work for was up very late many nights in December 1999, upgrading UNIX servers and program files so that the "world" would not come to an end in the Oil Marketers' pocketbooks.
  • by Skyshadow ( 508 ) on Friday August 02, 2002 @05:39PM (#4001802) Homepage
    This is supposed to be Usenet?

    But where is all the off-topic spam? Where are the trolls? Where is the porn? The flamers?

    This is clearly some sort of clever mock-up of Usenet and not the real thing. Frankly, given the omissions I've stated above, it's not even a very well-done imitation; I'm shocked the /. boys would be fooled by it.

  • ahh 1985 (Score:4, Funny)

    by Anonymous Coward on Friday August 02, 2002 @05:40PM (#4001812)
    Reading those messages just goes to further prove one of the infallible laws of humanity: the quality of spelling is inversely proportional to the availability of spell-checkers. Eh, Rob?

    Seriously, just LOOK at those posts. Proper grammar, proper punctuation. Hell, one guy even INDENTED the first line of a paragraph! Have you ever SEEN such madness?

  • Follow the 'highlights' link in this [] story.

    Scroll down to 1985.
  • There was a problem with dates or something in the year 2000?

  • bolles@reed.UUCP -- uucp

    Also notice, if you try to check out the cross linked posts..

    Group: net.bugs (This group is no longer archived)
    Group: net.flame (This group is no longer archived)
    Group: net.puzzle (This group is no longer archived)

  • wrong :) (Score:4, Informative)

    by Anonymous Coward on Friday August 02, 2002 @05:44PM (#4001831)
    The first mention of the y2k bug was at banks in 1975, when calculations on 25-year mortgages first ran into the problem.
  • My favorite post (Score:3, Flamebait)

    by Monkeyman334 ( 205694 ) on Friday August 02, 2002 @05:47PM (#4001846)
    Check this one out (my emphasis added):

    Some software blows up on dates at other times. I'm aware of some old
    DEC software (don't worry... you're NOT using it... it's single user!)
    that keeps the date year as a 5 bit offset from 1972. Let's see...
    1972+31=2003, so it blows up in 2004. Probably, tho, the display-a-year
    routine isn't written to handle beyond 31-dec-99, since no one expects
    that RT11 (oops, now I said it) will still be used then. I hope.
    Join the (Hopefully) Great Usenet Blackout 4/11/1985

    Alright, so maybe that wasn't in there. But wouldn't it just suck if someone 15 years from now posts a story about a 15 year old slashdot post to a huge newsite and all the people laugh at what huge dorks we were?
  • Oh, dear oh dear. Folks, there is an outside world out there and that world uses computers to do REAL STUFF. One of the "real stuff" things that computers do out there is to store data in files, both on tape and on disk.

    on tape and on disk

  • by prockcore ( 543967 ) on Friday August 02, 2002 @05:51PM (#4001867)
    One of the replies:

    "If you are really worried about timewrap breaking programs in subtle ways,
    then set your clock ahead now, and find the bugs. That will give you several
    years to fix them. If you are binary only, you might NEED several years
    to get you vendor to fix them!"

    See! Even in 1985, they understood that open-source bugs get fixed faster than proprietary software! :)
  • my favorite reply (Score:5, Insightful)

    by elmegil ( 12001 ) on Friday August 02, 2002 @05:52PM (#4001875) Homepage Journal
    I think, though, that IBM will get moving on this problem around the year 1995, if only so that the society on which they depend for profits will continue to exist.

    How prescient some people were back then :-)

  • Maybe Google should get some award for preservation of history? Imagine what kind of gems will turn up fifty or a hundred years from now.
  • Attitude (Score:4, Interesting)

    by Dirtside ( 91468 ) on Friday August 02, 2002 @05:57PM (#4001899) Journal
    It's interesting to note the fairly casual attitude everyone in the thread has toward this potential bug. Basically, they seem to be saying, "Yeah, it'll be an issue, I guess, but people will deal with it then, hey here's a funny story..."

    Not that there's anything wrong with that attitude, but it does indicate two things: One, that even hardcore geeks (i.e. people who had email addresses in 1985) can be complacent about things that seem a long way off (rather than fixing it long before it'll become a problem, as would be "ideal", for suitable definitions of ideal); and two, that computers were not the societally pervasive force that they've become in the last decade. A lot of the reason people didn't see the Y2K bug having that much potential impact that far in advance was because this kind of omnipresence of computers was just beginning. (In AD 1985, personal computerization was beginning...) These days, even an average Joe on the street would probably be astonished to hear that any kind of, say, large utility wasn't thoroughly computerized, but in 1985, such a revelation would have been met with mostly blank stares.
  • Gack...I feel old now. One of the posts in that thread was from me. Oh well, it's cool to know I participated in the first usenet Y2K discussion. :-)
  • by PsyQ ( 87838 ) on Friday August 02, 2002 @06:04PM (#4001922) Homepage
    This post is on Google's list of memorable posts []. It's the first mention of Star Wars, Episode 6 []. I think the probability that this is THE Randal L. Schwartz is very high.

    How cool is that? He even scores for quintuple Nerdhood by:

    1. Being on Usenet in 1982
    2. Having his Usenet post on Google's memorable postings list
    3. Being a Star Wars geek
    4. Being a Star Wars geek ON Usenet, IN 1982!
    5. Writing his own scripting language

    And who knows, maybe that page at Google was generated by HIS scripting language ;)
    • Phew, and no one noticed that this is the wrong Perl guy. He's still a Perl Jedi, but Randal's the one writing all the books, not the language. Sorry, Larry :(

      Guess I should've stayed in Python Land, where both the newbie books and the language are written by the same old Guido.
  • Slashdot: Are you planning to read Slashdot on August 17th 2002?

    Users: Probably not - it's a Saturday.

    Slashdot: Well if you do, whatever you do, don't read Slashdot on August 17th! The internal coding of "August 17th 2002" triggers a perl script that sends Cowboy Neal's entire Boy Band mp3 library to your e-mail account...

  • you know what was missing there? i didn't see anyone claim "first post"...

    seriously though, i think it was interesting that the majority of their discussion seemed to be focused on calculating whether or not 2000 was a leap year, rather than the fact that computers couldn't handle the year 2000 because they were only storing the last two digits representing the year, and not the century...

    also, noticed there was a lack of links to the "" website in the thread...

  • Just think that in a few years you will be able to refer to the year 2002 as aught-two! By the way the Websters Thesaurus also lists ought as an alternate spelling to aught.

    Yikes. The year is more than half over and I don't find this out 'til now. So much lost time!

  • "I think, though, that IBM will get moving on this problem around the
    year 1995, if only so that the society on which they depend for profits
    will continue to exist."
  • So I was perusing the articles in Google, came across the Cold Fusion [] post and some of the corresponding threads.

    Someone makes a point, "From cold fusion it's not a far step for 750 terrorist cells to begin making H-Bombs in their kitchen"

    Ironic that the H-Bombs are available first, eh?
  • by myawn ( 562028 ) <mike.theYawns@com> on Friday August 02, 2002 @06:27PM (#4002035) Homepage
    I worked in banking during the late 70s and early 80s, and we were well aware at the time that there was an issue with dates that would require changes to software before the year 2000.

    People seem to think that this was some unexpected oversight; it was nothing of the sort. Given the cost of storage at the time, and the millions of records that had to be stored with one or more date fields, it was a purely economic decision to save money at the time. I don't have the numbers needed to do the math, but I suspect it was actually the right choice. If you compare the cost of additional required storage to the eventual rework cost, discounting for time, maybe it doesn't look so stupid. Especially since many programs really did cease to be used before the problem arose (although probably far fewer than we would have predicted)

    We all joked at the time that, along about 1998 or 1999, we would take jobs in other industries until the changeover was complete.

  • Surprisingly Google doesn't even mention the prescient Dance Dance Revolution discussion here: UCP []

    Talk about a revolution.
  • I think it would be interesting to track down some of the participants from this thread (particularly Spencer L. Bolles, the originator) and get their viewpoints 17 years later.
  • "From: larry@extel.UUCP (larry@extel.UUCP)
    Subject: Re: Computer bugs in the year 2000
    Newsgroups: net.bugs
    View this article only
    Date: 1985-01-24 10:05:00 PST

    Another problem is that we have gotten into the habit of only using the
    last 2 digits of the year (look at your checkbook). Even worse is that
    some business software only allows a 2 character wide field for the
    date. Perhaps the designers did not expect their program to be in use
    in the year 2000 but I would not be suprised to see a considerable
    amount of 370 code running in the year 2000.

    Just think that in a few years you will be able to refer to the
    year 2002 as aught-two! By the way the Websters Thesaurus also lists
    ought as an alternate spelling to aught."

    Say what? Aught-two?? Does anyone here call it aught-two??
  • Bob Bemer (Score:4, Informative)

    by m_chan ( 95943 ) on Friday August 02, 2002 @06:48PM (#4002145) Homepage
    Bob Bemer [] is credited with the first world-wide publication of the Y2k problem.

    R.W.Bemer, "What's the Date?", Editorial, Honeywell Computer J. 5, No. 4, 205-208, 1971

    Here is a funny quote from him:
    Q: So whom do you blame?

    A: Richard Nixon.

    Q: What did he do?

    A: I proposed a national computer year back in 1970. I wanted to model it after the IGY [the International Geophysical Year was from July 1957 to December 1958]. I could see that people were not prepared for the influx of computer usage that was sure to come. I thought that if we all put our minds to it and planned ahead a little bit, maybe it would be easier. Year 2000 was just one of the issues we would have addressed.

    President Nixon was very suspicious of computers, though, and wouldn't sign off on it. Without his proclamation we couldn't do it. I think he'll go down in history along with King Canute.
    He has a rather impressive list of accomplishments to go along with those tidbits, including prior art [] for the British Telecom patent fiasco [].

    A pretty neat dude.
  • by medcalf ( 68293 ) on Friday August 02, 2002 @06:48PM (#4002150) Homepage
    That is the highest signal to noise ratio I've ever seen on USENET - and it was crossposted to net.flame!
  • y2038 (Score:3, Informative)

    by DunbarTheInept ( 764 ) on Friday August 02, 2002 @06:50PM (#4002160) Homepage
    I predict the y2038 problem won't take much effort to fix. Most (good) programs these days are designed without hardcoding the exact bytesize of things, and instead using system-supplied types. For example, we don't say:
    char timebuff[4]; /* 32 bits */
    *((int*)timebuff) = time(NULL);
    Instead we do stuff like this:
    time_t timebuff;
    timebuff = time(NULL);
    When the system type for time_t is changed to something with more than 32 bits, the code just needs a recompile and voilà - it handles dates past 2038. The work is going to be in making sure every program gets recompiled, and in converting saved files that have the date already stored in 32 bits. The ugly part will be if your system depends on third-party stuff in binary form only that you can't upgrade for whatever reason.

    Note, I didn't say the problem will be nonexistent, just that it will be easier to fix than y2k.

  • In one of the messages, a "Tim Smith" says:

    If you are really worried about timewrap breaking programs in subtle ways,
    then set your clock ahead now, and find the bugs. That will give you several
    years to fix them. If you are binary only, you might NEED several years
    to get you vendor to fix them! :-)

  • by nomadic ( 141991 )
    Maybe they just thought by the year 2000 we'd have evolved to a new stage of consciousness, and would live eternal lives as cosmic spirits of energy.

    It was the 70s, remember.
  • by Get Behind the Mule ( 61986 ) on Friday August 02, 2002 @08:36PM (#4002646)
    Here's where I get modded down for geezerness, but heavens to Betsy, Usenet was great back then. Back before the Internet exploded and innocence was lost.

    Here we see a Usenet thread, with thoughtful and interesting responses from knowledgeable, experienced people at universities and research institutes. No flame wars, no snot-nosed kids from AOL, no spamming, no hot grits or Natalie Portman, no ranting about how Usenet is a mysterious cabal of Illuminati scheming to rob our freedoms and kill our firstborn.

    I wasn't around in the nerdy, cliquish days of 1985 (I'm not that old!), but I did see the early 90's -- when Usenet was still a respectable hangout for serious and informative discussion -- dissolve into the mid 90's -- when all hell broke loose. It was exciting, and only logical, to see such a useful medium become so popular, but now the spammers and ranters and schemers have completely taken over. There are still a few pearls in there these days, but you have to go look for them in that enormous, stinking pile of shit.

    I used to use the 'vi' binding in 'nn', which gave me a full curses screen to type my posts. Now I type Slashdot comments in this puny little HTML textarea. What has the world come to?
