
100 Years of Macintosh

Zero seconds on the Mac OS system clock is January 1, 1904. The Mac OS epoch hits 100 years ... now. That's assuming you live in the Pacific time zone, anyway: the Mac OS epoch is unique in that it is time zone-specific. Of course, none of this applies unless you are running Mac OS, and all you Mac users are using Mac OS X, right? (Geek note: the Mac OS epoch is unsigned, which is why it can count over 100 years from 0 seconds, and 32-bit Unix can't, though it can count backward to 1901.)
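
Here's a quick back-of-the-envelope check of those ranges in C; nothing below is a Mac or Unix API, just the raw arithmetic, using a mean Gregorian year as an approximation:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        double year = 365.2425 * 86400.0;  /* mean Gregorian year, in seconds */

        /* Unsigned 32-bit seconds (classic Mac OS, counted from 1904-01-01 local time) */
        printf("unsigned 32-bit span: about %.1f years\n", UINT32_MAX / year);

        /* Signed 32-bit seconds (32-bit Unix time_t, counted from 1970-01-01 UTC) */
        printf("signed 32-bit span:   about +/- %.1f years\n", INT32_MAX / year);
        return 0;
    }

That works out to roughly 136 years forward for the unsigned Mac counter, and about 68 years either side of 1970 for signed 32-bit Unix time.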
  • by green pizza ( 159161 ) on Thursday January 01, 2004 @04:04AM (#7851591) Homepage
    I've been using my older Mac all evening (I know, boring life). Right now it claims it's "2:01:22 AM 1/1/2004". Seems to be OK to me.
    • by Concerned Onlooker ( 473481 ) on Thursday January 01, 2004 @04:07AM (#7851605) Homepage Journal
      You're lucky. I'm on OS X and my computer just asked me to go outside to replace the AE35 unit.

    • Fixed long ago (Score:2, Informative)

      by Anonymous Coward
      The Classic Mac OS epoch limit was fixed quite some time ago; I believe it was around System 8.6. Since that version, Classic Mac OS has been able to work with any date in the range from 20,000 BC to 30,000 AD.
      • by Anonymous Coward
        "Cool," you say. Then you ask, "But 29,940 AD? Who cares about that?"

        As with many things, the answer should be obvious: time travelers. While the mainstream press seems to have, once again, missed a great Apple story, it can no longer be kept secret: the Macintosh is the preferred computer of time travelers everywhere. Or everywhen. Or at least everywhen across a span of sixty millennia.
      • Re:Fixed long ago (Score:4, Informative)

        by Trillan ( 597339 ) on Thursday January 01, 2004 @06:59AM (#7851933) Homepage Journal

        This article and all comments seem to be a little twisted.

        What's an epoch in this context? An epoch for dates is usually the year after which the entered year is assumed to be the next century rather than the previous one.

        For Macs, this has varied over the years with different software releases.

        The other way to look at it might be the date it "rolls over." But second 4,294,967,295 doesn't arrive for something like 35 years. I think it's in 2040, but I'm not entirely sure. I haven't had to deal with it in a while. :)

        The only significance of today's date is that it's 100 years after time 0.

        (And, of course, there are other APIs available on the Macintosh that won't break even then.)
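
        A quick sanity check on the 2040 guess above (plain C; this assumes a 64-bit time_t and ignores the Mac clock's local-time quirk by working in UTC):

            #include <stdio.h>
            #include <time.h>

            int main(void) {
                /* 2,082,844,800 s separate the Mac epoch (1904-01-01) from the
                 * Unix epoch (1970-01-01), so the last second an unsigned 32-bit
                 * Mac clock can represent is, in Unix terms: */
                time_t rollover = 4294967295LL - 2082844800LL;

                char buf[32];
                strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&rollover));
                printf("32-bit Mac clock runs out at %s\n", buf);  /* 2040-02-06 06:28:15 */
                return 0;
            }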

      • Great. Now I have to worry about the Y30K problem.
    • I have this ancient version running on a Quadra 610, and after checking "Show century" in the Date & Time cdev, it showed that it was indeed in 2004 and not 1904.

      (Given that my Apple IIGS got through Y2K without a hiccup, I'm not particularly surprised that there were no issues with newer hardware either.)

  • by Anonymous Coward on Thursday January 01, 2004 @04:07AM (#7851600)
    I will remove the PRAM battery from my LC II temporarily and boot it up, resetting its internal clock, in commemoration of this event.
  • It seems that with Apple's other projects, they stand a good shot at digging themselves out of the niche they carved out long ago. Since Apple models itself as a hardware company, do they offer patches on a similar basis as Microsoft, or do they rely more on the BSD patching system?
    • by KrispyKringle ( 672903 ) on Thursday January 01, 2004 @04:23AM (#7851654)
      OK.

      1) Who says they model themselves as a hardware company? Companies that do both hardware and the software that runs on it are common in enterprise computing (Sun, IBM, SGI, etc). Would you say these companies have little software experience because they are hardware companies? Apple is very much the consumer equivalent of these; they make hardware and software woven very tightly together; the idea behind a Mac is not that you get superior hardware or superior software, but that you get a package. And that in being a cohesive package, it is superior, almost inherently, to a hodgepodge of off-the-shelf components (much like Sun's claim that Solaris is the best OS for Sparc, or SGI and IBM with IRIX and AIX (which are both perhaps on the way out, in favor of custom Linux distros)).

      2) Yes, Apple patches are offered about as promptly as Microsoft's (which is to say, perhaps not as timely as they should be). I've seen plenty of reports on Bugtraq of Apple being unresponsive to reported bugs, but then I've seen the same with MS. Presumably, they simply didn't take the issue seriously or deemed it unworthy of addressing for some other reason (which leads us back to just how trustworthy your computing really is, if you can't trust the company that designed it).

      3) What ``BSD patching system''? I'm pretty well experienced with administering Open and FreeBSD, and I am totally unaware of some patching system inherent to all BSD-derived OSes (say, Solaris?). Both Open and Free have similar pkg and port systems, but this is more because Open liked the way Free did it, not because they are both BSDs (that is, BSD refers to the underlying OS components--as opposed to, say, GNU--not anything else (certainly not the kernel, which, on OS X, is Mach-, not FreeBSD-based)). I think you are confused.

      • Oh, yeah. And OpenDarwin provides FreeBSD-style ports, while Fink provides apt and packages based on the .deb format from Debian.
      • "Presumably, they simply didn't take the issue seriously or deemed it unworthy of addressing for some other reason (which leads us back to just how trustworthy your computing really is, if you can't trust the company that designed it)."

        The oft neglected third option is that there's a long list of things to do ahead of a given defect. There are only so many programming monkeys at Microsoft or Apple working on code. In other words: A neglected defect is not automatically an indication that a company is ev
        • Perfectly true, of course. Some amount of bugs and holes are to be expected. But it seems to me that software companies are held to far looser standards than, say, automobile companies. And I think this relates largely to the relative age of each industry.

          People take it for granted that cars work reliably, just as they take it for granted that computers don't. Back when I started using PCs around the time of Windows 3.1, I took it for granted that errors occurred (actually, I remember, though perhaps ina

          • by NanoGator ( 522640 ) on Thursday January 01, 2004 @08:58AM (#7852178) Homepage Journal
            "So too, opinion seems to be that security holes are entirely the fault of the attacker, never of the software designers. ... The point is, if software companies were liable for any serious defects, they might try harder. And if they were liable for ignoring those defects, I betcha they'd be able to find someone to get to work on it... We, the public simply need to weigh in with some careful legislation to balance those priorities with stability, reliability, and maturity. "

            I'm sorry, but I simply don't agree with this point of view. Your heart is in the right place, but this is not the answer.

            First, the hacker *is* guilty. Software is designed for a specific purpose (even general-purpose software), and because of that, the creator of that software cannot and should not be held liable when it is abused. Problem #1 is that software is written by humans, who are, by nature, error-prone. Problem #2 is that finding defects and using them maliciously requires creativity. Because of this, there's no practical way for a software company to know that their software is 'liability free'. Problem #3 is that there are far too many products out on the marketplace today that can be misused in such a way that a simple modification would prevent that sort of behaviour from happening. Why single out software? Problem #4 is that in cyberspace, monetary damage is very difficult to measure. Problem #5 is that the environments that the software is run on are far too diverse to guarantee any sort of working order. As such, anybody 'relying' on a computer system would be incredibly ignorant without ways of minimizing damage due to loss of functionality or data. (I should pause here a sec to let you know that I'm quite fatigued, and I apologize if what I'm posting is difficult to read.)

            Secondly, unloading legislation that says you are liable for an attack that somebody else carries out simply because you didn't cover all your bases is going to do more harm than good. The Open Source Community will be hit the hardest. Who would want to contribute spare time to a project only to open the door for being sued because somebody decides to be a git? I mentioned in an earlier point that there's no real scientific way to certify the 'safety' of software. The only real way to approach that would be heavy testing on a very diverse range of platforms and configurations. I can see Microsoft with their 25+ billion in the bank doing this, but I can't see a startup company doing that. Nor can I see that startup company surviving their first lawsuit over this. The only way to minimize this negative effect on the industry would be to tightly define very specific rules about very specific exploits, such as the one you mentioned with Apple. Well, what good is this legislation going to do if it only covers a limited scope? Okay, I'm drifting a bit here. Sorry. I just don't see this doing anything but making software development less accessible, and making megacorps like Microsoft stronger. Software could become 'less exploitable', but the cost of that is growth. Even then, defects will not disappear. BS like the Blaster Worm will still happen, it just might take a little longer.

            Third, how does one even begin to define effective legislation here? In order to prevent a defect from being exploitable, one has to know every single way that defect can be used. I remember back in the Windows 95 days, you could rename your Windows folder. Doing so meant instantly breaking your system. A shortcut or batch file could be made to do this. If somebody sends out an email tricking people into running a shortcut to do this, how do you define Microsoft's guilt due to damage done? The rename feature works perfectly. Using it to rename your Windows folder is like cruising down the highway at 70mph and shifting into reverse. Sure, the car could be made to prevent that, but why would somebody do that in the first place? Should Honda be partly responsible because of deaths caused by somebody saying "
            • I didn't argue for statutory negligence here (i.e., legislated guilt rather than decided by a judge and jury). Rather, the common law definition of negligence works fine.

              In such an instance, Microsoft would not be liable for simply making a mistake as a ``reasonable person'' is apt to do (or, as you said, humans are error-prone). But they would be liable for spending millions on advertising Trusted Computing without actually doing anything in the way of R&D (I don't actually know if they've done anyt

    • by green pizza ( 159161 ) on Thursday January 01, 2004 @04:25AM (#7851663) Homepage
      Since Apple models itself as a hardware company, do they offer patches on a similar basis as Microsoft, or do they rely more on the BSD patching system?

      Closer to Microsoft than anything else. Apple's patches generally come in the form of installer applications that can be downloaded and installed automatically via the bundled "Software Update" application (GUI and command line) or can be downloaded and installed manually from the support section of their website.

      Apple does not publish the source of any of their GUI applications or the GUI framework itself. It does however release the source to the rest of the OS under the name "Darwin". Patches and other updates to Mac OS X generally find their way into Darwin and can be browsed at http://developer.apple.com/darwin.

      The typical artist/writer/mom-or-dad user can click a couple buttons and have OS X update itself (or even set it to always keep itself updated). More technical users can browse the Darwin website for more details. (This was recently done by several folks wanting to know more about how Panther, Mac OS X 10.3, does its automatic defragmentation and optimizing. They dug around in the Darwin source until they found that particular part of the HFS+ architecture, examined the code, and made a few posts explaining the process to everyone else.)
      • The typical artist/writer/mom-or-dad user can click a couple buttons and have OS X update itself (or even set it to always keep itself updated).

        Correction: the typical artist/writer/mom-or-dad user leaves the default settings, so his/her OS X updates itself every week. You don't need to "set it", it's set by default; you have to "click a couple of buttons" to disable it.

        Actually, I'm not so sure it's a good idea to make it the default - I wonder what will happen if a "typical artist/writer/mom-or-
  • by account_deleted ( 4530225 ) on Thursday January 01, 2004 @04:09AM (#7851614)
    Comment removed based on user account deletion
  • by emerrill ( 110518 ) on Thursday January 01, 2004 @04:10AM (#7851617)
    This post doesn't have a real point, and isn't based on an article. It is just stating that today marks 100 years from the point that Macs count from. Nothing bad happens from it; it can still count for another 30-ish years (I believe).
  • by OttoM ( 467655 ) on Thursday January 01, 2004 @04:11AM (#7851618)
    The article confuses epoch and ticks. The epoch is a fixed point in time. Ticks are the number of seconds (or some other time unit) since the epoch.
    • Working with date/times is hard enough without having to worry about how it is stored. Why not store it as a text string and be done with it, especially with the huge amounts of RAM and processor speed we have these days?
      • Because not everyone uses the same format? 16283723 might not be human readable, but it at least means something. "Jan 17 1987" is just arbitrary text that can't easily be manipulated or sorted.
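
        For instance, a tiny sketch using the standard C library's Unix-epoch clock (the number is arbitrary; the point is that the integer is the canonical value and the text is just one rendering of it):

            #include <stdio.h>
            #include <time.h>

            int main(void) {
                time_t t = 16283723;             /* seconds past the Unix epoch */
                char buf[32];
                strftime(buf, sizeof buf, "%b %d %Y %H:%M:%S", gmtime(&t));
                printf("%ld -> %s UTC\n", (long)t, buf);

                /* Ordering and arithmetic are plain integer operations. */
                time_t tomorrow = t + 86400;
                printf("tomorrow > today? %s\n", tomorrow > t ? "yes" : "no");
                return 0;
            }
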
      • Know your rules of normalization. Storing time is similar to storing a field in a database. The simplest way to achieve that is by using the number of seconds from the epoch. If you write it as YYYY-MM-DD, then you are duplicating information that can be derived more simply.
        • Know your rules of normalization. Storing time is similar to storing a field in a database. The simplest way to achieve that is by using the number of seconds from the epoch. If you write it as YYYY-MM-DD, then you are duplicating information that can be derived more simply.

          This has nothing to do with database normalization.

          If you count the seconds since the epoch, you are limiting yourself to those date/times which occur *after* the epoch date/time. With a text format, you have no such limitation. I'
    • I thought ticks are 60ths of a second since boot.
  • Oh Yeah?! (Score:4, Funny)

    by dupper ( 470576 ) on Thursday January 01, 2004 @04:11AM (#7851620) Journal
    Well I set the arbitrary starting time on my OS to January 1st, 1804, so take that MacOS: The dupperOS epoch hits 200 years... 3h14m ago.

    Nya, nya!

  • by catbutt ( 469582 ) on Thursday January 01, 2004 @04:12AM (#7851626)
    Fresnel lens has a small scratch, and vacuum tube port is broken, but otherwise mint. Best offer.
  • Ugh. (Score:5, Insightful)

    by Feztaa ( 633745 ) on Thursday January 01, 2004 @04:19AM (#7851648) Homepage
    Mac OS epoch is unique in that it is time zone-specific.

    It is unique, in the sense that it is crappy.

    On Unix, the epoch is an extremely well-defined moment in time, so any point in time measured in epoch-seconds is also extremely well-defined.

    On the Mac, the epoch-seconds depends on the time zone, meaning that in order for a measurement of time in macos-epoch-seconds to be meaningful, you also need to know the time zone. To me, that kind of ruins the whole point...
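
    To make that concrete, here's a sketch; the conversion helper is hypothetical, but the constant is the exact number of seconds from 1904-01-01 to 1970-01-01:

        #include <stdio.h>

        /* Seconds between the Mac epoch (1904-01-01, local time) and the
         * Unix epoch (1970-01-01, UTC): 24107 days * 86400. */
        #define MAC_TO_UNIX_OFFSET 2082844800UL

        /* Hypothetical helper: a Mac timestamp only pins down an absolute
         * moment once you also know the recording machine's UTC offset. */
        long mac_to_unix(unsigned long mac_secs, long utc_offset_secs) {
            return (long)(mac_secs - MAC_TO_UNIX_OFFSET) - utc_offset_secs;
        }

        int main(void) {
            unsigned long midnight_2004 = 3155760000UL;  /* 100 years of Mac seconds */
            printf("recorded in GMT: Unix time %ld\n", mac_to_unix(midnight_2004, 0));
            printf("recorded in PST: Unix time %ld\n", mac_to_unix(midnight_2004, -8 * 3600));
            return 0;
        }

    Same Mac value, two different absolute moments, depending on a piece of information the timestamp itself doesn't carry.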
    • Re:Ugh. (Score:5, Interesting)

      by Bruce Perens ( 3872 ) * <bruce@perens.com> on Thursday January 01, 2004 @04:41AM (#7851700) Homepage Journal
      The epoch was a well-defined moment in time until leap-seconds happened, and Unix ignored them. POSIX perpetuates that error. As a result, the epoch keeps moving.

      Bruce
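
      For reference, the POSIX rule in question, as a minimal sketch: every day is exactly 86,400 seconds, so the leap seconds inserted since 1972 (about 22 of them as of this writing, the last at the end of 1998) simply aren't counted.

          #include <stdio.h>

          /* POSIX "seconds since the Epoch": every calendar day is exactly
           * 86,400 seconds, so inserted leap seconds are silently absorbed. */
          long posix_time(long days_since_1970, int hh, int mm, int ss) {
              return days_since_1970 * 86400L + hh * 3600L + mm * 60L + ss;
          }

          int main(void) {
              /* 2004-01-01 00:00:00 UTC is 12,418 days after 1970-01-01. */
              printf("%ld\n", posix_time(12418, 0, 0, 0));  /* 1072915200 */
              return 0;
          }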

      • Well, since modern evidence shows that leap second(s) might not have been needed after all [slashdot.org], maybe Unix wasn't so wrong in ignoring them =)
      • Re:Ugh. (Score:5, Funny)

        by Red_Winestain ( 243346 ) on Thursday January 01, 2004 @08:48AM (#7852148)
        Well, there hasn't been a leap second since 1999. There won't be one this year. Has the planet finally caught up with Unix?

        Reference [usatoday.com]

    • Re:Ugh. (Score:3, Interesting)

      by gellenburg ( 61212 )
      You're obviously forgetting that GMT (the time zone in which the UNIX epoch originated) is a time zone in and of itself.

      Sheesh.

      You kids now-a-days.

      (Note - UNIX does not use UTC since UTC incorporates leap seconds, which UNIX & POSIX do not honor.)
  • by Nova Express ( 100383 ) <lawrenceperson@@@gmail...com> on Thursday January 01, 2004 @04:28AM (#7851670) Homepage Journal
    Mainly because I have files on my current Mac (a Dual 1 GHz G4) that were present on my Mac Plus hard drive when it crashed in 1991, and they read:

    Dec. 31, 1903, 6:00 PM

    Which may be the default for the Central time zone.

    Do I really need those files anymore? Well sure! Some of them are old entries for the Bulwer-Lytton Contest [sjsu.edu], and you never know when I'll have enough to collect for a section of a short story collection. Plus, you know that as soon as I throw away a file, I'll need it the next day. That's just how things work.

    This is one of the many, many reasons why I've gone from a 60 Meg to a 60 Gig hard drive. ;-)

  • um.. OK.. (Score:5, Interesting)

    by Phroggy ( 441 ) * <slashdot3@ p h roggy.com> on Thursday January 01, 2004 @04:30AM (#7851678) Homepage
    Note that there's nothing particularly special about hitting 100 years after epoch, being that 100 years is not a technically interesting length of time and the epoch being 1/1/1904 isn't non-technically interesting.

    A technically interesting length of time (such as 2^32 seconds) from epoch would be noteworthy, but that's a few decades off.

    A non-technically interesting length of time (such as 20 years) from the date the Macintosh was first introduced would also be noteworthy, and that's later this month I believe.

    I'm a bit tired; did anyone grok that?
    • Re:um.. OK.. (Score:3, Interesting)


      A non-technically interesting length of time (such as 20 years) from the date the Macintosh was first introduced would also be noteworthy, and that's later this month I believe.

      That is indeed later this month, dated from the 1984 Super Bowl, when the Apple Super Bowl commercial aired. And there are some rumors that Apple will air it again, during the 2004 Super Bowl, to get some of that old-time feeling back.
  • Hardware clock (Score:3, Informative)

    by norwoodites ( 226775 ) <pinskia AT gmail DOT com> on Thursday January 01, 2004 @04:39AM (#7851696) Journal
    Actually, all Macs are defined that way: the hardware clock itself is defined that way.
    Little-known fact (or maybe widely known): almost all Macs will reset to January 1, 1969 if the battery is removed.
  • Ha! (Score:5, Funny)

    by NanoGator ( 522640 ) on Thursday January 01, 2004 @04:52AM (#7851728) Homepage Journal
    Ha! Us Windows users don't have this problem. Microsoft won't let us use a Windows OS that old! *SmUG*
  • Why did these people pick these various epochs? Why 1904? Why 1970? Why is unix going to have (?) problems in 2038?
    • by axxackall ( 579006 ) on Thursday January 01, 2004 @05:38AM (#7851807) Homepage Journal
      I agree all those epochs are too random, including the birthday of Jesus Christ. IMHO the only meaningful and universal epoch is the time of the Big Bang. All time should be counted from that.
      • Once we nail it down, an unsigned 64-bit int should fit that nicely, with a factor of ten breathing room just in case we're off a bit, or if seventeen billion years is just too soon for another y2k bug.

        (The universe is somewhere between 2^59 and 2^61 seconds old.)

        If time was constant everywhere in the universe, you could assign 295,147,905,179,352,825,856 IPv6 addresses to every second. Since it ain't, I'm not sure it's useful to count from the moment the quantum sock that is our universe turned inside o
        • If time was constant everywhere in the universe, you could assign 295,147,905,179,352,825,856 IPv6 addresses to every second.

          First, I thought we were talking about computer clocks, not IP address space problems.

          Second, what's wrong with assigning IPv6 addresses every second *even* when time is not constant everywhere?

          Third, sunrise comes at a different moment at each point on Earth, because the Earth is rotating. However, we have so-called Universal time, which is the zero point for all the other time zones. In the same way the ag

    • Re:Picking Epochs (Score:2, Informative)

      by BinaryOpty ( 736955 )
      The 2038 problems are going to arise from the integer used to store the Unix time. The maximum value that the signed, four-byte Unix integer can reach is 2^16-1, and so when you put that into seconds from 1/1/1970 (The Unix Epoch time) you end up somewhere near January 2038 (leap-seconds and such will throw it off) when the variable will reach its highest value and then reset to zero, essentially setting time back to 1970. The same will happen with the Mac variable at around the same time.
      • Hopefully that was just too many drinks for the New Year and not a troll. 2^16-1, which corresponds to an unsigned 2-byte int, wouldn't even last for one day. INT_MAX, assuming a four-byte integer, is 2^31-1. When the variable reaches its max value, it will wrap to -2^31. Depending on how functions like ctime are implemented, this may work just fine until the start of the 22nd century, set the date back to late 1901, or cause programs to display garbage data or even crash. It will definitely not set the date to 1970, which would corr
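
        A sketch of what happens at the boundary, assuming a 64-bit time_t (so the arithmetic itself can't overflow) and a gmtime() that accepts negative values, which most do:

            #include <stdio.h>
            #include <stdint.h>
            #include <time.h>

            int main(void) {
                char buf[32];

                time_t last = INT32_MAX;             /* 2^31 - 1 seconds after 1970 */
                strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&last));
                printf("last 32-bit second: %s UTC\n", buf);   /* 2038-01-19 03:14:07 */

                time_t wrapped = INT32_MIN;          /* what the counter wraps to */
                strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&wrapped));
                printf("after the wrap:     %s UTC\n", buf);   /* 1901-12-13 20:45:52 */
                return 0;
            }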
  • Palm OS too (Score:4, Interesting)

    by Imperator ( 17614 ) <slashdot2.omershenker@net> on Thursday January 01, 2004 @05:16AM (#7851767)
    Palm OS also uses 1904 as 0. I don't know about Macs, but I do know that the DateType structure uses a 7-bit field for the year, so 2027 will be the end of the world for Palm handhelds.
  • by soft_guy ( 534437 ) on Thursday January 01, 2004 @06:05AM (#7851847)
    The Macintosh traditionally measured time for most purposes in seconds since midnight, January 1st, 1904. The call to get this value is GetDateTime(), which takes a pointer to an unsigned long and returns the number of seconds by assigning the value to the argument.

    Unlike what the article says, GetDateTime() is still available under the Carbon framework in Mac OS X. However, there are now other ways of dealing with date/time in the Mac OS. Ironically, the preferred method, CFDate, is also available under Mac OS 9. So I don't really get the point of the write-up saying that this works only in Mac OS 9.

    Frankly this is of little interest to anyone who is not a Macintosh programmer - and only mild interest to those of us who are Macintosh programmers.

    It is interesting to note that the Apple Newton also measures time from this reference point. However, its default date-handling routines measure minutes since 1904 instead of seconds. On the Newton they had no real reason for picking that reference date other than that the Mac already used it.

    On the original Mac, they did have a good reason for picking it. Apparently 1904 is the first leap year of the 20th century, and starting there simplified the algorithm for factoring in leap years. Since they were trying to shoehorn a graphical OS onto a 128 KB machine with no hard drive (but they did have some ROMs), you can't really fault them for taking a few shortcuts.
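
    Roughly, the shortcut looks like this (a sketch, not the actual Toolbox code):

        #include <stdio.h>

        /* With years counted from 1904, "divisible by 4" is the entire
         * leap-year rule until 2100, because 2000 is a leap year anyway. */
        static int is_leap(unsigned years_since_1904) {
            return years_since_1904 % 4 == 0;   /* valid for 1904 through 2099 */
        }

        int main(void) {
            /* 1904 -> leap, 1907 -> not, 2000 -> leap, 2004 -> leap */
            printf("%d %d %d %d\n", is_leap(0), is_leap(3), is_leap(96), is_leap(100));
            return 0;
        }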
    • Since they were trying to shoehorn a graphical OS onto a 128 KB machine with no hard drive (but they did have some ROMs), you can't really fault them for taking a few shortcuts.

      IIRC, they were trying to shoehorn a graphical OS onto a 64 KB machine. At the very last minute, the RAM was doubled. But Andy and the gang had already pulled off a miracle.

    • Frankly this is of little interest to anyone who is not a Macintosh programmer

      Not entirely. Users of Microsoft Excel across Mac and Windows platforms at least used to have to compensate for the 1904 (Mac) or 1900 (Win) date systems when copying data. It was a major pain to always have to add or subtract 1462 days to get the dates to work properly.
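
      For reference, 1462 is the documented offset between the two systems' serial numbers for the same calendar date; a sketch of the adjustment (the serial shown is 2004-01-01 in the 1900 system):

          #include <stdio.h>

          /* Excel stores dates as day counts ("serials"). The Windows workbook
           * default counts from the 1900 system, the old Mac default from the
           * 1904 system, and the documented offset between the two is 1462 days. */
          #define EXCEL_1900_TO_1904 1462.0

          int main(void) {
              double win_serial = 37987.0;   /* 2004-01-01 in the 1900 date system */
              double mac_serial = win_serial - EXCEL_1900_TO_1904;
              printf("1900-system %.0f == 1904-system %.0f\n", win_serial, mac_serial);
              return 0;
          }
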

  • by Trurl's Machine ( 651488 ) on Thursday January 01, 2004 @06:24AM (#7851872) Journal
    Actually, in just three weeks there will be a real anniversary of the introduction of the Macintosh [lowendmac.com] - January 24th, 1984.
    • So I suppose the release of the Twentieth Anniversary Mac [lowendmac.com] in 1997 was due to some bug in the OS?

      Yes, I know it was the twentieth anniversary of Apple, the company, but isn't the name a little ambiguous?
      • Yes, it is ambiguous to those who don't know what they are talking about, which, in the case of Apple history 101, is quite a large group.

        However, there were Apple computers made before the "Macintosh" line of computers was released (nearly 20 years ago to the day). There were both Lisa computers (with a GUI) and plain Apple computers (not Apple Macintosh computers).
  • by Anonymous Coward on Thursday January 01, 2004 @06:59AM (#7851931)
    Related comic [macboy.com]
    :)
  • by tuxedobob ( 582913 ) <<tuxedobob> <at> <mac.com>> on Thursday January 01, 2004 @07:58AM (#7852042)
    1. There is no article.
    2. The story is cool anyway.
    3. Most of the comments (in my threshold, 1+, anyway) are actually funny.
    4. Pudge posted this on time. This means that either he a) lives in PST and spent midnight posting this, or b) lives elsewhere and stayed up so he could post this.
  • by Anonymous Coward

    and all you Mac users are using Mac OS X, right?

    No, actually. You forgot that OS X is optimised for the G4 architecture and newer. Even a fast G3 box is often brought to its knees by Jaguar due to its lack of specific hardware features. OS 9 is not dead: that is Apple marketing hype. Sure, it's becoming more of a niche platform, and eventually the market will drive it to being a "retro platform" or whatever, but that's another couple of years at least. But it's preferred if you don't have a particular need for a
    • Try Panther. The performance optimizations on my iBook 500 have been nothing short of phenomenal.

      Personally I could never bring myself to using Macs before OS X simply because they were so different to everything else on the market at that time. OS X bridges the divide and still lets me get my work done with the ease of use of Mac OS X and the fantastic development environment brought about by Unix and Cocoa.

  • by rockwood ( 141675 ) on Thursday January 01, 2004 @12:24PM (#7852821) Homepage Journal
    The time and date corresponding to 0 in an operating system's clock and timestamp values. Under most Unix versions the epoch is 00:00:00 GMT, January 1, 1970; under VMS, it's 00:00:00 of November 17, 1858 (base date of the US Naval Observatory's ephemerides); on a Macintosh, it's the midnight beginning January 1, 1904. System time is measured in seconds or ticks past the epoch. Weird problems may ensue when the clock wraps around, which is not necessarily a rare event; on systems counting 10 ticks per second, a signed 32-bit count of ticks is good only for 6.8 years. The 1-tick-per-second clock of Unix is good only until January 18, 2038, assuming at least some software continues to consider it signed and that word lengths don't increase by then.

    Wall time is the `real world' time (what the clock on the wall shows), as opposed to the system clock's idea of time. The real running time of a program, as opposed to the number of ticks required to execute it (on a timesharing system these always differ, as no one program gets all the ticks, and on multiprocessor systems with good thread support one may get more processor time than real time).

    Wrap around: when a counter starts over at zero or at `minus infinity' (see infinity) after its maximum value has been reached, and continues incrementing, either because it is programmed to do so or because of an overflow (as when a car's odometer starts over at 0).
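
    To put numbers on those wrap-around figures (a quick C check; the 60-ticks-per-second line is the classic Mac TickCount rate mentioned elsewhere in this thread):

        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            double year = 365.2425 * 86400.0;   /* mean Gregorian year, in seconds */

            printf("signed 32-bit at 10 ticks/s:   %.1f years\n", INT32_MAX / 10.0 / year);
            printf("unsigned 32-bit at 60 ticks/s: %.2f years\n", UINT32_MAX / 60.0 / year);
            return 0;
        }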


    • Weird problems may ensue when the clock wraps around, which is not necessarily a rare event; on systems counting 10 ticks per second, a signed 32-bit count of ticks is good only for 6.8 years. The 1-tick-per-second clock of Unix is good only until January 18, 2038, assuming at least some software continues to consider it signed and that word lengths don't increase by then.


      You're confusing the epoch datetime and "ticks," at least on the Macintosh.

      The classic Mac OS had the normal epoch seconds time (which is wh
  • My LC I is running quite fine under v7.0. I just created a folder and it reported the date as Thu, Jan 1, 2004.
  • Comment removed based on user account deletion
  • pudge sez: "(Geek note: the Mac OS epoch is unsigned, which is why it can count over 100 years from 0 seconds, and 32-bit Unix can't, though it can count backward to 1901.)"

    What a shame. Mac users obviously weren't able to participate on the net prior to 1904. Well, at least there are archives like Google Groups where they can read what they missed.

    BTW, the Apple II has the same calendar scheme as the Mac. My GS's calendar is good through 2038.
