Oldest Supported Software?

Dave Santek writes "In development since the early 1970s, the McIDAS [Man computer Interactive Data Access System] software celebrated its 30th anniversary in October 2003. McIDAS is used to integrate and visualize weather information. The software was originally run on a Datacraft /5 and has gone through 4 major hardware configuration changes over the last 30 years. It is a supported software package that remains in use at more than 100 locations worldwide. A history of the first 25 years (pdf) is available. A freeware version of the software is also available."
This discussion has been archived. No new comments can be posted.

  • by i_want_you_to_throw_ ( 559379 ) * on Saturday December 20, 2003 @05:14PM (#7774840) Journal
    Air Traffic Control Software: Automated Radar Terminal System (ARTS). The Plan View Displays (PVDs, by Raytheon) in the early '70s had an anticipated lifetime of 1015 years; those in the centers today are now at least 10 years past this estimate.

    These displays fail regularly--according to controllers and technicians. At each en route center, which may have 30 to 60 PVDs in operation, it is not unusual to replace two to four of these units a day. When a PVD goes dark, the controller at that station rushes to another screen and urges the controller there to alter his or her display to include aircraft previously tracked on the failed display.

    PVDs slipping out of adjustment also cause the size and clarity of the alphanumeric type they display to vary--fuzzy type makes controllers confuse 3s and 8s, which can lead to errors, an Indianapolis controller told IEEE Spectrum. And the units themselves are unstable. Their aging ceramic connectors are brittle and falling apart. Insulation on the wires is brittle, too. The vibration caused in moving a display, as is necessary when a replacement must be brought in, often disables it when fragile connections are broken.

    Meanwhile, the Host and ARTS computers that drive the displays are problematically obsolete as well. The Host computer computes radar tracks, maintains a database of flight plans, and issues safety warnings--such as a conflict alert, when two craft are in danger of violating separation standards, and a minimum safe altitude warning, when an aircraft is at risk of hitting terrain. It contains half a million lines of Jovial code and assembly language that was first installed in 1972 and ported from IBM 9020 onto IBM 3083 computers, starting in 1985.

    But Host has at most only 16MB of RAM, a serious limitation. And it badly needs replacing. (The ARTS computers in the Tracons are also severely limited in memory, but those are scheduled for replacement.) "The Host software is our biggest problem," a controller from Chicago told Spectrum. "There are so many patches, no one knows how it works. We can't change anything; no one dares touch it, because if we break it, we're gone."

    In the mid-'80s, a multibillion dollar effort was started to update both the en route centers and the Tracons by replacing their displays and computers with networked workstations. (Airport towers use feeds from Tracon computers for radar tracking of airborne craft; they use separate surface-monitoring equipment for aircraft on the pavement.) That 10-year effort failed and has, for the most part, been abandoned. Called the Advanced Automation System, the program was sunk by unrealistic specifications and human factors difficulties, among other problems. New efforts to help controllers and pilots are under way, but have yet to make an impact on the present system. Here's the rest of the story from MIT []

    If you look in the latest Linux Journal though you'll see that Linux has made inroads in this area [].
    • by DougM ( 175616 ) on Saturday December 20, 2003 @05:27PM (#7774942)
      in the early '70s had an anticipated lifetime of 1015 years; those in the centers today are now at least 10 years past this estimate.

      I'm not surprised they're failing if they're at least 1025 years old!!

      Seriously, though, this is the kind of situation that really scares me. Rarely is such a problem solved with money alone -- a project of this scale and importance needs to be supported by the best.

      The IT industry is characterised by too many enthusiasts and too few professionals.

      • by darkonc ( 47285 )
        in the early '70s had an anticipated lifetime of 1015 years;

        I originally thought of this as a troll, but people are marking it insightful rather than funny.

        That should have been 10~15 years. It was a cut and paste from the referenced article, but the paste of the non-ASCII character was eaten by Slashdot's ever-helpful (not) filters. (sigh).

        Ah, for the days when 'security' meant telling people "don't do this (oops)".

    • JOVIAL--that brings back some memories. Did a report on it as an undergrad. Is there a publicly available compiler anywhere?
    • I went to an interesting talk by a guy named Philippe Kruchten from Rational (or should I say IBM Rational now?), who was heading their Vancouver division studying software engineering. From my understanding he is the father of the RUP (Rational Unified Process).

      Canada basically had the same problem as the US with an aging air traffic control system, and at about the same time they started doing mostly the same thing. A few years later they discovered that they were not advancing much, so they called in Kruchten, who at the time worked as a consultant, and he is supposed to have basically turned the whole project around and completed it.

      Anyway, the interesting point of his talk was iterative vs. waterfall processes. At first everyone thought that the "waterfall" approach was right: first write a good specification, then code, then test, then release. But it was discovered that it didn't work. So Kruchten basically transformed the project to use iterative techniques, where they would go through three-month cycles of specify, code, test, and repeat until the project was completed.

      The part that I don't understand is: why aren't Americans buying the Canadian system?
      • The part that I don't understand is: why aren't Americans buying the Canadian system?

        Count the number of large scale commercial airports in Canada. Do the same for the US. Now, count the # of flights that pass in/out of each of those airports in Canada on a given day. Do the same for the US.

        The flight congestion problem in the US is literally orders of magnitude greater than Canada. An air traffic control system that was created for Canadian airports and the average levels of flight congestion for thos

        • by Jetson ( 176002 ) on Saturday December 20, 2003 @08:10PM (#7775939) Homepage
          Count the number of large scale commercial airports in Canada. Do the same for the US. Now, count the # of flights that pass in/out of each of those airports in Canada on a given day. Do the same for the US.

          You guys have more airports and more aircraft but also more sectors and more controllers. The net effect is that the number of flights handled at any one display is roughly constant (and limited by human capabilities).

          The real reason the FAA isn't using the Canadian solution is that it's not complete. As I mentioned elsewhere in this thread, we are replacing systems one component at a time using emulation on modern hardware. Our components aren't interchangeable with your components due to differences in system architecture. They might do well to consider following our approach to the problem, but I doubt the resulting systems will ever converge.

    • by Jetson ( 176002 ) on Saturday December 20, 2003 @07:52PM (#7775824) Homepage
      Canada started working on its replacements long after the FAA but it looks like we'll get there sooner. Like the FAA, we initially contracted a system called CAATS that would have done most of the things proposed by AAS. Somewhere in the mid-90's it became obvious that the proposal was pie-in-the-sky and the contractors would never be able to deliver -- every time the players sat down to review the situation they ended up reducing the goals of the project and delaying the acceptance date.

      Since the existing systems were starting to fall apart we switched to off-the-shelf systems (HP Unix boxes with Sony 2kx2k displays) running software that emulated the old vector tube displays. The main computers were essentially unaware that they were talking to modern hardware. The privatized ATC system also started rewriting the host software to run under Unix, and will be replacing the old hardware in 2004-2005. At that point we will have a new host emulating an old host, talking to new displays emulating old displays.

      Once a bug-for-bug clone is operational, we will be able to look at modernizing the software to take advantage of the increased computing power available. The original CAATS project has been scaled back to the point where it's little more than a shim layer that manipulates data passing between the host and the displays.

      The British have already purchased a few ATC support systems from Canada and are considering more of them. Since they are running on similar hardware, there's a good chance that we will see common software running on both sides of the Atlantic by the end of the decade.

      The FAA has looked at some of our systems. As the parent post said, however, they no longer know how their own system works and are terrified at the prospect of changing just a portion lest the whole house of cards falls down.

      BTW, with reference to the topic at hand, we are just now replacing our 30-year-old ATC weather system. The original OIDS system ran on an Interdata-70 system with core storage and tape I/O. The only significant changes in the last 30 years were the switch to TTL memory and the addition of a floppy controller (that simulates a tape device). We still boot the machine using the binary switches on the front panel.... The replacement system runs on a network of NT4 machines and has been installed at about half of our facilities. I'm hoping the old system is donated to a computer museum.
    • There has been a lot done to upgrade the Terminal software in the last 8 years, and it continues to be updated. Since the FDADs at the New York TRACON were failing, they were updated with a new color display called an ARTS Color Display []

      The old ARTS computers in some of the TRACONs have now been updated with new off-the-shelf hardware and software that was converted from the old software that ran on the IOPs. This system whic

    • I have worked on "Host", or NAS as it is formally known (National Airspace System), in the UK (so yes, I know JOVIAL and BAL assembler). It's REALLY fun programming with nearly every variable being global, using Hollerith instead of ASCII, being limited to 7 characters for variable names, and missing many basic programming constructs (no while loop, for example). JOVIAL's memory overlaying techniques were ahead of their time though, and are probably the reason these old systems have been able to keep up pe
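For readers who have never seen a language without structured loops, here is the kind of IF/GOTO pattern such code forces on you, sketched in C for readability rather than actual JOVIAL syntax:

```c
/* Illustration of the IF/GOTO pattern forced on programmers when a
 * language has no while loop. Written in C for readability; actual
 * JOVIAL syntax differs. */
int sum_below(int limit) {
    int i = 0, total = 0;
top:
    if (i >= limit) goto done;   /* the loop test, written by hand */
    total += i;
    i++;
    goto top;                    /* jump back, as a while would */
done:
    return total;
}
```

Multiply that by half a million lines of global variables and you can see why nobody dares touch it.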
  • Strictly speaking, it qualifies as "old" since let's be real - Microsoft software hasn't changed much since the late Eighties.
    • The software may not have changed much, but the support sure has. If you are a corporation and you make an operating system (the OS, the very thing the computer needs to run applications), there is no excuse for dropping support for it. Ever.

      Yes, I know of places where Windows 3.1 is still used (legacy database, anyone?), and problems still arise. Even in 2003, I have troubleshot Windows 98, 95 and 3.1. And I'm not trying to be all high and mighty about Open Source: Red Hat is putting their customers through the same bullshit.

      To make the all too common analogy, if you have a car, and 5 years from now it breaks down, you bring it to a mechanic, he says "sorry, this model isn't supported anymore, time to upgrade!", what the hell do you say to that? The practice of software companies dropping support for their products is ridiculous. If you're going to make something, do it right, don't pussyfoot around making a good product, and at least have the balls to admit to your mistake and fix it when the shit jumps off. Screw you all, software engineers. Where the hell is my abacus?
      • by AntiOrganic ( 650691 ) on Saturday December 20, 2003 @05:52PM (#7775079) Homepage
        To make the all too common analogy, if you have a car, and 5 years from now it breaks down, you bring it to a mechanic, he says "sorry, this model isn't supported anymore, time to upgrade!", what the hell do you say to that? The practice of software companies dropping support for their products is ridiculous. If you're going to make something, do it right, don't pussyfoot around making a good product, and at least have the balls to admit to your mistake and fix it when the shit jumps off. Screw you all, software engineers. Where the hell is my abacus?

        That's a terrible analogy. If your car breaks down five years after you bought it, and you return it to the dealer, do you know what he's going to say? "You only have a five-year warranty on parts and labor. Go find a mechanic." The mechanic is much more akin to service-oriented companies like Progeny who are offering commercial support for products that have been EOLed.
        • And by commercial products, you mean products like Red Hat that are open source but also happen to be sold... Where is my service mechanic to fix the bug in Windows 3.1? And when I find him, how exactly is he gonna patch a system he can't get the code to?

          Third party support of a closed source application is like having a mechanic try to service your car without opening the hood. Virtually impossible.
      • by zakezuke ( 229119 ) on Saturday December 20, 2003 @06:20PM (#7775231)
        To make the all too common analogy, if you have a car, and 5 years from now it breaks down, you bring it to a mechanic, he says "sorry, this model isn't supported anymore, time to upgrade!", what the hell do you say to that?

        It depends on what you mean by break down. If you're talking routine maintenance, that would suck. If you are talking about a major failure, such as an engine or transmission, then it would be wise to evaluate the cost of a new car, and the cost of a used car, vs. the cost of fixing the old one. In my case, I have a 1998 Nissan Sentra, which I believe blue-books for about $4500 with a trade-in value of $3000. A good mechanic would take this into account.

        But as far as cars vs. computers go, every computer year is equivalent to 10 years in cars. Based on this logic a good mechanic wouldn't waste their time: "dude, it's not worth it, time to buy another one". You can either accept that answer, or reject it if you really love that specific car.

        Computers are a little different. When the cost of supporting older stuff gets too high, a wise person considers an upgrade or replacement. However, I take STRONG exception to cases where the software is still good and useful, but the company folded and the copyrights are owned by some bank somewhere that couldn't care less about actually selling the rights, with the result that you can't buy a necessary add-on because no one sells it.
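The mechanic's calculus described above boils down to comparing the repair bill against what the car is worth. A toy version of that rule of thumb (the 50% threshold is an arbitrary assumption, not an industry standard):

```c
/* Toy version of the mechanic's rule of thumb: repair only if the
 * cost stays under some fraction of the car's blue-book value. The
 * 50% threshold is an arbitrary assumption, not an industry standard. */
int worth_repairing(double repair_cost, double blue_book_value) {
    return repair_cost < 0.5 * blue_book_value;
}
```

By the "one computer year is ten car years" rule, the value side of that comparison collapses so fast that the answer is almost always "buy another one".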

      • Ford will not support any Ford car that is older than 7 years. A buddy of mine wanted to get the AC fixed and the Ford dealer pointed him to another, unauthorized place.
      • I drive a 1960 Ford. When I go to Pep Boys, and their books only go back to '65 or whatever, that is pretty much what I get... They tell me to look up a specialty store that handles old parts. Maybe someone could make a business out of supporting that old MS crap. On the same note, why doesn't MS release the code freely if they no longer support or sell the product? I know that when I bought my first Win95 upgrade, it didn't come with an expiration date on the box.
      • I don't know what point you're trying to make about MS software. MS-DOS, Win 3.x, Win 95, and NT Workstation 3.x stopped being supported by Microsoft 11 days shy of 2 years ago. NT Workstation 4.x stopped being supported in June of this year, and Windows 98 enters its unsupported phase in approximately three weeks.

        Don't play this holier than thou, " there is no excuse for dropping support for it. Ever." game. Software (especially operating systems) get EOS'd and EOL'd for good reasons. They're de
    • by zakezuke ( 229119 ) on Saturday December 20, 2003 @05:58PM (#7775114)
      Strictly speaking, it qualifies as "old" since let's be real - Microsoft software hasn't changed much since the late Eighties.

      Sure it has! They have changed "edit" > "options" to "view" > "options" in the pull-down menus. In Win95 there was a "find" function that has since changed to "search", however F is still the hot key for it. And the names of their products have changed as well: Windows Explorer, Internet Explorer, Microsoft Messenger, Windows Messenger. Lots and lots of changes.

      Microsoft - Now where did my documents go today?
    • Are you kidding?

      There is a big difference between DOS and Windows Server 2003.

      MS SQL Server 1.0 for OS/2 was a joke when it came out back then. Today it's one of the most scalable databases in existence, challenging Oracle and DB2.

      MS Word 2.0 for OS/2 and DOS is nothing compared to Office XP with VBA support.

      I could go on and on.

  • by Anonymous Coward on Saturday December 20, 2003 @05:19PM (#7774893)
    ...and still unsurpassed.
    • by Zork the Almighty ( 599344 ) on Saturday December 20, 2003 @05:25PM (#7774933) Journal
      Sadly TeX is being replaced by (what else?) Microsoft Word. Not for scientific documents yet, but in businesses and governments around the world people struggle to get page references and a proper index out of Microsoft Word. Those poor, damned souls.
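For contrast, the page references and index those poor souls struggle with take only a few lines in LaTeX. A minimal sketch (the index additionally requires running the makeindex tool between compiles):

```latex
\documentclass{article}
\usepackage{makeidx}
\makeindex
\begin{document}
\section{Results}\label{sec:results}
An important point.\index{important point}

See Section~\ref{sec:results} on page~\pageref{sec:results}.
\printindex
\end{document}
```

The cross-reference and the index entry both update themselves on every recompile, which is exactly the part Word users end up redoing by hand.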
      • by WillAdams ( 45638 ) on Saturday December 20, 2003 @06:16PM (#7775212) Homepage
        The situation isn't as simple or straightforward as that, and in many ways it's much worse (TeX documents taken by a publisher, poured into Word for copyediting, then typeset in Quark with all equations reset using proprietary XTensions such as PowerMath or York Graphics' XMath).

        Open-source software is in many ways catching up with and surpassing Word. LyX, a ``What You See Is What You Mean'' document processor, is one of the most promising and innovative; it's actually used by some compositors to make LaTeX documents accessible to mere mortals so that they may then be typeset using the publisher's style. Let me know what you think of Kaplan's _Introduction to Scientific Computation_, just released ;)

      • This is only true in certain areas. As an academic, I read many documents from the most diverse sources, and I can tell you for a fact that in my field (Economics) Word is rapidly fading away and everyone is using LaTeX (at least, people under 50 ;-)).

        It used to be 50/50 just 2-3 years ago, but if you go check working paper repositories like IDEAS [], which is LARGE, you'll notice that most recent working papers are written using some version of LaTeX.

        Apart from the inherent qualities of LaTeX, it's just a

    • by TedCheshireAcad ( 311748 ) <> on Saturday December 20, 2003 @05:45PM (#7775033) Homepage
      TeX has survived for so long, and will continue to thrive, because someone put some fucking thought into the design instead of into the ship date. If you write software, have the balls to make it good, don't be a pussy.
    • Is it just me, or does anybody else find TeX hard to compile and get set up? I've spent who knows how long screwing around with metafont trying to get the thing working. Also, I know there must be a way to choose a better font, but everybody who uses TeX seems to use the same set of (butt ugly) fonts for everything. It's rather telling that most tech people can instantly tell when something was typeset in TeX.
      • Also, I know there must be a way to choose a better font, but everybody who uses TeX seems to use the same set of (butt ugly) fonts for everything. It's rather telling that most tech people can instantly tell when something was typeset in TeX.

        You mean Computer Modern []. It's the font that Donald Knuth designed to be used together with TeX and METAFONT. I guess it's a matter of taste whether you like it or not, but at least it's a well designed font and serves its purpose. People tend to use it since it's th
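For what it's worth, swapping out Computer Modern takes one line. This sketch uses the `mathptmx` package from the standard PSNFSS bundle (already common by 2003) to get Times-like text with matching math:

```latex
\documentclass{article}
% mathptmx (from the standard PSNFSS bundle) swaps Computer Modern
% for Times-like text and matching math fonts in one line.
\usepackage{mathptmx}
\begin{document}
This paragraph is no longer set in Computer Modern.
\end{document}
```

Most documents look "TeX-ish" simply because nobody bothers to add that one line.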

    • So is ED. Where would we be without that wonderful editor? Emacs/vi cannot touch it!

  • by Exiler ( 589908 ) on Saturday December 20, 2003 @05:20PM (#7774900)
    There's an Asteroids machine in this pizza place down the street, that count?
    • Try getting Atari to support your Asteroids machine. Actually, try getting anybody at Atari on the phone that even remembers what Asteroids is. It'll be difficult for sure.
      • Re:What about... (Score:3, Informative)

        by x136 ( 513282 )
        Especially considering that the Atari that made Asteroids is long dead. The company called "Atari" now isn't actually Atari in any form other than the name. Infogrames bought the rights to the Atari name from whoever the last company to own the name was, and changed their name to Atari to try to acquire some brand recognition.

        The Atari we all remember is long gone. The company in its somewhat original form was torn apart some time after the Atari Jaguar tanked, IIRC.

        (BTW, I'd like to know where this pizza
    • Right here []
  • by f1ipf10p ( 676890 ) on Saturday December 20, 2003 @05:24PM (#7774921)
    I still sometimes key in the code for Lunar Lander on my 1975 HP-25 RPN calculator...

    That is the oldest software I support ;)
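For the curious, the heart of those calculator lunar-lander games is a tiny physics loop. This is an illustrative sketch in C, not the actual HP-25 program; the constants and units are made up:

```c
/* Minimal lunar-lander physics step in the spirit of the old
 * calculator game; constants and units are illustrative, not taken
 * from the actual HP-25 listing. */
typedef struct {
    double alt;    /* altitude above the surface */
    double vel;    /* downward velocity */
    double fuel;   /* fuel remaining */
} Lander;

/* Advance one time step: gravity pulls down, burning fuel pushes up. */
Lander step(Lander s, double burn) {
    const double g = 1.6;              /* lunar gravity, roughly m/s^2 */
    if (burn > s.fuel) burn = s.fuel;  /* can't burn what you don't have */
    s.vel += g - burn;                 /* net downward acceleration */
    s.alt -= s.vel;
    s.fuel -= burn;
    return s;
}
```

The game is just this step repeated, with the player picking `burn` each turn and hoping `vel` is small when `alt` hits zero.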
  • by glomph ( 2644 ) on Saturday December 20, 2003 @05:24PM (#7774923) Homepage Journal
    #include <stdio.h>
    int main(void) { printf("Hello World!\n"); return 0; }
  • IDRS (Score:5, Informative)

    by Grech ( 106925 ) on Saturday December 20, 2003 @05:24PM (#7774926) Homepage
    The Integrated Data Retrieval System has been part of the American tax administration since the mid 60s. It's not 40 years old yet, but it probably will be before it is replaced.
  • by HotNeedleOfInquiry ( 598897 ) on Saturday December 20, 2003 @05:25PM (#7774932)
    Much of the CTS aka BRC aka Sequoia Systems software was originally written to count punch cards on mainframes. The mainframes were replaced by minis, the minis by PCs, the PCs by embedded micros. All the while, the original elections coding software was ported/translated to each successive generation of machine.

    Contrary to what many slashdot readers seem to think, election coding is non-trivial, encompassing variations in laws and tradition in virtually every county of every state in the US. Since execution time is not an issue, and accuracy is, emulation and translation make lots of sense.

  • PDF? (Score:2, Funny)

    Man, my TRS-80 won't open PDFs. Damn you Adobe!!!
  • by SexyKellyOsbourne ( 606860 ) on Saturday December 20, 2003 @05:37PM (#7774981) Journal
    NASA still runs software, to this day, from the 1960s due to funding cuts and the fact that it "still works," much of it on the same computers from the 1960s.

    In fact, most of this software is so old it actually can no longer be maintained because the people who wrote it are DEAD, and modern programmers who replaced the retirees can't make heads or tails of all the spaghetti code within. There's all kinds of fascinating data from the golden age of space exploration that we could still use, but it's in proprietary, decayed backup formats in proprietary structures.

    If they started using Linux and open standards now, though, 30-40 years from now, they won't be having this problem, as Linux will still be around then -- and the rest will be in the dustbin of history.
    • I like Linux as much as the next person, but there is no way of knowing it will still be in use 30-40 years from now. It certainly will have changed very much by then. I doubt that somebody off the street will be able to read code from a Linux system that is 40 years old just because they have experience with a "modern Linux" of 40 years in the future.
    • NASA still runs software, to this day, from the 1960s due to funding cuts and the fact that it "still works," much of it on the same computers from the 1960s.

      Nothing of any importance, though. MCC-H (Mission Control - Houston) was upgraded and revamped about a decade ago. KSC Launch Control was completely replaced with a UNIX-based system less than five years ago. The ISS control center is (IIRC) about eight years old now... And that's about the sum total of large important systems at NASA.
  • by cfallin ( 596080 ) on Saturday December 20, 2003 @05:38PM (#7774986) Homepage
    30 years is a long time for a software project to evolve. The question, however, is how much of the original code remains today. Lots of software, especially stuff that changes fast, is this way - I'm sure that not much of the Linux 0.01 code remains in the 2.6.0 tree. It's just a matter of replacing things one piece at a time (or completely rewriting things, in some cases).
    • Yes, some lines from Linux 0.01 exist in the 2.6.0 kernel. Many are '}' and are in fact claimed by SCO to be lifted straight from the SysV source code.
    • by Billly Gates ( 198444 ) on Saturday December 20, 2003 @09:02PM (#7776183) Journal
      Actually, Bill Joy was commenting on Mac OS X (his own platform of choice for personal use) that he viewed some of the FreeBSD/Mach source code and saw code he wrote more than 20 years ago still in it! Much of the original code from BSD Unix is still around in the Free/Net/OpenBSD systems. The disk loading, I/O, inet, and POSIX utilities have been updated little. In fact, mostly only the proprietary Unix code has been removed.

      It depends on the development model. BSD for example is very conservative as is most corporate software. Linux is an odd exception which changes radically from kernel to kernel.

      I read an article earlier this year on Slashdot about how traditional Unixes have not updated the BSD 4.2 TCP/IP stack much in their versions, while Linux has. HP-UX and Solaris still use almost the same stack as 10 years ago, with the exception of adding IPsec and IPv6 support.

      If it ain't broke, why fix it? In corporate America, which is controlled by bean counters and ridiculous deadlines, much old code remains. Especially at proprietary software companies, which must meet shareholder expectations of regular releases.

      People today claim Linux is not as reliable as Solaris or FreeBSD. A few years ago it was, when 2.0 and 2.2 ruled the kernel scene. 2.4 had radical and controversial changes to the VM and I/O. 2.6 is supposed to be stable again. So it varies with the software.

  • EMACS (Score:3, Flamebait)

    by smittyoneeach ( 243267 ) on Saturday December 20, 2003 @05:39PM (#7774993) Homepage Journal
    has got to loom large in the discussion.
    If only the text editor in that operating system had friendlier keyboard shortcuts...
  • 1970s only? (Score:4, Interesting)

    by Mainframes ROCK! ( 644130 ) <.watfiv. .at.> on Saturday December 20, 2003 @05:41PM (#7775002) Homepage
    IBM mainframe operating systems have been around since the mid 1960s and are still being supported and updated. VM, first called CP/67, then VM/370, then VM/SP, then VM/ESA and now z/VM, has been around since 1967. Some, such as DOS (1965?), are even older, but I mentioned VM since it's my favourite :-)
  • unix% isotd

    33 year old program "Oldest Supported Software?" ---Slashdot front page story

    isotd = idiotic statement of the day

  • SyncSort and Ditto (Score:5, Informative)

    by js7a ( 579872 ) * <james AT bovik DOT org> on Saturday December 20, 2003 @05:49PM (#7775059) Homepage Journal
    The last time Slashdot covered this [], the best guess turned out to be SyncSort [], which has been for sale since 1969 [], and is still supported.

    However, I believe that a version of IBM's DITTO [] was available on System/360 in 1965. I've not been able to confirm this, though.

  • IBM VM/CMS (Score:4, Interesting)

    by sglines ( 543315 ) on Saturday December 20, 2003 @06:09PM (#7775174) Homepage Journal
    This wonderful old operating system was introduced in 1967. It's still around and still supported as far as I know. It grew out of Project MAC, as did Unix and Multics.
  • by 192939495969798999 ( 58312 ) <info&devinmoore,com> on Saturday December 20, 2003 @06:09PM (#7775175) Homepage Journal
    In terms of the oldest supported computer, the abacus far outstrips any modern programmed computer's support, having existed for thousands of years. Is there any chance that a piece of software could be written that would be supported in functionality and design for that long? Any thoughts on that?
  • FORTRAN? (Score:5, Interesting)

    by billsf ( 34378 ) <> on Saturday December 20, 2003 @06:17PM (#7775215) Homepage Journal
    FORTRAN is perhaps not the oldest supported software of all but it pre-dates Unix, McIDAS and ARTS and is still widely used and supported today. Some common utilities, widely used today, were adapted and used in the first Unix. Research for all this dates back to the early 60's.

    Is there anything from the 50's or earlier that is still supported today? Surely someone at IBM must know...
  • Well.. (Score:5, Interesting)

    by nate nice ( 672391 ) on Saturday December 20, 2003 @06:17PM (#7775220) Journal
    Lots of software is still in use every day. The algorithms, anyway: Quicksort dates from the early 60's, along with the numerous algorithms Dijkstra gave us. Many, many more as well.
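Quicksort itself is a nice example of 60's-era software still in daily use. A textbook sketch of it (Hoare's 1961 algorithm, shown here with the simple last-element Lomuto-style partition for clarity):

```c
/* Textbook in-place quicksort (Hoare, 1961), shown here with the
 * simple last-element Lomuto partition for clarity. */
static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

void quicksort(int *v, int lo, int hi) {
    if (lo >= hi) return;
    int pivot = v[hi];
    int i = lo;                       /* next slot for a small element */
    for (int j = lo; j < hi; j++)
        if (v[j] < pivot)
            swap(&v[i++], &v[j]);
    swap(&v[i], &v[hi]);              /* pivot into its final position */
    quicksort(v, lo, i - 1);
    quicksort(v, i + 1, hi);
}
```

Forty years on, essentially this routine (with a smarter pivot choice) still sits inside most standard libraries.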
  • Insurance Companies (Score:5, Informative)

    by lscoughlin ( 71054 ) on Saturday December 20, 2003 @06:51PM (#7775425) Homepage
    Insurance companies are still running mainframe systems to track your annuities and policies, systems that have been under active development and support since the 1960's.

    Systems like lifecomm, all written in assembly, are still worked on.

  • I remember first using one ca. 1986. Up until that point most weather data was still distributed via fax. Having the ability to loop satellite imagery and make custom graphics with real-time data was somewhat revolutionary.
  • A company I recently did some work for has a Sperry Univac on-line as a backup for a "critical government system". They search computer junk-yards for spare parts. They say the problem is that the software will take forever to port and qualify on a new system. I guess I never understood why they didn't start that process, say 10 years ago.

    • Money and Risk (Score:3, Insightful)

      by Detritus ( 11846 )
      It can be very expensive and risky to replace the old systems. Many organizations just don't have the money to replace their older systems. You may not have all of the original/current source code. The written requirements, if any are still available, are hopelessly out-of-date and useless. As soon as a proposal for a replacement is floated, gnomes will come out of the woodwork with endless lists of new features and buzzwords that they want in the new system. The Microsoft zombies will insist that you only
  • by karlandtanya ( 601084 ) on Saturday December 20, 2003 @08:04PM (#7775900)
    I hired on with a company 11 years ago.

    They asked me to rewrite a piece of production software, so I did. Done, let's move on.

    A year later, they asked me to rework it.

    Done, let's move on.

    3 years later, I was still working on the software--adding functions, changing screens, etc.

    I left the company, and hired on with a consulting firm.

    2 years later, they call me back to help with a validation of...you guessed it!

    5 Years after that, they called my boss. We're halfway through a rewrite, and we need help. Hello...who do we have that can do this? Yup.

    After I die, these fuckers are going to hold a seance and ask my ghost to rewrite this app!

  • by dpbsmith ( 263124 ) on Saturday December 20, 2003 @08:16PM (#7775966) Homepage
    ...DATACRAFT? Wow, did that bring back memories. However, I suddenly realized that more than coincidence was at work, when I went to the McIDAS website and saw the "Dayton Street, Madison" address.

    The University of Wisconsin, circa the late seventies, was a hotbed of Datacraft users. I believe it was Geophysics that pioneered their use with a Datacraft 6024/3. They introduced the cheaper 6024/5 at about the same time Digital came out with, IIRC, the PDP-11/20.

    Departments at UW that needed minicomputers in the $50,000 class started buying Datacrafts right and left. Digital lost a lot of sales selling PDP-11's against Datacrafts. But the price/performance comparison, at that time and place, was really compellingly in favor of Datacraft.

    Datacraft was headquartered in Fort Lauderdale, and I believe a lot of its engineering staff consisted of Cuban emigres. The Datacraft machines were 24 bits versus Digital's 16. I forget how many bits were in the mantissa and exponent, but there was a very usable 24-bit floating-point format. The instruction set was well designed for doing floating point without a dedicated processor (though a dedicated FPU was available). One of the things that sold us on the Datacraft was that without an FPU, the Datacraft's times for floating point add, subtract, multiply, and divide were all less than forty microseconds; the comparable times for Digital were about a millisecond.

    The Datacraft had a hardware square root function.

    The instruction set was the most godawful asymmetrical mess I've ever seen. If you were used to the elegant orthogonality of, say, a PDP-8, a Datacraft was a bit of a shock. (It made even a 6502 look pretty). Most instructions took a 15-bit address, and the natural address space was limited to 32K (of 24-bit words). However, in order to win some bid that required 65K, they had shoehorned in a few instructions that accepted a 16-bit address. This meant that when working in an address space of more than 32K, the linker (and compiler) had to keep track of an incredible number of linkage flavors, and probably about half the bugs reported had to do with things that happened when you crossed the 32K boundary.

    There were three sort-of-index registers, named I, J, and K (if you used the variables I, J, and K in a FORTRAN program they were automatically assigned to the index registers). They were all slightly specialized, though. I don't remember what each of them did, but there were some instructions in which the I register, and only the I register had some special role, and ditto for J and K.

    There was a 3-bit index register field, and most of the instructions that moved data into or out of index registers used the field to specify the register. That meant, of course, that those instructions could NOT themselves perform indexed addressing.

    A very cool feature was an instruction that swapped the contents of a register and memory in a single cycle. The same architectural feature that enabled this also enabled another cool feature: there were instructions that simultaneously set a word to all zeros or all ones and set the condition register to reflect the previous contents of the word. That is, a single instruction could tell you whether a word was zero or nonzero at the same time as it set it to zero.
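
    That swap-and-test behavior has a close modern analogue in C11's atomics; a small sketch of the analogy (mine, not the Datacraft encoding):

    ```c
    #include <stdatomic.h>
    #include <stdio.h>

    int main(void)
    {
        atomic_int word = 7;

        /* atomic_exchange is the modern analogue of the Datacraft's
           register<->memory swap: store a new value and get the old
           contents back in a single operation. */
        int old = atomic_exchange(&word, 0);   /* "zero the word" */

        /* The returned previous contents tell us whether the word was
           nonzero -- the same condition the Datacraft instruction set
           in the condition register while clearing the word. */
        printf("word was %s\n", old != 0 ? "nonzero" : "zero");
        return 0;
    }
    ```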

    Generally speaking--if there was anything general about the architecture, which I doubt--you could, at the binary level, specify more than one index register, which resulted in storage into all of the specified registers if it was a store instruction or loading the OR of the contents of the specified register if it was a load instruction. This resulted in a lot of possible instructions for which there were no assembler mnemonics defined. (And the assembler syntax was IBM-style card-oriented, with a single mnemonic going in a specified set of columns--you couldn't just OR the mnemonics themselves). Some of these instructions were actually useful, and there was always controversy, never quite resolved by Datacraft, as t
  • by James Youngman ( 3732 ) on Saturday December 20, 2003 @09:03PM (#7776188) Homepage
    IBM IMS is over 35 years old (the first version dates from August 14, 1968, the same day Halle Berry was born). It's still supported.
  • Great job, guys.

    But with all the newfound publicity, expect McDonald's lawyers to be knocking on your door regarding the obvious trademark infringement...

  • I've been working with high-energy physicists for the last year or so. They have some great (if apocryphal) stories. Apparently there is a great deal of old Fortran code that has been untouched since the '70s, but is still linked into modern programs. The reason? The code was written by someone in the '70s. That person then went on to win a Nobel prize for the work. Then they died. No one feels competent to replace it.

    I don't know if it's true, but I believe that the physicists I know believe it.

  • The company I work at was established in 1972, I think, and we still have LOTS of legacy VAX BASIC code. It really sucks to work on this stuff, but it's a job.
  • by douglasgodfrey ( 698581 ) on Saturday December 20, 2003 @09:44PM (#7776405)
    IBM's first OS for the 16K IBM 360 Model 30 in 1964 was DOS. DOS evolved to DOS/VS release 34 by 1968. DOS/VS release 34 is still supported and runs on several hundred systems. The last bug in DOS/VS r34 was fixed in 1980.
  • sol.exe (Score:4, Funny)

    by ejaw5 ( 570071 ) on Saturday December 20, 2003 @10:28PM (#7776613)
    does MS Solitaire count? It has survived (pretty much unchanged) through Win3.x, 95, 98, ME, 2000, and XP.
  • MERLIN (Score:3, Informative)

    by pherris ( 314792 ) on Saturday December 20, 2003 @11:00PM (#7776742) Homepage Journal
    MERLIN, the DEA's intelligence database, has been around for a long time (back to the '70s that I know of, and maybe the '60s). I don't know if it's running under all new code, but it's always been a beast.
  • Oil company software (Score:3, Interesting)

    by macdaddy ( 38372 ) on Sunday December 21, 2003 @01:35AM (#7777339) Homepage Journal
    I worked for Haddock Computer Center (seems to be down) in Wichita, KS for a short time when I was in college. The owner, Richard Haddock, was a programmer from the 60s or 70s (I'm not sure when exactly). He wrote some sort of accounting software or management software for oil companies. It ran on some ancient computer system that I can't recall off the top of my head. I remember sitting in the back room one day eating my lunch. I was kicked back in a chair with my feet up on some big box. My Burger King lunch (I remember that detail) was laid out on top of that box. Richard's father came into the back and we got to talking while I ate. He told me how Richard got his start. Then he said something about that computer right there being what Richard used. He was referring to the really big box I had my feet and my lunch on. I hadn't noticed that it was a computer system. I thought it was a generator or ancient telco pedestal. It was some honkin' big computer system that Richard kept on hand just in case one of the companies that ran his ancient software went down and needed a quick recovery. I.e., he was still providing some level of support for his oil field software. I thought that was neat.
  • by blair1q ( 305137 ) on Sunday December 21, 2003 @02:32AM (#7777555) Journal
    Just try building Perl 2.8.2 on cygwin.

    Frankly, I don't see how it builds anywhere, but some machines must be ignoring the unterminated string buried somewhere in the B extension makefile...if that's really where it is. That's what Make says, anyway...
  • Geez! (Score:3, Informative)

    by slickwillie ( 34689 ) on Sunday December 21, 2003 @03:08AM (#7777659)
    I worked on McIDAS around 1980-1982. It ran on a Harris minicomputer and the OS was named VULCAN (IIRC). I had it downloading weather and satellite maps to an Apple ][. The Mount St. Helens eruption was visible on it.
