Dan Bricklin on Software That Lasts 200 Years

Lansdowne writes "Dan Bricklin, author of VisiCalc, has written a great new essay identifying the need for software that can last for decades or even centuries without replacement. Neither prepackaged nor custom-written software is fully able to meet this need, and he identifies how attributes of open source might help to produce long-lasting 'Societal Infrastructure Software'."
  • by dosun88888 ( 265953 ) on Thursday July 15, 2004 @06:38AM (#9705569)
    I think the subject line says it all. You can't worry about your software working for that long until your hardware can last that long.

    ~D
    • by Keruo ( 771880 ) on Thursday July 15, 2004 @06:43AM (#9705590)
      We have the hardware: paper and pen. The only problem is that the software, the human, generally dies of old age at around 70-100 years. I haven't yet seen custom-written software in this field, but some re-packaged versions with silicone enhancements did catch my eye.
    • by Deag ( 250823 ) on Thursday July 15, 2004 @06:52AM (#9705616)
      Well Dan Bricklin does point out that software of today can run on different hardware and having software tied to specific hardware is a bad idea.

      He says that the system should stay fundamentally the same while components can be replaced and upgraded, rather than having everything replaced completely every five years.

      He is not just talking about one specific program that doesn't change, but rather about open standards and techniques that mean data stored today will still be accessible in 200 years' time.

      • by julesh ( 229690 ) on Thursday July 15, 2004 @07:14AM (#9705711)
        Well Dan Bricklin does point out that software of today can run on different hardware and having software tied to specific hardware is a bad idea

        Software of today can run on a variety of different hardware, but there is a degree of similarity between the different types of hardware that probably won't exist between today's computers and those available a hundred years from today, much less two.

        He is not just talking about one specific program that doesn't change, but rather open standards and techniques that mean data that is stored today, will be accessible in 200 years time.

        That, on the other hand, I can agree with. Anyone storing information in a format that isn't publicly documented really ought to consider whether they'll still need it in 30 years' time, and start migrating it to an open format now if they will. However, there are very few formats that are completely undocumented. I believe the most commonly used might be Microsoft Access databases. I'm not sure what documentation exists on the formats of various other commercial database systems; I believe Oracle's formats are well documented (?). What about MSSQL? Informix?

        Most accounts packages have documentation available on their database formats, I believe. Certainly Sage and Pegasus provide such documentation. What about Great Plains, etc.?
        • by Simon Brooke ( 45012 ) * <stillyet@googlemail.com> on Thursday July 15, 2004 @08:21AM (#9706020) Homepage Journal
          Software of today can run on a variety of different hardware, but there is a degree of similarity between the different types of hardware that probably won't exist between todays computers and those available a hundred years from today, much less two.

          When I was a young programmer, I was shown a water quality analysis program used by an English water authority that some colleagues of mine at ICL were particularly proud of. Not that they'd written it. It was running on an ICL 2900 series mainframe running VME. But the software wasn't written for a 2900 series, so it actually ran on an ICL 1900 emulator, running MOPS. But the software didn't run on a 1900 series either, so the emulated 1900 was running an emulator of an older English Electric computer whose designation I've forgotten. But the software wasn't written for the English Electric computer, so the English Electric emulator running on the 1900 emulator running on the 2900 was running an emulator of the world's first commercial computer, LEO, for which the software was actually written.

          When I saw this program in 1985 it was already thirty years old; it was still being used because it was still useful. If it is still in use it will be fifty years old, and (as 2900s are now very rare) is probably running under a further layer of emulation on an x86.

          Any Turing equivalent machine can, in principle, emulate any other Turing equivalent machine. Of course, true Turing equivalence requires unlimited memory, so in practical terms it's only possible to emulate machines which have less memory than the machine that's doing the emulating. But it's reasonable to suppose that the machines of 100 years in the future will have at least as much horsepower and at least as much memory as the machines of today. So they will be able to emulate the machines of today.

          A program written today may not be able to fully exploit the user interface features of a machine of two hundred years hence, any more than a BBC emulator [cloud9.co.uk] can exploit the full graphics resolution of a modern workstation. But what a modern workstation can do is a superset of what the BBC Micro could do, so it can be emulated without compromise.

          In other words, hardware compatibility is a non-issue in making software which will last and which will remain useful.
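
          The core of any such emulator is just a fetch-decode-execute loop. Here is a minimal sketch in C, assuming a made-up toy instruction set rather than any real ICL or LEO architecture:

            #include <stdio.h>
            #include <stdint.h>

            /* Hypothetical "legacy" machine: one accumulator, four opcodes. */
            enum { OP_HALT, OP_LOADI, OP_ADD, OP_PRINT };

            int main(void)
            {
                /* program: acc = 2; acc += 40; print acc; halt */
                const uint8_t program[] = { OP_LOADI, 2, OP_ADD, 40, OP_PRINT, OP_HALT };
                int acc = 0;
                size_t pc = 0;

                for (;;) {                      /* fetch-decode-execute loop */
                    switch (program[pc++]) {
                    case OP_LOADI: acc  = program[pc++]; break;
                    case OP_ADD:   acc += program[pc++]; break;
                    case OP_PRINT: printf("%d\n", acc);  break;
                    case OP_HALT:  return 0;
                    }
                }
            }

          Nest a few of these inside one another and you have, in miniature, the LEO-on-1900-on-2900 tower described above.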

      • [...] mean data that is stored today, will be accessible in 200 years time.

        and a huge part of this is hardware support and, interestingly enough, storage bandwidth. You see, you have to migrate data from obsolete hardware/media to newer hardware/media, but in the near future the amount of data stored on obsolete hardware/media may become too large to transfer to newer hardware/media before it dies/decays/whatever, simply because the throughput of the transfer mechanism is too low.

    • Well, hardware can be worn out, but you can replace it, component by component, as he has suggested.

      The problem with software is that as long as there are proprietary formats (or you don't have the source code), you can forget about what he calls "societal infrastructure software". If the government is thinking about being able to retrieve its data 50 years from now, it had better enforce that an open data format be used in the application, right now, and with clean and precise documentation. That means, no MS Wo
      • Governments can enforce that vendors must provide proper documentation of their software data formats before a deal is struck, especially if the system is going to run national infrastructure, such as the IRS, etc. Especially when the system costs in the hundreds of millions (if not billions), why don't they enforce that? I would be a multibillionaire if I knew the answer.

        Well, for large scale data apps, it is available. Heck, I've taken courses in both Informix and Oracle internal structures - in memory and o
    • by WillWare ( 11935 ) on Thursday July 15, 2004 @09:04AM (#9706330) Homepage Journal
      Lots of software includes or utilizes standardized hardware abstraction layers. Think about the POSIX standard, or the virtual machines for Java or Python or Common Lisp. These abstraction layers mean that large amounts of code are portable (sometimes with some effort, sometimes with none) across any hardware platform that supports the abstraction layer.

      Hardware manufacturers will always have a powerful incentive to support the abstraction layer, because by doing so, they'll instantly pick up a huge set of killer apps for their new CPUs. Standardized abstraction layers therefore provide an economically efficient way to divide the labor of porting software to new platforms.

      Are you thinking that in order to have software that's useful in the long term, it must run continuously on exactly the same piece of hardware? Think about Google (a very useful thing in our society). They must be bringing newer, faster computers on-line all the time. But if they're not total boneheads, they don't need to rewrite all their code to do this.
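
      As a concrete illustration, here is a minimal sketch of code written purely against the POSIX layer (a plain file copy via open/read/write/close); assuming nothing beyond a POSIX-conforming system, it builds unchanged on Linux, the BSDs, Solaris, and so on:

        #include <fcntl.h>
        #include <unistd.h>

        /* Copy argv[1] to argv[2] using only POSIX calls; error handling kept
         * minimal to keep the sketch short. */
        int main(int argc, char *argv[])
        {
            char buf[4096];
            ssize_t n;

            if (argc != 3)
                return 1;

            int in  = open(argv[1], O_RDONLY);
            int out = open(argv[2], O_WRONLY | O_CREAT | O_TRUNC, 0644);
            if (in < 0 || out < 0)
                return 1;

            while ((n = read(in, buf, sizeof buf)) > 0)
                write(out, buf, (size_t)n);

            close(in);
            close(out);
            return 0;
        }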

    • You can't worry about your software working for that long until your hardware can last that long.

      I'm using software that originally ran on an 8086, then a 286, then a 486, then two or three generations of Pentiums. The whole point is that hardware dies, software doesn't. Not to mention the bunch of Unix-derived software that I run as DOS or Linux apps, essentially unchanged for almost 30 years, though the hardware on my desk is more powerful than the whole server room at the university I learnt it on, and

    • As has been pointed out elsewhere in this thread, one way around this problem is to use hardware which can be emulated on other hardware. The problem with this approach is that if you want to assume that a perfect emulator of your hardware will always be available, you need to use highly standardized hardware. With the commoditized hardware of today it's a stretch to imagine perfect emulation for any given component besides maybe the CPU. In the personal computing world, the closest to perfect emulation of
    • You can't worry about your software working for that long until your hardware can last that long.

      Oh, nonsense. Consider the well-known "Hello, world." program in the K&R C "bible". It's been around 30 years or so, and the hardware they were working on now only exists in a few museums. But that program is still in routine use on millions of computers.

      Note also that K&R included not only the code for the program, but the commands to compile and execute it. They correctly pointed out that this
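
      For reference, the program and the build commands are still only a handful of lines today; this is a sketch in the modern C dialect rather than the 1978 one:

        #include <stdio.h>

        int main(void)
        {
            printf("hello, world\n");
            return 0;
        }

        /* compile and run, much as K&R described:
         *   cc hello.c -o hello && ./hello
         */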
    • You can't worry about your software working for that long until your hardware can last that long.

      Bzzzzzzt! I call Bullshit...

      The C programming language has been with us for 30 years. Most of the non-machine-specific code from 30 years ago would work today, with almost no modification, on today's GHz PCs.

      I develop large, powerful applications in PHP that will work well on a Linux, Solaris, Irix, Windows, or AIX system, with virtually no porting whatsoever. Furthermore, the software itself is the executable
  • Any precedents set for this to show 200 years is a good target to aim for?

    Must have had one hell of a beta test phase.

  • Open Source (Score:5, Funny)

    by Anonymous Coward on Thursday July 15, 2004 @06:40AM (#9705575)
    It seems like most open source has been less than 1.0 for at least 200 years. But it's all for a quality product, right? Oh, you found a bug? Well that's because it's pre-1.0!
  • by Biotech9 ( 704202 ) on Thursday July 15, 2004 @06:44AM (#9705593) Homepage
    No company in the world will ever try to develop software that never needs (costly) upgrades and add-ons. Take a look at Microsoft's behaviour with MS Office: it's a complete cash cow because they can update it when they want and force people into upgrading with changed document types. Even the open source community will be too interested in improving and adding on to their pet projects to consider leaving them alone.

    this article seems pretty flawed.

    We need to start thinking about software in a way more like how we think about building bridges, dams, and sewers.

    The fundamental difference being that bridges cost more to alter than software does. And the capabilities of hardware allow more freedom in software, for which there is no parallel in bridges.

    hmmm, just my 2 euro-cents.
    • by tessonec ( 620168 ) on Thursday July 15, 2004 @06:54AM (#9705629) Homepage
      I think you do not completely understand the point of the article...

      The point is that, given the fact that there is a vast amount of information in computer files, you must be aware that if you can't retrieve that information in the future, it will be lost.

      You are right, most software gets updated. But it is the interface that understands the format that must last for much longer than a couple of software update cycles.

      This is exactly another reason to prefer open standards to closed-source formats, as MS in 100 years (if it still exists) will have forgotten what .doc from Windows 2000 looked like.
    • Take a look at Microsoft's behaviour with MS Office, it's a complete cash cow because they can update it when they want and force people into upgrading with changed document types.

      Maybe before, but the document format hasn't changed since Office 2000.

      You can send me a document written in Word 2003 and I can happily open it in Word 2000.

      • by pheede ( 37918 ) on Thursday July 15, 2004 @07:07AM (#9705684)
        In fact, the Word document format hasn't changed since Word 97. So any Word version from 1997 or onwards will do the job.

        And changing the settings to save in RTF format by default (enabling Word versions from Word 6.0 through 2003, as well as basically all other word processors, to read the documents) isn't all that hard. Not even in a corporate setting.

        Microsoft encourages upgrading of Office installations through a lot of questionable means, but the Word document format isn't one of them.
        • by peragrin ( 659227 ) on Thursday July 15, 2004 @08:06AM (#9705917)
          Let's try it. Let's take an Office XP doc saved in the Office XP format and open it up in Office 97.

          What, it doesn't open properly? Geez, that's a shocker.

          Now, you can save an Office XP doc in Office 97 format, and Office 97 docs can be opened in Office XP, but an Office XP doc doesn't open in Office 97.

          MS does this because when one business upgrades, it forces the partners to upgrade as well. Why? Because most people have a hard time understanding what the different formats are.
        • In fact, the Word document format hasn't changed since Word 97. So any Word version from 1997 or onwards will do the job.

          And changing the settings to saving in RTF format by default (enabling Word versions from Word 6.0 through 2003, as well as basically all other word processors, to read the documents) isn't all that hard. Not even in a corporate setting.


          The Word format is heavily platform-dependent. If you embed objects into Word documents, or use scripting, it's pretty much a guarantee it will not wor
    • I think that the fundamental reason the construction industry generally succeeds in producing bridges that don't fall down is the existence of building and planning regulations that require products to be of a good standard before they are sold. For example, in the UK, if a bridge falls down and someone's killed, it's corporate manslaughter and the MD is going to jail. Perhaps what we need is more regulation for the software industry ;-) For example, instead of customers paying for software support maybe it s
    • Take a look at Microsoft's behaviour with MS Office, it's a complete cash cow because they can update it when they want and force people into upgrading with changed document types.

      Hmm? I still install Office97 on any brand new computer I get and it works just fine. Why would I need to upgrade it? All the functionality I need to do reports is there and it's 7 year old software.

    • You've observed the environment and drawn the wrong conclusions. Yes, software companies release add-ons. My software company is just releasing version 1.4 right now. Is that because we all sat around in a boardroom, smoking cigars and laughing about how we were going to screw our customers out of money?

      Actually, it comes down to two reasons. First of all, we never have enough time to deliver all of the features we would like. Software release schedules are driven by sales cycles, so when the cycle rolls ar
  • by MavEtJu ( 241979 ) <[gro.ujtevam] [ta] [todhsals]> on Thursday July 15, 2004 @06:46AM (#9705600) Homepage
    I think the trick is to use simpler hardware, which is easy to replace.

    Take today's computer: a motherboard with one big black chip, the CPU on it, the network card also one chip on it, and a video card that is all but impossible to figure out. Due to the integrated design, you can't fix it if it is broken. And in five years you won't be able to replace it one-for-one.

    On older hardware (8-bitters), you were able to repair it yourself because you knew how it worked and you knew you were capable of replacing a failing chip. Even if you didn't have exactly the same chip, you could use one from a newer family which did the same thing but could switch much, much faster.

  • No (Score:5, Insightful)

    by Mr_Silver ( 213637 ) on Thursday July 15, 2004 @06:49AM (#9705609)
    Neither prepackaged nor custom-written software is fully able to meet the need

    I disagree. It's got nothing to do with the software but the data.

    If the data format is clearly documented, then it doesn't matter whether the application that generated it is open or closed.

    True, you could argue that since the code is open the data format is also documented, but personally I'd find it easier if it was written in a properly structured document.

    Otherwise you'd have to resort to learning and then ploughing through an application written in some 200-year-old programming language (by someone who possibly hacked it up with a hangover at the time) to try and understand what they were doing and why.

  • by BladeMelbourne ( 518866 ) on Thursday July 15, 2004 @06:50AM (#9705611)
    I wonder if there will still be holes/bugs in Microsoft Internet Explorer 6 SP1 in 2204?

    Now excuse me while I get back to writing my "Hello World" application that will last two centuries :-)
    • Now excuse me while I get back to writing my "Hello World" application that will last two centuries :-)

      Unfortunately, in 200 years the language will have evolved, and the words and phrases we use today will have completely different meanings. People of the future will understand "Hello World" to mean "All Your Base Are Belong To Us", and believing your program to be a dire threat, they will fire up their time machine and send back Arnold Schwarzenegger's great-great-great-great-great-great-grandson to el

  • 2 letters (Score:5, Funny)

    by News for nerds ( 448130 ) on Thursday July 15, 2004 @06:51AM (#9705612) Homepage
    vi
    • Re:2 letters (Score:3, Insightful)

      I think TeX/LaTeX has the capability. We have some documents (20 years old), and they compile fine and look perfect. If anything, improvements have made it easier to "enhance" a document without messing with anything else.

      S
  • It's a tool... (Score:2, Insightful)

    by tgv ( 254536 )
    For Christ's sake, computers are mostly used as tools. And who keeps their old tools around for so long? Only neanderthals: [paleodirect.com]...
    • there is an important difference between tools and infrastructure. true, much software is used as tools--for accomplishing discrete tasks that evolve as societies and technology evolves. but much software--databases, routers, control devices for physical infrastructure, etc--is used more as infrastructure; that is, as a resource expected to be reliable and predictable by many users and necessary for accomplishing other tasks that ride on top of it, including employing new tools.

      infrastructure, because of
    • Re:It's a tool... (Score:4, Insightful)

      by kfg ( 145172 ) on Thursday July 15, 2004 @08:29AM (#9706089)
      The violin dates from the 1600s. While it has undergone a certain amount of "support" since then it is essentially the same tool as designed by Amati. Some consider it one of the finest tools ever devised by man. Many of the older ones are considered superior to the newer ones.

      I have an automotive body hammer that is nearly identical to a 1500s war hammer, although the upgrade to a fiberglass handle is a nice touch for reducing shock. The basic design goes back some thousands of years with little more than some minor updates in materials.

      My 100-year-old desk holds up my computer just fine. It is as technologically advanced as what I can get new at Office Max, except I expect it can last another few hundred years due to the quality of its construction.

      I'm wearing woven fabric clothing, a technology that reaches back at least 10,000 years. There have been a number of attempts to replace this technology over the past 40 years or so. They've all proven inferior except for certain special applications. Hell, even indigo dye for work clothes has proven to be a durable technology for thousands of years that you can still purchase in nearly any clothing store and the "jeans" that are the most common example of the type are about 400 years old (Jacob Davis added rivets to the existing design. He didn't invent the jeans themselves).

      I've been watching a new office building go up in town. It's post and beam, about as old a house building technology as you can get, although the building is considered "modern."

      I also have a couple of fires that burn continuously in my home. It proves rather useful, although the technology is a bit long in the tooth.

      I fully expect that ASCII will be just as viable a way to represent the Latinate alphabet 200 years from now as it was a few decades ago, and the Latinate alphabet is another example of a multithousand year old technology.

      Innovation for innovation's sake often "progresses" to the rear.

      Build it right and build it good. Don't be afraid to change it when there's damned fine reason to on solid theoretical and practical grounds, but otherwise leave it the hell alone if it works.

      That isn't being a Luddite. That's being an engineer.

      KFG
  • Already there? (Score:3, Interesting)

    by rudy_wayne ( 414635 ) on Thursday July 15, 2004 @06:53AM (#9705623)
    Remember Y2K? Did anyone notice that the world didn't come crashing down on Jan. 1, 2000?

    It seems that all those old mainframes running programs from the 60's weren't in such bad shape after all.

    This is an over-simplification of course -- people did have to do some work to make sure there weren't any "Y2K" problems.
  • Not Possible (Score:5, Insightful)

    by deutschemonte ( 764566 ) <lane.montgomery @ g mail.com> on Thursday July 15, 2004 @06:53AM (#9705627) Homepage
    Constant standards are what is needed to make software last that long.

    Language standards don't even last 200 years, how do we expect something as new as software standards to be more uniform than language standards? Language has been around for thousands of years and we still can't agree on that.
    • Re:Not Possible (Score:3, Insightful)

      by _|()|\| ( 159991 )
      Language standards don't even last 200 years, how do we expect something as new as software standards to be more uniform than language standards?

      How do you know? Lisp has been around for a while, and it's not dead, yet. Some Lispers are working on a language called Arc [paulgraham.com], which they hope will last a hundred years. On another front, perhaps Parrot or .NET will provide a stable base that will allow languages to evolve, while remaining compatible.

      That said, I don't think it's necessary for a long-lived softw

    • Language standards (Score:3, Informative)

      by gr8_phk ( 621180 )
      "Language standards don't even last 200 years"

      Lisp is about 50.

  • think back 200 years (Score:3, Interesting)

    by Keruo ( 771880 ) on Thursday July 15, 2004 @06:55AM (#9705635)
    We've had software and computers for ~30 years now
    Going back 200 years, we had only just begun proper industrialization and everything was pretty much running on steam.
    I think it's flawed to try to design software that lasts for decades or centuries.
    The technology is constantly evolving, and as the hardware changes, so does the software.
    If hardware development continues as it does, in 2200 we, or the people then, might be working with hardware running at terahertz speeds with 4096-bit architectures.
    That's probably an underestimate, since the growth curves tend to be exponential.
    I don't really think they would still need the software someone wrote for windows 95 200 years ago.
  • by amitofu ( 705703 ) on Thursday July 15, 2004 @06:57AM (#9705641) Homepage
    Standards are what must be designed to last for decades, not the software that conforms to the standards. Things like XML, RDF and POSIX will be supported for decades, if not centuries. Who cares if it is Linux running your POSIX apps, or FreeBSD, or HURD? I don't think it matters if software uses libxml2 to parse your XML data, or some yet-unconceived API--as long as it understands XML!

    If it is stability and reliable infrastructure that is desired, it is standards that must remain constant and software that must evolve to make the standards work with new technology.
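
    For instance, here is a minimal sketch of reading an XML file through libxml2 (assuming libxml2 is installed; build flags come from xml2-config). The point is that nothing about the data depends on this particular library, so any future conforming parser could take over without touching the documents:

      #include <stdio.h>
      #include <libxml/parser.h>
      #include <libxml/tree.h>

      int main(int argc, char *argv[])
      {
          if (argc != 2)
              return 1;

          xmlDocPtr doc = xmlReadFile(argv[1], NULL, 0);   /* parse the document */
          if (doc == NULL) {
              fprintf(stderr, "could not parse %s\n", argv[1]);
              return 1;
          }

          xmlNodePtr root = xmlDocGetRootElement(doc);
          printf("root element: %s\n", root ? (const char *)root->name : "(none)");

          xmlFreeDoc(doc);
          xmlCleanupParser();
          return 0;
      }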
  • The world is different now than it was even just a decade or two ago. In more and more cases, there are no paper records.

    The point that the author makes here is really that without electricity we will lose great parts of recent history.

    • by I confirm I'm not a ( 720413 ) on Thursday July 15, 2004 @07:06AM (#9705674) Journal

      The point that the author makes here is really that without electricity we will lose great parts of recent history.

      When I was at secondary school, in Britain during the 1980s, we participated in a UK-wide project to create a modern version of the "Domesday Book", the 11th-century record of people and property.

      The project we worked on was recorded onto a (state-of-the-art) laserdisc so it would "last through the generations".

      Last year I read an article saying that dedicated enthusiasts were desperately trying to assemble a working laserdisc system, in order to archive all the data collected just 20 years earlier.

      Moral: it's not just electricity we need to worry about - media, and the equipment necessary to access specific media, are vital too.

  • by Jotham ( 89116 ) on Thursday July 15, 2004 @06:59AM (#9705651)
    I disagree with the common comparison of Software to Civil Engineering and Standards Bodies.

    Data Structures would be a better analogy, which Standards Bodies have done a really good job declaring. So in 200 years time you'll still be able to read the DVD data format (assuming the media is still good), even though the software that plays it will likely be different.

    Software is more like mechanical engineering, where things do break and improvements keep being found. You wouldn't, for example, use a 1960s car engine in a car today, even though the basic principle is the same. No one asks why they didn't get it right 40 years ago and aren't still using the same design.

    Unfortunately, what would often be considered an early prototype in engineering, is often released as v1.0 -- the cause of which is a long post all unto itself.
  • Lasting 200 years (Score:3, Interesting)

    by banana fiend ( 611664 ) on Thursday July 15, 2004 @07:00AM (#9705656)
    Societal infrastructure is the key part here.

    How many democracies are older than 200 years? How many governmental structures have survived 200 years? Bridges may last that long, but 200 years ago, Ireland was a very different place. America was a very different place, England was a very different place (see Ireland and America for why ;) ) as a matter of fact, EVERYWHERE was very very different

    200 years ago, the Americans loved the French for helping them in the revolution, the English hated the Americans as barbarians and the Irish as "Paddies", and the Irish hated the English. The English hated the French... Come to think of it - only some things change.

    Back to the point - software, or those parts of it that do qualify as societal infrastructure, will have to change simply to keep up with the rate of societal change, and anything that lasts for 200 years is a very fundamental tool indeed.

  • See also (Score:4, Informative)

    by CGP314 ( 672613 ) <CGP@ColinGregor y P a lmer.net> on Thursday July 15, 2004 @07:02AM (#9705660) Homepage
    See also The Long Now Foundation. [longnow.org]

    I read their book in college and, though it is a bit pie-in-the-sky, I thought it raised some interesting ideas. One of their projects was to build a clock that could last ten thousand years. When I moved to London [colingregorypalmer.net] one of the first things I did was go to see the clock in the National Science Museum. There it was, in all its broken, non-time-telling glory. About a month ago I checked up on it again. Status: still not fixed : \
    • Interesting article. I don't think he's necessarily right in all aspects of that, but he has some good ideas.

      It's clear that he was approaching the question with the LISP-user's mindset: simpler is better. There is such a thing as too simple, in my opinion.
  • by Janek Kozicki ( 722688 ) on Thursday July 15, 2004 @07:05AM (#9705671) Journal
    hey, all this babbling about long-term and short-term reminds me of xterm. Soon xterm will be 200 years old. Or at least sooner than almost anything else. (except for getty ;)
  • Too young (Score:2, Insightful)

    The problem with comparing computer practices with civil engineering practices, is the age of the two industries.

    Software is such a young industry that best practices, standards etc. have yet to be settled upon and thus will be hard to implement. Most engineering practices have come about after centuries of development, I somehow feel software development will have to mature for a while before we can see similar licences and standards bodies.
  • Legacy COBOL (Score:3, Interesting)

    by FJ ( 18034 ) on Thursday July 15, 2004 @07:08AM (#9705686)
    There are already legacy COBOL programs that are key pieces of many businesses. Some of those are almost 30 years old. Not really exciting code, but still important to many businesses.
  • Those Duke Nukem guys should have this problem pegged by now...
  • Not possible (Score:2, Interesting)

    by kcbrown ( 7426 )
    Software is technology as much as it is art, and as such is subject to the same dependencies that other technologies are subject to.

    The nature of technology is to evolve over time. Only the most basic tools we have haven't changed significantly over time: things like the knife, the hammer, etc. Even the screwdriver has seen significant development in the 20th century (Torx screws, for instance).

    Only those things for which the underlying rules don't change can remain constant over time. Software i

  • Just find me a customer that wants to pay for "robustness, testing, maintainability, ease of replacement, security, and verifiability" and I'll deliver.
  • by jellomizer ( 103300 ) * on Thursday July 15, 2004 @07:16AM (#9705720)
    Sure, it is possible to write a program that is platform-independent and could possibly run for 200 years. But the problem is this: how many organizations can last 200 years without changing their policies, or without society changing? Let's compare now with 200 years ago, 1804. How many companies have lasted since 1804? Not too many. And all of them have changed the way they do business since then. How many companies 200 years ago would have had enough foresight to allow for policies for IT workers? Maybe one, and whoever suggested it was swiftly locked away for his crazy talk. Also, a lot of today's terminology will go away in 200 years. I predict the term "race" will be an outdated word confined to old literature and newspapers, given the steady decline in racial prejudice and the rise in inter-racial marriage. It would be like 200 years ago, when a businessman would ask your religion in order to decide whether to do business with you; now there would be problems even if he asked it as just a personal question. Or say we get visited by space aliens: Sex: M F X A I C. Who knows what new and unheard-of categories will be added, or perhaps a method of doing things is drastically changed, or even what the company does changes. Heck, the company I work for started out repairing mainframes; now we do mostly IT consulting, and that is in 10 years. Imagine 200.
    So to make a program this customizable you need to make it a programming language, with everything you need to add, delete, change and alter over time. But even programming languages change: think of Fortran. 30 years ago it was the most popular language out there, and now it is tossed aside for newer languages; even with Fortran compilers for Linux, most people will rewrite their Fortran code in a more modern language rather than just port it, to take advantage of new features such as GUIs, Internet connectivity, color printing, web access. Things that seemed useless or impossible 30 years ago are now becoming important. Sure, it is possible to make a program run for 200 years. But is it possible to keep it useful for 200 years? And besides, all the extra design time to make a program that can run for 200 years will cost a lot of money and time. Are the users of the application willing to pay $1,000,000 for a Java program that crunches their numbers? Or will they pay $50,000 for a program that will last them 10 years, and will be a lot less bloated and simpler to use?
  • by Grab ( 126025 ) on Thursday July 15, 2004 @07:17AM (#9705723) Homepage
    I love the way that everyone presents written records as a good example of a "perpetual" medium which surpasses digital.

    You may note that the author says "you can read 100-year-old newspapers *on* *microfiche*". This point practically jumps up and down to be noticed - even in the world of printing, paper copies are not seen as suitable for long-term storage, due to difficulties of preservation and physical bulk. So these paper copies are transferred to some other medium for long-term storage. This medium relies on readers existing - if all companies making microfiche readers went out of business (which probably won't be too many years ahead) then the microfiches will be unreadable. And the microfiches themselves are fragile - a scratch in the wrong place will make it difficult to read, and it's on plastic which will degrade over time.

    Why should digital be any different? If you want ultra-long-term storage of digital data, use punch holes in gold sheets. Otherwise you use a storage medium which gives you a reasonable storage size and reasonable data security.

    On reading the data back, suppose microfiche readers went obsolete and you couldn't buy them. The method of reading the data is still known and recorded, and can be reconstructed by someone needing to get the data back. Similarly, the most common bulk storage methods today are the CD-R and the DVD+/-R (tape backups are practically obsolete). Now the standard for data storage on CD and DVD is, well, *standard*. So if in 200 years time someone wants to read one back, they could build a CD player from first principles.

    Grab.
    • Similarly, the most common bulk storage methods today are the CD-R and the DVD+/-R (tape backups are practically obsolete). Now the standard for data storage on CD and DVD is, well, *standard*. So if in 200 years time someone wants to read one back, they could build a CD player from first principles.

      Neither tape nor the organic dyes on CD-Rs are nearly as long lasting as acid-free paper. I've read 200 year old books, but reading a 200 year old tape or a 200 year old CD-R would require *much* more effort
  • by Alain Williams ( 2972 ) <addw@phcomp.co.uk> on Thursday July 15, 2004 @07:17AM (#9705724) Homepage
    The cost of changing software can be looked at in 2 ways:
    1. Move the software to a new box (but similar) since the old one is worn out or not fast enough or ... In practice this is not too difficult since you can either just copy the binaries or buy new ones or
      ./configure && make
      This I would not call a real change and is not too expensive.
    2. Move the software to a new (or much changed [the current] version of the same one) operating system. This is expensive as there is a lot of recoding that must be done and then work configuring it on the new platform.
    Note that the above is only valid if the software being copied does not really change its functionality, as the customer has not changed the requirement spec.

    One of the nice things about Unix (Linux/...) is that you can still run very old software on new boxes with at most minimal changes - I still use code that I first wrote some 20 years ago.

    There has been much assumption in this discussion that the whole system (hardware, OS, software) has to live unchanged for many years; I think that is missing the point as the true cost of software change is only big in case (2) above.

    Note that some software does need to be regularly changed, e.g. payroll - because governments change the rules every year or two.

  • Ink and Paper (Score:4, Insightful)

    by Quirk ( 36086 ) on Thursday July 15, 2004 @07:22AM (#9705741) Homepage Journal
    What's needed is ink and paper. It's our proven technology for archiving. Microfiche and magnetic storage devices are now more prevalent than ever before, but the book industry, published journals and daily newspapers show no sign of diminishing. And as the article points out, newspapers dating back 200 years are still available in public libraries. Electronic voting protocols are just now hashing out whether a paper trail is prudent. Granted, the article rightly points out the need to develop an archiving industry able to meet the needs of computers replacing paper-based archiving, but as long as hardware development thrives in an open competitive economy, the market will dictate the timing of implementing the necessary hardware. Unless some body like the Library of Congress undertakes financing the necessary hardware and software.
  • paper books (Score:3, Informative)

    by spectrokid ( 660550 ) on Thursday July 15, 2004 @07:33AM (#9705781) Homepage
    In Belgium, notaries still pay law students to copy important documents by hand into thick books, made from acid-free paper and solidly bound together. Stacked in a basement, you can throw a jerrycan of gasoline over them and set fire to it, and you will lose (almost) nothing. Instead of relying on laser discs (see other post), print everything out and count on OCR.
  • requirements for the project must be set by the users

    I've yet to meet a client commissioning a project who knew well how his own business operated, still less was able to understand how any knowledge he did have might be usefully turned into a specification. One of the reasons some software projects have a short life is that the intended users fundamentally misunderstood how their business worked, or that its way of working was likely to change.

  • Maybe they don't have uptimes of 200 years, but they could probably have used the same software written 40 years ago. In the future calculators might be so cheap the 4-function might be an antique, but right now their selling point is cheapness (keep one in the car for MPG calculations!). Why write the same software over and over again for the same chip architecture?

    Software that lasts forever is the simplest kind.

  • by Bazzargh ( 39195 ) on Thursday July 15, 2004 @07:38AM (#9705802)
    The idea of software that lasts 200 years reminded me of a discussion on the radio the other day about the origin of a joke: "I've had this broom 50 years; it's had 5 new heads and 3 new handles." The identity issue played with here dates back at least to Plutarch's Ship of Theseus [washington.edu] - if you keep replacing parts of a thing, until no original parts remain, is it still the same thing?

    The relevance to software is captured by an example: Is Linux still Linux? How much remains of the kernel originally published by Linus? Would you say that Linux has been around for X years (pick X to suit)?

    Most people would agree that it's still Linux. What Linux, the broom, and Theseus' ship have in common is that they could be modified to meet the demands of time, while retaining their identity.

    I've always thought that maintainability is the highest virtue software can strive for, above other quality-oriented goals like being bug-free or performant. If it's buggy, but maintainable, it can be fixed; if it's slow, but maintainable, we can make it faster. I think it could also be argued that software, like Theseus' ship, needs to be maintainable to last 200 years; but the version 200 years from now may not resemble the original in the slightest.

    Just my 2c

    Baz
  • Wait... what do I care. In 200 years I'll be dead.
  • by mvw ( 2916 ) on Thursday July 15, 2004 @07:46AM (#9705831) Journal
    Prof. Knuth [stanford.edu] was unhappy with the degrading typographical quality of the printings of his The Art of Computer Programming [stanford.edu] series. So he took 10 years of his research time to develop the TeX [stanford.edu] computer typesetting system. (A stunt hard to pull off if you are not a professor or rich :-). Now look at how he published the TeX system. There is a set of 5 books [stanford.edu] containing
    • TeX user manual
    • TeX commented source code
    • Metafont user manual
    • Metafont commented source code
    • The Metafont programs to generate the Computer Modern fonts
    What is that good for?

    If you, say in 500 years, get a copy of these 5 volumes (and if they are printed on good paper, there is a good chance they survive), you just need some kind of computing device and the skill to implement some easy Pascal-like programming language. Then you type in the programs and fonts from these books and voila, you have a working TeX system!

    Of course, you need to write a .dvi driver for whatever output device you have and need at that time.

    If you now find some .tex source of one of Knuth's books, be it in print or some crude hyperflux memory cube, you are then able to reproduce that book in the quality Knuth intended it to have!

    Thus TeX was explicitly developed to carry the typographic quality of Knuth's books into the future, without depending on lots of software vendors establishing lots of data format converters (e.g. Word 2325 to Word 2326)!

    Regards,
    Marc

  • Document Format (Score:4, Interesting)

    by os2fan ( 254461 ) on Thursday July 15, 2004 @07:56AM (#9705877) Homepage
    The main problem is not so much with "applications" but with data formats. We talk of the aging DB2 formats used for databases. The reason these hang around for so long is that much of the corporate history lives in them.

    When I design projects, I tend to think more about keeping the data clean, simple and robust over time, rather than the ease with which certain applications can reproduce it.

    For example, when I designed KML, the idea was that it was meant to be a robust format that could be defined outside the context of any word processor, and ultimately aimed at HTML, TeX, etc. At the moment, it is Regina REXX's job to render my markup. Nothing stops this from becoming Perl's or CEnvi's job! It's just a matter of writing a new parsing engine.

    Because it is not something like HTML or TeX or RTF, I have considerable control over the format, and I can map several internal styles onto the same output, e.g. like {emphasis} vs {bold} in HTML. But the markup can stay true to the structure of the document.

    It is the data standard, rather than the program standard, that is important. The latter also matters, since we don't want to have to run dusty decks or old programs.

    But what can you expect from an upgrades-driven market?

  • Long Now Foundation (Score:3, Informative)

    by handy_vandal ( 606174 ) on Thursday July 15, 2004 @08:18AM (#9705997) Homepage Journal

    The Long Now Foundation: 10,000 Year Clock and Library
    "The
    Long Now Foundation [longnow.org] was established in 01996* to develop the Clock and Library projects, as well as to become the seed of a very long term cultural institution. The Long Now Foundation hopes to provide counterpoint to todays 'faster/cheaper' mind set and promote 'slower/better' thinking. We hope to creatively foster responsibility in the framework of the next 10,000 years."

    * The Long Now Foundation uses five digit dates, the extra zero is to solve the deca-millennium bug which will come into effect in about 8,000 years.
    Long Now is the brainchild of Stewart Brand [everything2.com].

    -kgj
  • by squarooticus ( 5092 ) on Thursday July 15, 2004 @08:31AM (#9706093) Homepage
    In A Deepness in the Sky, Vinge posits a collection of software of ancient origins that handles all of the Qeng Ho's automation. This software is never replaced, but simply evolves as better ideas appear. While not technically open source (the Qeng Ho considered this software to be one of their proprietary advantages), it is open to every member of the group. By the time of Pham Nuwen, it had existed in some form or another for literally thousands of years, and over that time had been inspected by thousands of people.
    • In A Deepness in the Sky, Vinge posits a collection of software of ancient origins that handles all of the Qeng Ho's automation.

      If you calculate the offset between the starting date of their oldest calendar and the epoch date of their software, it seems that their software is based on something written in the '70s, or at least that's when its calendar started.

      Things that make you go hmmm...
  • by Trolling4Dollars ( 627073 ) on Thursday July 15, 2004 @08:33AM (#9706108) Journal
    ...I agree heartily, but where the United States is concerned, this will probably never happen. The brand of capitalism that currently drives the U.S. is not friendly to goods and services that are expected to last a long time. In the past, you could buy a TV and the company would guarantee its picture tube for up to ten years. These days you're lucky to find a TV with a five-year guarantee on the picture tube, and in most cases you are forced into buying an extended warranty that you have to renew.

    The way that homes were built in the early part of the 20th century, those homes can be expected to last 100+ years. These days, the cheap 'lick 'em and stick 'em' jobs that people pay hundreds of thousands of dollars for are certain to start falling apart in 10-25 years. I know this because I used to work on some of them. The materials are not meant to last. Many of the homes develop problems with the plumbing, the roof, even the electrical in some cases. A lot of these homes can't stand up to tornadoes as well as the old houses could. There was a neighborhood in a city south of me where all the brand new houses were torn apart by a tornado. These houses were built in the late 1980s and 1990s. Within a few blocks, there were a few old farm houses that were unscathed. My point is that houses these days are made of crap, more expensive, and not built to last. They are essentially disposable after one generation grows up in them (while having to fix problems).

    This is all evidence of the disposable, recurring-payment culture of the U.S. today. I single out the U.S. even though many other countries have the same problem, because they have it to a lesser extent. Those other countries are far more likely to try to build a long-lasting, open source infrastructure. When I was a kid in the 70s, recurring fees were rare other than utilities and mortgage or car payments. Today, you can get nearly anything for a recurring fee. Although the fees themselves are small, they total up to whopping bills if a person needs or wants all those goods and services. Whatever happened to the day when you could buy something and it was yours? 100%. No strings attached. No recurring fees. Just yours. Sure there are a few such things, but keep in mind that recurring fee or not, they are not built to last. Durability is anathema to profit in the new American way. The idea of having long-lasting, open source/free software is going to have a lot of opponents in the American software business solely because there is money to be made.
    • I used to subscribe to this way of thinking - after all "I'll always have a car payment" and
      "As long as I can make the minimum payment, it doesn't matter what my credit card balance is."

      This was foolish youth talking, and 'buy now, pay later' immediate gratification marketing that led me for years.

      I had a wise aunt and uncle who advised me that I could spend 10% more than I earned, or 10% less. The first way I'd sweat payments for the rest of my life, the second way I'd always have money in my pocket. T
    • The brand of capitalism that currently drives the U.S. is not friendly to goods and services that are expected to last a long time

      Sure it is. The problem is that you're just looking at consumer goods that are expected to be cheap, so there's no incentive to make them long-lasting. Quality costs, and most people don't need the added expense if it's equivalent to the cost of a replacement unit in a few years. (and just how can anyone be *forced* into buying a warranty?)

      We have bridges that have been up fo

  • by ausoleil ( 322752 ) on Thursday July 15, 2004 @09:12AM (#9706399) Homepage
    In C:
    #include <stdio.h>

    int main(void)
    {
        for (;;) {
            printf("Hello World!\n");
        }
    }
  • by Detritus ( 11846 ) on Thursday July 15, 2004 @09:16AM (#9706438) Homepage
    There are standards that have been around for decades and have preserved upward compatibility. Well-written FORTRAN programs (no jokes from the peanut gallery) from the 1960s can be compiled and run on modern machines. At one time, there was a strong movement to standardize high-level languages so that an application could be compiled and run on any computer. The idea was that an applications programmer should be able to write usable programs without knowing or caring about the operating system and other machine dependent trivia. That idea seems to have been lost with the advent of microcomputers and the rise of operating system monocultures such as MS Windows and UNIX.

    Another problem is the advent of the GUI. Give a user, or even a programmer, a text-oriented application today and be prepared for much wailing and gnashing of teeth. As someone who started writing programs on mainframes, it doesn't bother me, but I've seen users look at me like I'm some kind of Martian when I give them a command-line program to solve a problem, even though it is supplied with step-by-step documentation on how to use it.

    Where are we today? I don't believe that there has been much progress made in recent years. You can write portable programs in COBOL, FORTRAN and Ada. ISO Pascal and ANSI BASIC seem to be near extinction. Portable programs are theoretically possible in C, but the pitfalls and temptations are many. I'm not a database programmer, but I would hope that there is a portable subset of SQL that would support the portable use of RDBMS. Why should I know or care that the system is using Oracle or SQL Server?
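
    One sketch of the kind of pitfall meant here (illustrative only, not taken from any particular codebase): writing a struct straight to disk bakes the compiler's padding, integer widths and byte order into the file, whereas serializing fixed-width values explicitly stays portable.

      #include <stdio.h>
      #include <stdint.h>

      /* Portable: write a 32-bit value big-endian, one byte at a time,
       * instead of fwrite(&x, sizeof x, 1, fp), which is machine-dependent. */
      static void put_u32_be(uint32_t v, FILE *fp)
      {
          for (int shift = 24; shift >= 0; shift -= 8)
              fputc((int)((v >> shift) & 0xFF), fp);
      }

      int main(void)
      {
          put_u32_be(123456789u, stdout);   /* same four bytes on any platform */
          return 0;
      }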

    • There actually are some compatibility issues before the ANSI standardization of COBOL and FORTRAN, and both languages let one use non-portable extensions. A program has to be written with portability in mind, just like most other compiled languages. No commercial database application uses pure ANSI standard SQL to get real world work done....try to substitute one db for another behind the scenes and things will break. Many applications can use multiple dbms on the back end only because they have configu
  • by ErichTheRed ( 39327 ) on Thursday July 15, 2004 @10:23AM (#9707019)
    The author describes a lot of what's wrong with software development right now. Being on the admin side of things, I've often had to deal with very buggy stuff custom-written by an internal IT department. Lots of key systems at large companies are still running on either the original hardware or upgraded versions of the platform. (There was an article a while back about VAX finally being killed by HP...that should tell you something.) Any improvements are hindered by the original framework (think screen-scraping apps, multiple file format translations, etc.)

    Civil engineers also run into this problem. For instance, take any large city whose highway system was built more than 50 years ago (NYC and Boston come to mind immediately.) No one ever dreamed that everyone would have their own car and stop using the trains/buses/ferries to commute to work. Therefore, overcapacity was never seen as a problem, and the rush hours just get longer every year as everyone tries to stagger their commutes. And since the roads are right next to buildings, in-place upgrades are very rare.

    I think that once the whole IT labor market shakes itself out, software engineering will become another branch of traditional engineering. Just like power plants, dams, airports, etc., we're now dependent on computers, and it's time to put some standards into place. Software needs to be built such that it's portable, easily understood by a similarly-trained engineer, and conducive to improvements. In other words, it needs to be able to outlive the coder.
  • by Diabolical ( 2110 ) on Thursday July 15, 2004 @10:54AM (#9707362) Homepage
    If the data formats are standardized it should not matter what kind of hardware or media is used, the data just migrates from one technical platform to another.

    I firmly believe that without this we will lose a significant part of our history. Current history is known because of durable "storage" like paper, fossils, stone tablets or murals. Those materials all degrade, but they last longer than anything digital.

    If we keep trusting the technology we use right now, we will be very lucky if anyone in the near future is capable of finding anything significant that is representative of our time. All our information is being recorded in digital formats. This includes important things like presidential speeches, signed documents, etc.

    This society is more and more dependent on electronic information. A lot of information isn't available in print anymore, let alone in a truly durable format. If for some reason there is some major catastrophe, any survivors' offspring in the future will know nothing about this age and its mistakes, and will not learn a thing from it.

    We had the opportunity to study history because of the durability of its information. Our information, however, doesn't even last a lifetime.
  • by Master of Transhuman ( 597628 ) on Thursday July 15, 2004 @10:55AM (#9707370) Homepage
    if he thinks ANY software could last a century or more. Or even SHOULD so last.

    HUMANS won't last through this century! How does he expect software to do so?

  • From a programmer. (Score:3, Insightful)

    by Gordon Bennett ( 752106 ) on Thursday July 15, 2004 @12:05PM (#9708105)
    I bow to the report and its author.
    Being a computer programmer, I wouldn't call myself a 'Software Engineer', due to the appalling state of software writing today. There has been for quite some time this in-bred mentality of Versions, that nothing is ever finished, mostly driven by commercial greed - despite the huge advances in computer power, our OSes and their applications are still struggling to keep up with an Amiga, for chrissake.
    Moreover, it can have lethal consequences; for example, radiation treatment, or airplane control. Deaths ensued. "Sorry that your college outing ended in all their deaths, we were running 1.1.3 of the aileron system."
    Sure, even mechanical engineers get it wrong, but their main onus is to make something that will work, not, as in the software case, 'get it out now, fix it later'.
    So, if someone says they are a 'Software Engineer', ask them, what is it they do that merits the 'Engineer' tag - would they build a bridge that lasts? Nope.
  • by AnotherBlackHat ( 265897 ) on Thursday July 15, 2004 @12:58PM (#9708659) Homepage
    We build disposable software, because computers are still disposable.
    Not because they can't be built to last, but because they quickly become obsolete.

    If Moore's law continues to hold for 40 years, computers will be over a million times more powerful than they are now (rough arithmetic at the end of this comment), the cheapest drive you could buy will hold more than a petabyte, and we'll be saying things like "I remember when a thousand bucks for a terabyte of RAM seemed like a good deal, and now I can't even buy a RAM stick that small".

    Once the breakneck pace of expansion stops (or at least slows to a reasonable rate) then we should look at making software that lasts.

    Video compression technology is big business today, but it's probably going to seem like a silly idea in the future.

    We don't need buggy whips that last 200 years.

    -- less is better.
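    A back-of-the-envelope check of that "over a million times" figure, assuming a doubling period of two years (the exact period is an assumption made only for illustration):

        # Rough Moore's-law arithmetic: growth after 40 years of
        # doubling every 2 years (the 2-year period is assumed here
        # purely for illustration).
        YEARS = 40
        DOUBLING_PERIOD_YEARS = 2

        doublings = YEARS / DOUBLING_PERIOD_YEARS   # 20 doublings
        growth = 2 ** doublings                     # ~1,048,576x

        print(f"{doublings:.0f} doublings -> roughly {growth:,.0f}x")
        # 20 doublings -> roughly 1,048,576x

    With an 18-month doubling period the factor comes out closer to 100 million, so "over a million" is, if anything, conservative.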
  • by garyebickford ( 222422 ) <gar37bic@IIIgmail.com minus threevowels> on Thursday July 15, 2004 @01:30PM (#9709028)
    The Clock of the Long Now [longnow.org] is a clock designed by Danny Hillis to last 10,000 years, maintainable using only Bronze Age technology. Ticking will be avoided. The century hand will advance every 100 years, and the cuckoo will come out on the millennium. The first prototype, 9 feet tall, was built in time for "New Year's Eve 01999" (note the extra digit), and the second is under construction now.

    One might argue that the clock incorporates firmware, in the sense that there will be relatively complex algorithms to maintain accuracy by comparing different timing signals, and simpler algorithms to decide when to move the century hand or cuckoo the millennium. It's not a stored-program system, though, so it doesn't meet the criteria that the Babbage engines meet. Nevertheless, this is a good example of hardware realistically designed to operate continuously for 10 millennia. For this project Hillis invented a mechanical serial-bit-adder, a mechanical digital logic element, which evidently lacks the "wearing problem" of a standard clock mechanism. The clock knows about leap years and such (a toy sketch of that sort of calendar rule is at the end of this comment).

    The website has images of the prototypes and the design, but I'm on dialup so I didn't look at them. The Principles Page [longnow.org] discusses some of the problems to be overcome: for example, the power source (right now Hillis is tending toward a temperature-based one) and maintaining accuracy, which may be based on a phase-locked loop using a mechanical oscillator and solar alignment. There are ways to support the foundation, such as buying Brand's book or Eno's tunes.

    IMHO he might want to use three or four other checks as well. An extension of phase locking can work well with multiple nodes in a network, e.g., the multiple nodes in the human heart rhythm controller. Such networks rapidly converge to a common cycle, and this would provide additional reliability. The NTP network time algorithm is based on multiple sources of the same type, but analogous in concept. Just for fun, it'd be great if the clock also included a display of the 64-bit Unix time, in binary!

    This Wired article [wired.com] was written by Danny Hillis about his original idea. The Long Now Website [longnow.org] has other interesting links about long term stuff. Hillis has some interesting friends, like Brian Eno [enoshop.co.uk] who named "The Clock of the Long Now", and Stewart Brand [edge.org]. Other links: Intro to Brand talk [edge.org], The actual talk [edge.org]. Buy the book [gbn.org], or the Eno CD "January 07003" [enoshop.co.uk] to support the foundation.
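    As a toy illustration of the sort of calendar logic such a clock has to embody, here is the standard Gregorian leap-year rule in Python; it is purely illustrative and not taken from the Long Now design documents:

        # Gregorian leap-year rule: every 4th year, except century
        # years, except every 400th year. A 10,000-year clock has to
        # encode exactly this kind of exception-on-exception logic.
        def is_leap(year: int) -> bool:
            return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

        # Century-hand years over the clock's first few centuries:
        for y in (2100, 2200, 2300, 2400):
            print(y, is_leap(y))   # False, False, False, True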
  • by Spazmania ( 174582 ) on Thursday July 15, 2004 @01:32PM (#9709041) Homepage
    Many things in society are long-term

    Not really true.

    Those historical buildings? They've been gutted and rebuilt from the inside out at least once during the past 50 years for the installation of central air conditioning and elevators for the handicapped. And they're the exception to the rule. Few commercial buildings go more than 15 years without major renovation, and few residential buildings make it more than 30.

    Roads? Sure, US Route 1 does still travel approximately the same route, but it's repaved frequently, expanded and changed frequently, and it's been supplanted in its original purpose as the major east-coast north-south route by Interstate 95. And even Route 1 has existed for less than a century. Before automobiles arrived at the beginning of the 20th century, there was no need for anything like it. Who back then could conceive of a multilane asphalt highway that needed to sustain travel of over 500 miles per day? How could yesteryear's engineers possibly plan for it?

    The US Constitution, the foundation of our law, has seen two major overhauls in the past two centuries: first due to the Civil War and again because of the Great Depression. Even where parts of the text remain the same, their meanings have been drastically altered by the courts. Free speech has become freedom of expression. The right to bear arms somehow doesn't exist at all inside Washington DC except for police. The states have gone from being the primary seats of governance to being almost entirely subsidiary to the federal government. We're living under an almost totally different government than the one that saw the dawn of the 19th century.

    Even the Catholic Church publishes a new catechism each year, a book which defines the religion. You'd think during two millennia they'd figure it out once and for all, yet it continues to evolve and change.

    Few things last, either in their original purpose or their original design. They're continuously rebuilt, redesigned and reinvented... Even things like roads, buildings and governments for which our design experience goes back thousands of years.

    Our software experience goes back 40 years, if you can call what we did 40 years ago software by any current definition. Why should we build it to last longer than the roads and buildings, and indeed longer than software in any form has existed?

    I'm sorry, but I'm not smart enough to successfully plan ahead two centuries and neither are you.
  • by wcrowe ( 94389 ) on Thursday July 15, 2004 @04:16PM (#9710775)
    Right now, somewhere, a government agency is putting important data, created in Microsoft Word, into long-term storage. In a few years that data may be unreadable, not because the medium has deteriorated, but because the software that created it will have evolved or will no longer exist.

    This is just one example of how proprietary formats are bad for storing important data long term. The problem was noted years ago, when it was discovered that VA tapes tucked away in underground facilities back in the '60s could no longer be read because the software that created them was gone.

    An ideal data scheme would include information describing the data being stored along with the data itself; XML is one example. This concept needs to be pushed (a tiny illustration is at the end of this comment).

    It is more likely that we can solve the problem of proprietary data storage schemes long before we can implement 200 year software.
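    A minimal sketch of the self-describing idea; the element names and sample content below are invented purely for illustration:

        # A self-describing record: the field names and metadata travel
        # with the data, so a reader centuries from now only needs the
        # publicly documented XML rules to make sense of it.
        import xml.etree.ElementTree as ET

        doc = ET.Element("document", attrib={"type": "speech"})
        ET.SubElement(doc, "date", attrib={"calendar": "gregorian"}).text = "2004-07-15"
        ET.SubElement(doc, "title").text = "Remarks on Infrastructure"
        ET.SubElement(doc, "body").text = "My fellow citizens..."

        print(ET.tostring(doc, encoding="unicode"))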
