Dan Bricklin on Software That Lasts 200 Years

Lansdowne writes "Dan Bricklin, author of VisiCalc, has written a great new essay identifying the need for software that lasts for decades or even centuries without replacement. Neither prepackaged nor custom-written software is fully able to meet that need, and he identifies how attributes of open source might help to produce long-lasting 'Societal Infrastructure Software'."
  • by dosun88888 ( 265953 ) on Thursday July 15, 2004 @06:38AM (#9705569)
    I think the subject line says it all. You can't worry about your software working for that long until your hardware can last that long.

    ~D
  • by Biotech9 ( 704202 ) on Thursday July 15, 2004 @06:44AM (#9705593) Homepage
    No company in the world will ever try to develop software that never needs (costly) upgrades and add-ons. Take a look at Microsoft's behaviour with MS Office: it's a complete cash cow because they can update it when they want and force people into upgrading with changed document types. Even the open source community will be too interested in improving and adding on to their pet projects to consider leaving them alone.

    This article seems pretty flawed.

    We need to start thinking about software in a way more like how we think about building bridges, dams, and sewers.

    The fundamental difference is that bridges cost more to alter than software does, and the capabilities of hardware allow more freedom in software, for which there is no parallel in bridges.

    hmmm, just my 2 euro-cents.
  • No (Score:5, Insightful)

    by Mr_Silver ( 213637 ) on Thursday July 15, 2004 @06:49AM (#9705609)
    Neither prepackaged nor custom-written software is fully able to meet the need

    I disagree. It's got nothing to do with the software but the data.

    If the data format is clearly documented, then it doesn't matter whether the application that generated it is open or closed.

    True, you could argue that since the code is open the data format is also documented, but personally I'd find it easier if it was written in a properly structured document.

    Otherwise you'd have to resort to learning and then ploughing through an application written in some 200-year-old programming language (by someone who possibly hacked it up with a hangover at the time) to try and understand what they were doing and why.

  • It's a tool... (Score:2, Insightful)

    by tgv ( 254536 ) on Thursday July 15, 2004 @06:52AM (#9705617) Journal
    For Christ's sake, computers are mostly used as tools. And who keeps their old tools around for so long? Only neanderthals: [paleodirect.com]...
  • Not Possible (Score:5, Insightful)

    by deutschemonte ( 764566 ) <lane.montgomery @ g mail.com> on Thursday July 15, 2004 @06:53AM (#9705627) Homepage
    Constant standards are what is needed to make software last that long.

    Language standards don't even last 200 years, so how do we expect something as new as software standards to be more uniform than language standards? Language has been around for thousands of years and we still can't agree on that.
  • by tessonec ( 620168 ) on Thursday July 15, 2004 @06:54AM (#9705629) Homepage
    I think you do not understand completely the point of the article...

    The point is that, given the fact that there is a vast amount of information in computer files, you must be aware that if you can't retrieve that information in the future, it will be lost.

    You are right, most software gets updated. But it is the interface that understands the format that must last much longer than a couple of software update cycles.

    This is exactly another reason to consider open standards instead of closed-source formats, as MS in 100 years (if it still exists) will have forgotten what the Windows 2000-era .doc format looked like.
  • by Mr_Silver ( 213637 ) on Thursday July 15, 2004 @06:55AM (#9705634)
    Take a look at Microsoft's behaviour with MS Office: it's a complete cash cow because they can update it when they want and force people into upgrading with changed document types.

    Maybe before, but the document format hasn't changed since Office 2000.

    You can send me a document written in Word 2003 and I can happily open it in Word 2000.

  • by amitofu ( 705703 ) on Thursday July 15, 2004 @06:57AM (#9705641) Homepage
    Standards are what must be designed to last for decades, not the software that conforms to the standards. Things like XML, RDF and POSIX will be supported for decades, if not centuries. Who cares if it is Linux running your POSIX apps, or FreeBSD, or HURD? I don't think it matters if software uses libxml2 to parse your XML data, or some yet-unconceived API--as long as it understands XML!

    If it is stability and reliable infrastructure that is desired, it is standards that must remain constant and software that must evolve to make the standards work with new technology.
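
    For illustration, a minimal sketch (the record and field names are invented, and Python's bundled parser just happens to be used here): the point is that any conforming XML parser, present or future, can read the same bytes.

        import xml.etree.ElementTree as ET

        # An invented, self-describing record; nothing about it depends on the
        # program that originally wrote it.
        record = """
        <citizen>
          <name>Ada Lovelace</name>
          <born>1815-12-10</born>
        </citizen>
        """

        root = ET.fromstring(record)
        print(root.find("name").text)   # -> Ada Lovelace
        print(root.find("born").text)   # -> 1815-12-10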
  • by clsc ( 730336 ) on Thursday July 15, 2004 @06:57AM (#9705642) Homepage Journal
    The world is different now than it was even just a decade or two ago. In more and more cases, there are no paper records.

    The point that the author makes here is really that without electricity we will lose great parts of recent history.

  • by Jotham ( 89116 ) on Thursday July 15, 2004 @06:59AM (#9705651)
    I disagree with the common comparison of Software to Civil Engineering and Standards Bodies.

    Data structures would be a better analogy, and standards bodies have done a really good job declaring them. So in 200 years' time you'll still be able to read the DVD data format (assuming the media is still good), even though the software that plays it will likely be different.

    Software is more like mechanical engineering, where things do break and improvements keep being found. You wouldn't, for example, use a 1960s car engine in a car today, even though the basic principle is the same. No one asks why they didn't get it right 40 years ago and why we aren't still using the same design.

    Unfortunately, what would often be considered an early prototype in engineering is often released as v1.0 -- the cause of which is a long post all unto itself.
  • Re:200 years??? (Score:4, Insightful)

    by pjt33 ( 739471 ) on Thursday July 15, 2004 @07:02AM (#9705663)
    I've stared at your post for a long time trying to work out what you mean. Please put me out of my misery by telling me whether the second word should read "precedents".
  • Too young (Score:2, Insightful)

    by frankthechicken ( 607647 ) on Thursday July 15, 2004 @07:08AM (#9705685) Journal
    The problem with comparing computer practices with civil engineering practices, is the age of the two industries.

    Software is such a young industry that best practices, standards etc. have yet to be settled upon and will thus be hard to implement. Most engineering practices have come about after centuries of development; I somehow feel software development will have to mature for a while before we can see similar licences and standards bodies.
  • by julesh ( 229690 ) on Thursday July 15, 2004 @07:14AM (#9705711)
    Well Dan Bricklin does point out that software of today can run on different hardware and having software tied to specific hardware is a bad idea

    Software of today can run on a variety of different hardware, but there is a degree of similarity between the different types of hardware that probably won't exist between today's computers and those available a hundred years from today, much less two.

    He is not just talking about one specific program that doesn't change, but rather open standards and techniques that mean data that is stored today, will be accessible in 200 years time.

    That, on the other hand, I can agree with. Anyone storing information in a format that isn't publicly documented really ought to consider whether they'll still need it in 30 years' time, and start migrating it to an open format now if they will. However, there are very few formats that are completely undocumented. I believe the most commonly used might be Microsoft Access databases. I'm not sure what documentation exists on the formats of various other commercial database systems; I believe Oracle's formats are well documented (?). What about MSSQL? Informix?

    Most accounts packages have documentation available on their database formats I believe. Certainly Sage and Pegasus provide such documentation. What about Great Plains, etc.?
  • by Dr. q00p ( 714993 ) on Thursday July 15, 2004 @07:16AM (#9705718)
    Just find me a customer that wants to pay for "robustness, testing, maintainability, ease of replacement, security, and verifiability" and I'll deliver.
  • by jellomizer ( 103300 ) * on Thursday July 15, 2004 @07:16AM (#9705720)
    Sure, it is possible to write a program that is platform independent and could possibly run for 200 years. But the problem is this: how many organizations can last 200 years without changing their policies, or without society changing? Let's compare now with 200 years ago, 1804. How many companies have lasted since 1804? Not too many, and all of them have changed the way they do business since then. How many companies 200 years ago would have had enough foresight to set policies for IT workers? Maybe one, who was swiftly locked away for his crazy talk. Also, a lot of today's terminology will go away in 200 years. I predict the term "race" will be an outdated word confined to old literature and newspapers, because of the steady decline in racial prejudice and the rise in interracial marriage. 200 years ago a businessman would ask your religion in order to decide whether to do business with you; now there would be problems even asking it as a personal question. Or say we get visited by space aliens: Sex: M F X A I C. Who knows what new and unheard-of categories will be added, or perhaps a method of doing things changes drastically, or even what the company does changes. Heck, the company I work for started out repairing mainframes and now does mostly IT consulting, and that is in 10 years; imagine 200.
    So to make a program this customizable you need to make it a programming language, with everything you need to add, delete, change and alter over time. And even programming languages change: think of Fortran. 30 years ago it was the most popular language out there, and now it is tossed aside for newer languages; even with Fortran compilers for Linux, most people will rewrite their Fortran code in a more modern language rather than just port it, to take advantage of new features such as GUIs, Internet connectivity, color printing and web access. Things that seemed useless or impossible 30 years ago are now becoming important. Sure, it is possible to make a program run for 200 years, but is it possible to make it useful for 200 years? And besides, all the extra design time to make a program that can run for 200 years will cost a lot of money and time. Are the users of the application willing to pay $1,000,000 for a Java program that crunches their numbers? Or will they pay $50,000 for a program that will last them 10 years and be a lot less bloated and simpler to use?
  • by Anonymous Coward on Thursday July 15, 2004 @07:18AM (#9705729)
    Whole point? Umm.. no.

    Side benefit? Yes.
  • Ink and Paper (Score:4, Insightful)

    by Quirk ( 36086 ) on Thursday July 15, 2004 @07:22AM (#9705741) Homepage Journal
    What's needed is ink and paper. It's our proven technology for archiving. Microfiche and magnetic storage devices are now more prevalent than at any time before, but the book industry, published journals and daily newspapers show no sign of diminishing. And as the article points out, newspapers dating back 200 years are still available in the public libraries. Electronic voting protocol is just now hashing out whether a paper trail is prudent. Granted, the article rightly points out the need to develop an archiving industry that is able to meet the needs for computers to replace paper-based archiving, but as long as hardware development thrives in an open competitive economy, the market will dictate the timing of implementing the necessary hardware, unless some body like the Library of Congress undertakes financing the necessary hardware and software.
  • by Vitus Wagner ( 5911 ) <vitus@wagner.pp.ru> on Thursday July 15, 2004 @07:23AM (#9705751) Homepage Journal
    The whole point is that the software is comprehensible by humans. Anyone can read, fix and improve it.

    Porting to new architecture is integral part of fixing and improving.

    And it is no side effect; it is quite a significant part of RMS's "free as in freedom" -- independence from any vendor (the hardware vendor in this case).
  • by cardpuncher ( 713057 ) on Thursday July 15, 2004 @07:36AM (#9705793)
    requirements for the project must be set by the users

    I've yet to meet a client commissioning a project who knew well how his own business operated, still less one able to understand how any knowledge he did have might be usefully turned into a specification. One of the reasons some software projects have a short life is that the intended users fundamentally misunderstood how their business worked, or that its way of working was likely to change.

  • by mvw ( 2916 ) on Thursday July 15, 2004 @07:46AM (#9705831) Journal
    Prof. Knuth [stanford.edu] was unhappy with the degrading typographical quality of the printings of his The Art of Computer Programming [stanford.edu] series. So he took 10 years of his research time to develop the TeX [stanford.edu] computer typesetting system. (A stunt hard to pull off if you are not a professor or rich :-). Now look at how he published the TeX system. There is a set of 5 books [stanford.edu] containing
    • TeX user manual
    • TeX commented source code
    • Metafont user manual
    • Metafont commented source code
    • The Metafont programs to generate the Computer Modern fonts
    What is that good for?

    If you, say in 500 years, get a copy of these 5 volumes (and if they are printed on good paper, there is a good chance they will survive), you just need some kind of computing device and the skill to implement some easy Pascal-like programming language. Then you type in the programs and fonts from the books and voila, you have a working TeX system!

    Of course you need to write a .dvi driver for whatever output device you have at that time.

    If you now find some .tex source of one of Knuth's books, be it in print or some crude hyperflux memory cube, you are then able to reproduce that book in the quality Knuth intended it to have!

    Thus TeX was explicitly developed to transfer the typographic quality of Knuth's books into the future, without depending on lots of software vendors establishing lots of data format converters (e.g. Word 2325 to Word 2326)!

    Regards,
    Marc

  • Re:2 letters (Score:3, Insightful)

    by sisukapalli1 ( 471175 ) on Thursday July 15, 2004 @08:03AM (#9705903)
    I think tex/latex has the capability. We have some documents (20 years old), and they compile fine and look perfect. If anything, improvements have made it easier to "enhance" a document without messing with anything.

    S
  • Re:It's a tool... (Score:4, Insightful)

    by kfg ( 145172 ) on Thursday July 15, 2004 @08:29AM (#9706089)
    The violin dates from the 1600s. While it has undergone a certain amount of "support" since then it is essentially the same tool as designed by Amati. Some consider it one of the finest tools ever devised by man. Many of the older ones are considered superior to the newer ones.

    I have an automotive body hammer that is nearly identical to a 1500s war hammer, although the upgrade to a fiberglass handle is a nice touch for reducing shock. The basic design goes back some thousands of years with little more than some minor updates in materials.

    My 100 year old desk holds up my computer just fine. It is as technologically advanced as what I can get new at Office Max, except I expect it can last another few hundred years due to the quality of its construction.

    I'm wearing woven fabric clothing, a technology that reaches back at least 10,000 years. There have been a number of attempts to replace this technology over the past 40 years or so. They've all proven inferior except for certain special applications. Hell, even indigo dye for work clothes has proven to be a durable technology for thousands of years that you can still purchase in nearly any clothing store and the "jeans" that are the most common example of the type are about 400 years old (Jacob Davis added rivets to the existing design. He didn't invent the jeans themselves).

    I've been watching a new office building go up in town. It's post and beam, about as old a house building technology as you can get, although the building is considered "modern."

    I also have a couple of fires that burn continuously in my home. It proves rather useful, although the technology is a bit long in the tooth.

    I fully expect that ASCII will be just as viable a way to represent the Latinate alphabet 200 years from now as it was a few decades ago, and the Latinate alphabet is another example of a multithousand year old technology.

    Innovation for innovation's sake often "progresses" to the rear.

    Build it right and build it good. Don't be afraid to change it when there's damned fine reason to on solid theoretical and practical grounds, but otherwise leave it the hell alone if it works.

    That isn't being a Luddite. That's being an engineer.

    KFG
  • by WillWare ( 11935 ) on Thursday July 15, 2004 @09:04AM (#9706330) Homepage Journal
    Lots of software includes or utilizes standardized hardware abstraction layers. Think about the POSIX standard, or the virtual machines for Java or Python or Common Lisp. These abstraction layers mean that large amounts of code are portable (sometimes with some effort, sometimes with none) across any hardware platform that supports the abstraction layer.

    Hardware manufacturers will always have a powerful incentive to support the abstraction layer, because by doing so, they'll instantly pick up a huge set of killer apps for their new CPUs. Standardized abstraction layers therefore provide an economically efficient way to divide the labor of porting software to new platforms.

    Are you thinking that in order to have software that's useful in the long term, it must run continuously on exactly the same piece of hardware? Think about Google (a very useful thing in our society). They must be bringing newer, faster computers on-line all the time. But if they're not total boneheads, they don't need to rewrite all their code to do this.
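
    A rough sketch of that idea (the file name and contents are made up): this code addresses only the abstraction layer (here the Python runtime and its file API), never the hardware, so the same source runs unchanged on any CPU and OS that implement that layer.

        import os
        import tempfile

        # Write and read a record through the portable file API alone; the path
        # and contents are invented for the example.
        path = os.path.join(tempfile.gettempdir(), "ledger.txt")

        with open(path, "w", encoding="utf-8") as f:
            f.write("2004-07-15,archive-check,OK\n")

        with open(path, encoding="utf-8") as f:
            print(f.read(), end="")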

  • Re:Not Possible (Score:3, Insightful)

    by _|()|\| ( 159991 ) on Thursday July 15, 2004 @09:24AM (#9706501)
    Language standards don't even last 200 years, how do we expect something as new as software standards to be more uniform than language standards?

    How do you know? Lisp has been around for a while, and it's not dead, yet. Some Lispers are working on a language called Arc [paulgraham.com], which they hope will last a hundred years. On another front, perhaps Parrot or .NET will provide a stable base that will allow languages to evolve, while remaining compatible.

    That said, I don't think it's necessary for a long-lived software project to use one language, exclusively. Standard interfaces can commoditize the language, to some extent.

  • by jsebrech ( 525647 ) on Thursday July 15, 2004 @09:28AM (#9706522)
    In fact, the Word document format hasn't changed since Word 97. So any Word version from 1997 or onwards will do the job.

    And changing the settings to save in RTF format by default (enabling Word versions from Word 6.0 through 2003, as well as basically all other word processors, to read the documents) isn't all that hard. Not even in a corporate setting.


    The Word format is heavily platform dependent. If you embed objects into Word documents, or use scripting, it's pretty much guaranteed it will not work correctly across Office and Windows versions. Not having the right fonts available will ruin your layout too. Word is not PDF or PostScript; it is not a stable or cross-platform format.

    And suggesting rtf is a stable or widely supported format is silly given how many dialects of rtf there are. Every new office version comes with a new rtf dialect.
  • by 1u3hr ( 530656 ) on Thursday July 15, 2004 @09:39AM (#9706603)
    You can't worry about your software working for that long until your hardware can last that long.

    I'm using software that originally ran on an 8086, then a 286, then a 486, then two or three generations of Pentiums. The whole point is that hardware dies, software doesn't. Not to mention the bunch of Unix-derived software that I run as DOS or Linux apps, essentially unchanged for almost 30 years, though the hardware on my desk is more powerful than the whole server room at the university I learnt it on, and I doubt it has a single common piece of hardware.

    If you'd RTFA: "Today, hardware is capable enough that software can be written that will continue to run unmodified as hardware is changed." Consider, perhaps, all the games people play in emulators like MAME.

  • by Anonymous Coward on Thursday July 15, 2004 @10:20AM (#9706993)
    Why?

    Society changes, so why do we want software from 200 years ago to work today?

    Hell, hardware changes so damned fast, Moore's law or not.

    Making something "just work" for a long time kinda kills innovation. It's when things "need" to be changed or upgraded that new ideas come along.
  • by Anonymous Coward on Thursday July 15, 2004 @10:40AM (#9707218)
    The brand of capitalism that currently drives the U.S. is not friendly to goods and services that are expected to last a long time.

    Absolute fucking bullshit. There are simply different strata of goods. People can buy something cheap or something midrange or something expensive. The expensive stuff will, in general, last longer and be of much higher quality - Harman Kardon versus Kraco, for example.

    It's called choice. And you know what? Even the cheap stuff lasts a decent amount of time. A lot of this "stuff was built so much better in the OLD DAYS" is myth and rose-colored remembrance.

    And stop blaming "capitalism" for everything and see it for the boon that it has been. It's a highly imperfect boon, but, shit, it gets blamed for things that are stupid. Capitalism didn't create a class-based society. Capitalism BROKE THE BACK of the serf/aristocracy bullshit that plagued humanity for millennia. There are still classes, but what capitalism gave us is:

    1. A continuum of classes. There's more than rich and poor. Incomes range from zero to the millions in a smooth gradient.

    2. Class mobility. Hard work DOES pay off in this society far more than many others. My parents grew up in abject poverty. I grew up fairly poor. I now make $160K a year. Every single "class warfare" jackass I meet is invariably a lazy dumbass who refuses to pull his or her own weight.

    Yeah, some people get what could be called far too rich, and I know it's a bitter pill for some, but that doesn't stop you or me from succeeding. Wealth can be created anew in our system by creating a value, or perceived value, where none previously existed. There isn't a fixed amount of money floating around.

    And those farm houses a few blocks away survived because the destructive path of a typical tornado is very narrow. If the twister had come closer to the farms, they would have been nuked just as thoroughly. This is a force that tosses around cars and can *pick* *up* a typical house. The house doesn't exist that can survive a direct hit or even a near miss from a tornado.

    Today, you can get nearly anything for a recurring fee. Although all the fees themselves are small, they total to whopping bills if a person needs or wants all those goods and services.

    The key word being "wants". And most of the new fees are for what amounts to new utilities: cable/satellite service, cell phones, internet access, etc. People buy them voluntarily, and most seem somewhat satisfied. Most of the problems arise from these things being new services, and the bugs remain to be worked out.

    Why not just leave people alone? You don't like paying fees? Fine. Don't pay them. My mom is writing her autobiography on a Mac IIsi running Mac OS 8. If you don't need to upgrade, don't. Stop whining and let others make the choices that are best for them.

  • by DoctorHibbert ( 610548 ) on Thursday July 15, 2004 @10:41AM (#9707232)
    It's a mistake to compare the quality of old houses with new houses. Why? Because all the old poorly built houses are already gone. There are well constructed houses built today that will last centuries (provided sufficient maintenance, of course). Most of the poorly built houses of today won't be around in a hundred years, just like the poorly built houses of 100 years ago aren't around today.

    And really, so much of that depends on the amount of maintenance over the years. Old construction techniques and materials are generally inferior to modern ones, yet if maintained properly they can last a long time. I've just completely renovated a 100+ year old Victorian, so I know what I'm talking about. For example, our house has a fieldstone foundation that every decade or so needs new mortar in many of the joints, while a modern reinforced concrete foundation generally needs no maintenance. If we don't do that work, the foundation starts to fail and portions of the house sink. It's already happened to some degree; not a single room is level. Yes, that can happen in new construction, but not nearly as much, and it's usually because of problems in the plot's geology, not the foundation construction.

    Anyway, to reiterate, you don't see many crappy old houses because they are already gone.
  • by Ashtead ( 654610 ) on Thursday July 15, 2004 @10:54AM (#9707359) Journal
    The computer might have become different, but the knowledge of how it used to work will not be lost. Just look at Charles Babbage's mechanical computer (difference engine) from the 1830s; no theoretical difficulties in re-implementing that in the 1980s, some 150 years later.

    Similarly, unless some catastrophic loss of historical information should occur, someone 200 years from now would still be able to fathom the concept of a command-line or even a desktop-metaphor GUI. They will of course think it is clunky and old-fashioned, but hey, it is a couple centuries past the state of the art....
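
    As a small illustration (the function and the example polynomial below are my own, not from the article): the method of differences that Babbage's engine mechanized, tabulating a polynomial by repeated addition alone, restates in a few lines on a modern machine.

        def difference_table(poly, start, count):
            """Tabulate a polynomial (coefficients in ascending order) from
            x = start onward, using only additions once the table is seeded,
            just as the difference engine did."""
            degree = len(poly) - 1
            # Seed with the first degree+1 values, computed directly.
            values = [sum(c * (start + i) ** k for k, c in enumerate(poly))
                      for i in range(degree + 1)]
            # Reduce the seed column to the initial differences.
            cols = []
            diffs = values[:]
            while len(diffs) > 1:
                cols.append(diffs[0])
                diffs = [b - a for a, b in zip(diffs, diffs[1:])]
            cols.append(diffs[0])
            # Regenerate the table by additions alone.
            out = []
            for _ in range(count):
                out.append(cols[0])
                for j in range(len(cols) - 1):
                    cols[j] += cols[j + 1]
            return out

        # x^2 + x + 41, tabulated from x = 0
        print(difference_table([41, 1, 1], 0, 5))   # [41, 43, 47, 53, 61]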

  • by Diabolical ( 2110 ) on Thursday July 15, 2004 @10:54AM (#9707362) Homepage
    If the data formats are standardized it should not matter what kind of hardware or media is used, the data just migrates from one technical platform to another.

    I firmly believe that without this we will lose a significant part of our history. Current history is known because of durable "storage" like paper, fossils, stone tablets or murals. These materials are all degrading, but they last longer than anything digital.

    If we keep trusting the technology we use right now, we would be very lucky if anyone in the near future were capable of finding anything significant that would be representative of our time. All our information is being recorded in digital format. This includes important things like presidential speeches, signed documents etc.

    This society is more and more dependent on electronic information. A lot of information isn't available in print anymore, let alone in a truly durable format. If for some reason there is some major catastrophe, any survivors' offspring in the future will know nothing about this age and its mistakes and will not learn a thing from it.

    We had the opportunity to study history because of the durability of its information. Our information, however, doesn't even last a lifetime.
  • by Travis Fisher ( 141842 ) on Thursday July 15, 2004 @11:21AM (#9707633)
    As has been pointed out elsewhere in this thread, one way around this problem is to use hardware which can be emulated on other hardware. The problem with this approach is that if you want to assume that a perfect emulator of your hardware will always be available, you need to use highly standardized hardware. With the commoditized hardware of today it's a stretch to imagine perfect emulation for any given component besides maybe the CPU. In the personal computing world, the closest thing to perfect emulation of an old machine is for something like the Commodore 64, where modern emulators are close to bug-for-bug support of every chip in the machine, including video and I/O. In the x86 world emulators aren't close to that level of approximation.

    So the next thing you might consider is a program which doesn't depend that much on the hardware details, but uses a well-defined interface between its execution and the hardware. This "well-defined interface" is precisely what a modern operating system provides. This next level of abstraction means that you don't have to anticipate exact emulation of hardware, just emulation of hardware sufficient to run the OS, plus drivers for the OS to interface with either emulated or physical hardware (like video cards, disk drives, etc.). This potentially requires a lot of things to be updated in a nontrivial way for even small changes in the host platform, which is not very good.

    I would suggest the best way to get software that could be run for 200+ years is for it to be written for a particular virtual machine. The code for both the software and the virtual machine should be open source (so it is long-term portable and fixable), and the virtual machine should be very well defined and not subject to version changes. The virtual machine software should also be cross-platform today so it is easier to port to tomorrow's platforms. Java is close to the right idea, but it doesn't have a single virtual machine well enough defined across various implementations and versions, nor is it open source.

    So why is it better to have a virtual machine written in, say, C, which has to be ported to each new hardware/OS combination, rather than having the base application written in C and ported to each new hardware/OS combination? The economy of scale. The virtual machine would have many users who will participate in the port and check for bugs. Then each corporation/municipality/whoever who wants to run long-term stable code doesn't have to do this porting for themselves. For software which uses minimal I/O (no graphics whatsoever, only stdin/stdout) it probably would make more sense to keep the software in C and port it. But otherwise I/O isn't standard enough without a virtual machine...
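
    For what it's worth, a toy sketch of that approach (the opcodes and program below are invented, not any real VM's instruction set): a tiny, fully specified stack machine that could be re-implemented on any future platform in an afternoon, as long as its handful of opcodes stays documented.

        def run(program):
            """Interpret a list of (opcode, args...) tuples on a stack machine."""
            stack = []
            for op, *args in program:
                if op == "push":
                    stack.append(args[0])
                elif op == "add":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
                elif op == "mul":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a * b)
                elif op == "print":
                    print(stack[-1])
                else:
                    raise ValueError("unknown opcode: " + op)
            return stack

        # (3 + 4) * 10
        run([("push", 3), ("push", 4), ("add",), ("push", 10), ("mul",), ("print",)])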

  • by Zardog ( 685943 ) on Thursday July 15, 2004 @11:34AM (#9707786)
    Seems silly to me when you consider that even human language has changed so much in the past 100-200 years. Just think what it will look like in 100+ years. If you take any novel or movie today, aren't they just rewritten rehashes of plots that have existed for the past 1000+ years? Important software, like important ideas, will be maintained, migrated, changed, morphed and improved upon. The really bad ideas will hopefully decay out of existence. Really, who the &$^% cares what was in our Oracle DB 100 years from now?
  • From a programmer. (Score:3, Insightful)

    by Gordon Bennett ( 752106 ) on Thursday July 15, 2004 @12:05PM (#9708105)
    I bow to the essay and its author.
    Being a computer programmer, I wouldn't call myself a 'Software Engineer' due to the appalling current state of software writing. There has for quite some time been this in-bred mentality of Versions, that nothing is ever finished, mostly driven by commercial greed - despite the huge advances in computer power, our OSes and their applications are still struggling to keep up with an Amiga, for chrissake.
    Moreover, it can have lethal consequences; for example, radiation treatment or airplane control. Deaths have ensued. "Sorry that your college outing ended in all their deaths, we were running 1.1.3 of the aileron system."
    Sure, even mechanical engineers get it wrong, but their main onus is to make something that will work, not, as in the software case, 'get it out now, fix it later'.
    So, if someone says they are a 'Software Engineer', ask them, what is it they do that merits the 'Engineer' tag - would they build a bridge that lasts? Nope.
  • by rjstanford ( 69735 ) on Thursday July 15, 2004 @12:43PM (#9708491) Homepage Journal
    Governments can enforce that vendors must provide proper documentation of their software data formats before a deal is struck, especially if the system is going to run national infrastructure, such as the IRS, etc. Especially when the system costs in the hundreds of millions (if not billions), why don't they enforce that? I would be a multibillionaire if I knew the answer.

    Well, for large scale data apps, it is available. Heck, I've taken courses in both Informix and Oracle internal structures - in memory and on disk. The information is certainly useful to have in rare but uncomfortable-when-they-happen situations. So not only are the structures (semi)-public, but there is a pool of people who can help you if you need the help. Also, a lot of major companies use source-code escrow in case they ever fold, with automatic release clauses in their larger contracts. This is not an unknown problem these days.
  • by AnotherBlackHat ( 265897 ) on Thursday July 15, 2004 @12:58PM (#9708659) Homepage
    We build disposable software, because computers are still disposable.
    Not because they can't be built to last, but because they quickly become obsolete.

    If Moore's law continues to hold for 40 years, computers will be over a million times more powerful than they are now, the cheapest drive you could buy would hold more than a petabyte, and we'll be saying things like "I remember when a thousand bucks for a terabyte of ram seemed like a good deal, and now I can't even buy a ram stick that small".
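
    A quick back-of-the-envelope check of that figure (assuming a doubling roughly every two years; the exact period is of course debatable):

        # 40 years of doublings, one every 2 years
        years, doubling_period = 40, 2
        factor = 2 ** (years / doubling_period)
        print(f"{factor:,.0f}x")   # 1,048,576x -- "over a million times"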

    Once the breakneck pace of expansion stops (or at least slows to a reasonable rate) then we should look at making software that lasts.

    Video compression technology is big business today, but it's probably going to seem like a silly idea in the future.

    We don't need buggy whips that last 200 years.

    -- less is better.
  • by Spaceman40 ( 565797 ) <(gro.mca) (ta) (sknilb)> on Thursday July 15, 2004 @01:02PM (#9708715) Homepage Journal
    You know why we can drive the latest vehicle over an old bridge, or fill a new high-tech water bottle from an old well's pump? It's because the way water works has stayed the same - it's a liquid with certain wonderful properties - and the way bridges "interface" with land vehicles has stayed the same.

    When we have constantly changing standards, often incompatible with earlier ones, software that works wonderfully with the earlier one will die. This isn't the software's fault any more than it would be the pump manufacturer's fault if H2O's density suddenly rose (or its viscosity or something).

    It's all about the standards.
  • by Spazmania ( 174582 ) on Thursday July 15, 2004 @01:32PM (#9709041) Homepage
    Many things in society are long-term

    Not really true.

    Those historical buildings? They've been gutted and rebuilt from the inside out at least once during the past 50 years for the installation of central air conditioning and elevators for the handicapped. And they're the exception to the rule. Few commercial buildings go more than 15 years without major renovation and few residential buildings make it more than 30.

    Roads? Sure, US Route 1 does still travel approximately the same route, but it's repaved frequently, expanded and changed frequently, and it's been supplanted in its original purpose as the major east-coast north-south route by Interstate 95. And even Route 1 has existed for less than a century. Before automobiles at the beginning of the 20th century, there was no need for anything like it. Before automobiles, who could conceive of a multilane asphalt highway that needed to sustain travel of over 500 miles per day? How could yesteryear's engineers possibly plan for it?

    The US constitution, the foundation of our law, has seen two major overhauls in the past two centuries: first due to the civil war and again because of the great depression. Even where parts of the text remain the same, their meanings have been drastically altered by the courts. Free speech has become freedom of expression. The right to bear arms somehow doesn't exist at all inside Washington DC except for police. The states have gone from being the primary seats of governance to being almost entirely subsidiary to the federal government. We're living under an almost totally different government than what saw the dawn of the 19th century.

    Even the Catholic Church publishes a new catechism each year, a book which defines the religion. You'd think during two millennia they'd figure it out once and for all, yet it continues to evolve and change.

    Few things last, either in their original purpose or their original design. They're continuously rebuilt, redesigned and reinvented... Even things like roads, buildings and governments for which our design experience goes back thousands of years.

    Our software experience goes back 40 years, if you can call what we did 40 years ago software by any current definition. Why should we build it to last longer than the roads and buildings, and indeed longer than software in any form has existed?

    I'm sorry, but I'm not smart enough to successfully plan ahead two centuries and neither are you.
  • by wcrowe ( 94389 ) on Thursday July 15, 2004 @04:16PM (#9710775)
    Right now, somewhere, there is a government agency putting important data into long term storage, which was created in Microsoft Word. In a few years that data may be unreadable, not because the medium has deteriorated, but because the software that created it will have evolved or no longer exist.

    This is just one example of how proprietary formats are bad for storing important data long term. This problem was noted years ago when it was discovered that VA tapes, tucked away in underground facilities back in the 60's, could no longer be read because the software that created them was gone.

    An ideal data scheme would include information which describes the data being stored along with the data itself. An example is XML. This concept needs to be pushed.
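
    A hedged sketch of what "data that describes itself" buys you (the record and field names are invented): the same values as an opaque row versus a tagged record that a future reader can still interpret without the original application.

        import xml.etree.ElementTree as ET

        opaque = "1804-03-26,14392,approved"   # meaningless once the schema is lost

        record = ET.Element("land_grant")
        ET.SubElement(record, "date_issued").text = "1804-03-26"
        ET.SubElement(record, "acres").text = "14392"
        ET.SubElement(record, "status").text = "approved"

        print(ET.tostring(record, encoding="unicode"))
        # <land_grant><date_issued>1804-03-26</date_issued><acres>14392</acres>...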

    It is more likely that we can solve the problem of proprietary data storage schemes long before we can implement 200 year software.

  • by mcrbids ( 148650 ) on Thursday July 15, 2004 @11:25PM (#9713537) Journal
    You can't worry about your software working for that long until your hardware can last that long.

    Bzzzzzzt! I call Bullshit...

    The C programming language has been with us 30 years. Most of the non machine-specific coding from 30 years ago would work today with almost no modification on today's Ghz PCs.

    I develop large, powerful applications in PHP that will work well on a Linux, Solaris, Irix, Windows, or AIX system, with virtually no porting whatsoever. Furthermore, the software itself is the executable - it's human readable in its production form!

    And aren't Java applications run in a sandbox Virtual Machine?

    As I've developed increasingly powerful and complex Internet applications, I've discovered that the key to reliability is to develop applications where the individual computer really doesn't matter - "It's the software, stupid!"

    Utilizing standards based, open languages and protocols (PHP, PostgreSQL, Linux, TCP, HTTP, etc) means that my applications will work today, tomorrow, and for many years to come on whatever hardware.

    Linux/OSS software is poised to take over this critical software infrastructure from Microsoft, who, having held that position, have abused the power it brought them.

    So, I use open source software, open protocols and strategies where it makes sense, and relax knowing that the stuff I write to the *nix/POSIX standards will be quite accessible and usable in the future.

    -Ben
