No More Coding From Scratch? 323

Susan Elliott Sim asks: "In the science fiction novel 'A Deepness in the Sky,' Vernor Vinge described a future where software is created by 'programmer archaeologists' who search archives for existing pieces of code, contextualize them, and combine them into new applications. So much complexity and automation has been built into code that it is simply infeasible to build from scratch. While this seems like the ultimate code reuse fantasy (or nightmare), we think it's starting to happen with the availability of Open Source software. We have observed a number of projects where software development is driven by the identification, selection, and combination of working software systems. More often than not, these constituent parts are Open Source software systems that were typically not designed to be used as components. These parts are then made to interoperate through wrappers and glue code. We think this trend is a harbinger of things to come. What do you think? How prevalent is this approach to new software development? What do software developers think about their projects being used in such a way?"
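
(As a minimal illustration of the glue-code pattern the submitter describes, and not something from the submission itself, the sketch below reuses two ordinary command-line programs, gzip and wc, as black-box components from Python; the file path and the availability of both tools are assumptions.)

    # Glue-code sketch: combine two existing, working programs instead of
    # writing compression or byte-counting logic from scratch.
    import subprocess

    def compressed_size(path):
        """Return the gzip-compressed size of a file, in bytes."""
        gz = subprocess.run(["gzip", "-c", path],
                            capture_output=True, check=True)
        wc = subprocess.run(["wc", "-c"], input=gz.stdout,
                            capture_output=True, check=True)
        return int(wc.stdout.split()[0])

    if __name__ == "__main__":
        print(compressed_size("/etc/hostname"))
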
This discussion has been archived. No new comments can be posted.

  • Oh my (Score:2, Insightful)

    by Lord Duran ( 834815 ) on Saturday November 04, 2006 @06:59PM (#16719681)
    If you think debugging is a pain NOW...
  • by Anonymous Coward on Saturday November 04, 2006 @07:02PM (#16719713)
    Machine language maybe. But if you are using the for loops, arrays, stacks, conditions, or whatever of your high-level language, you are already reusing code at that point. We just keep taking the theft to higher and higher levels. All good art starts with theft.
  • by Zepalesque ( 468881 ) on Saturday November 04, 2006 @07:13PM (#16719809)
    As the developer of freely available software, I find the prospect of people using my code a mixed bag. In part, I feel a sense of ownership of the code I write and am somewhat offended by the idea of other people using it. However, as a paid engineer, I go through this at regular intervals; older projects get handed to others for support as I work on new components.

    On the other hand, I welcome the idea that my free code would be used by others - it is a flattering prospect, I suppose.

    Others profit from this sort of re-use: COM, CORBA, Jars, etc...
  • No way Jose. (Score:4, Insightful)

    by 140Mandak262Jamuna ( 970587 ) on Saturday November 04, 2006 @07:27PM (#16719935) Journal
    Yeah, all the European Space Agency was trying to do was reuse the Ariane 4 code in Ariane 5. And the rocket blew up 40 seconds [umn.edu] after the launch. Why? Ariane 5 flies faster than Ariane 4 and hence has a larger lateral velocity. The main software thought the readings were too high and marked the lateral velocity sensors as "failed". All four of them. Then, without sensors, all the computers shut down. The vehicle blew up. But by that time some bean counter had already shown millions of francs in savings, claimed credit for specifying the Ada language for the flight control software, and collected his bonus.

    Some basic tasks like file I/O or token processing and other such minor things might be reused. But even then, porting something as simple as a string tokenizer written for a full-fledged virtual-memory OS like Unix/WinXP to a fixed-memory handheld device is highly non-trivial, especially if you want to handle multi-byte i18n character streams (a small illustration follows at the end of this comment).

    If the author sells what he was smoking while coming up with the article, he stands to make tons of money.
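
    (A small Python illustration of the multi-byte pitfall mentioned above; the example strings are made up.)

        # A byte-oriented tokenizer "reused" as-is handles ASCII fine, but
        # chopping a token at a fixed byte length can split a multi-byte
        # UTF-8 character, which is part of why such a port is non-trivial.
        data = "naïve café".encode("utf-8")

        byte_tokens = data.split(b" ")    # fine: the space is ASCII
        broken = byte_tokens[0][:3]       # b'na\xc3', i.e. half of the 'ï'
        # broken.decode("utf-8") would raise UnicodeDecodeError

        # Codepoint-aware version: decode first, then tokenize.
        text_tokens = data.decode("utf-8").split(" ")
        print(byte_tokens, text_tokens)
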

  • by Anonymous Coward on Saturday November 04, 2006 @07:43PM (#16720065)
    It's not theft when the people are intentionally creating the programming language for you to use and giving their consent for you to use it.

    Furthermore, it's not "All good art starts with theft"; it's "Good artists copy, great artists steal."
  • Re:No way Jose. (Score:3, Insightful)

    by StrawberryFrog ( 67065 ) on Saturday November 04, 2006 @08:15PM (#16720339) Homepage Journal
    all the European Space Agency was trying to do was reuse the Ariane 4 code in Ariane 5. And the rocket blew up 40 seconds after the launch. Why? Ariane 5 flies faster than Ariane 4 .. the main software thought the readings were too high and marked the lateral velocity sensors as "failed"

    You claim that this rocket failure is due to software reuse. That just sounds wrong to me. I don't think that not starting from scratch is that relevant. I could more convincingly argue that the failure is due to the software not being tested with the input values that it would receive during operational use. That is important, be the code new or old; and when a failure costs millions, not doing so is inexcusable.
  • Or maybe... (Score:2, Insightful)

    by JaWiB ( 963739 ) on Saturday November 04, 2006 @09:07PM (#16720723)
    Libraries
  • by Jerf ( 17166 ) on Saturday November 04, 2006 @09:46PM (#16720933) Journal
    Some context for people who didn't read the book... or didn't read it carefully enough.

    First, Vernor Vinge has a PhD in Computer Science. This obviously doesn't guarantee he can't be wrong, but to those commenters who said something like "these ideas are idiotic"... you've got an uphill battle to convince me that you're that much smarter than Vernor Vinge, especially as most of you saying that don't show me you understood what he was saying in the first place.

    Second, A Deepness In The Sky is set in his "zones of thought" universe. In this universe, the fundamental limits of computation vary depending on where you are physically in the galaxy. This is only faintly hinted at in A Deepness In The Sky; it is explicitly spelled out in A Fire Upon the Deep. This limit on computation may or may not be real. One of the effects of this limit is that you can build a system larger than you can really handle, and eventually all such systems come apart in one way or another. This story is set thousands of years in the future, and it is explicitly (albeit subtly) pointed out that the software running the ramscoop ships has direct continuity with modern software. (Qeng Ho computers use the UNIX epoch as the most fundamental form of timekeeping; apparently even the relativistic compensation is layered on top of that.) We are at the very, very beginning, where it is still feasible to burn an OS down entirely and start from scratch... or is that really still feasible? (Perhaps Microsoft will soon find out.)

    To those of you posting that "we can always wrap it up in an API or whatever", I'd say two things: First, you get the Law of Leaky Abstractions working against you. The higher up the abstractions you go, the more problems you have. (Look it up if you don't know what that is.) Second, the more sub-systems you make, the larger the state space of the total system becomes (exponentially), and the harder it is to know what's going on (a toy calculation follows at the end of this comment). It is entirely plausible that you eventually hit a wall beyond which you can't pass without being smarter, which, per the previous paragraph, you can't be.

    In other places in the galaxy, you can be smarter, and Vernor Vinge postulates the existence of stable societies on the order of thousands or tens of thousands of years or beyond, where the society actually outlasts the component species, because the software base that makes up the species does not exceed the ability of the relevant intelligence to understand it.

    Both cases (software might exceed intelligence, intelligence might grow with software) are extremely arguable, and I do not think he is advocating either one per se. (Leave that for his Singularity writings.) But you do him a disservice if you think he is not aware of the issues; he's extremely aware of the issues, to the point that he is the reason some of us are aware of the issues.

    (Even this is a summary. In isolation, probably the best argument is that it is always possible to create a software system one cannot understand or control, but one person can be wise enough to avoid that situation. However, in A Deepness in the Sky Vernor Vinge explicitly talks about how, in a societal situation, one can be forced by competitive pressures to over-integrate one's system and make it vulnerable. "OK, but the government can be smart enough to realize that's going to happen and step in to stop it." First... smart government? But even if you overcome that objection, now your society faces death-by-surveillance and other automated law enforcement mechanisms, which, since they can't be human-intelligent, will fail. If you avoid that (and it is a big temptation), then you face the problem of anarchy. And remember that "governance" is anything that governs; even if the "formal government" doesn't regulate you to societal death, private corporations may do it. Anyhow, the upshot: Vernor Vinge has done a lot of thinking on this topic, it shows in his books, it is not showing in the criticisms I've seen posted, and when it gets down to it he really has more questions than answers.)
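
    (To make the exponential state-space point above concrete, here is a toy Python calculation; the subsystem and state counts are arbitrary assumptions, not figures from the comment.)

        # n independent subsystems with k observable states each give k**n
        # combined states for the system as a whole.
        def combined_states(subsystems, states_each):
            return states_each ** subsystems

        for n in (5, 10, 20, 40):
            print(n, "subsystems of 4 states each ->",
                  combined_states(n, 4), "combined states")
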
  • by someone300 ( 891284 ) on Saturday November 04, 2006 @09:57PM (#16721017)
    Component-based programming for Linux turned out very different. Since Linux stuff tends to be developed from the bottom up, starting with a library, then a console app, then a GUI, most "components" ended up just being stuff lower on the stack, like libpng, gzip or whatever. That, and file formats tend to be standardised and read with a library, rather than file access being implemented in one specific application that has to be loaded every time the file needs to be accessed.

    On DOS and Windows, applications tended to be all-in-one blobs that use proprietary file formats and depend only on the operating system itself. Thus, OLE made sense. There were OLE-type technologies on Linux (such as Bonobo and KParts), but they never took off in any big way due to the general lack of need. In most cases, they were just used for automation, for which better technologies such as D-Bus now exist.

    As for "real code reuse" on Linux... try ldd-ing any application and seeing how many other libraries it uses to do it's bidding. Many application's I've written are just sticking together bits and pieces of other libraries, and that's in C++. I'd go as far as saying that code reuse on Linux is currently better than that in Windows. Generating quick applications with Linux tends to be pretty easy using bash, Python and even C, C++ or C#, where libraries have directly translated into "components" and "modules". The thing on Linux slowing down development isn't having to reinvent the wheel every time you write code, but rather, deciding what libraries you want to depend on and the difficulty of the languages favoured in Linux development, like C.

    Since you bring it up, OLE seems to be taking a back seat in Windows now, where ActiveX is rarely used by anything other than IE plugins. Now clipboard/d'n'd/DDE and Microsoft Office are the only places I actively see it being used. COM is still heavily in use by libraries such as DirectX, however. It was never really used to its full potential, such as to open Photoshop documents in Microsoft's preview application. Windows, with .NET, seems to be moving to a more developer-friendly way of providing and using library components, much as a developer does on Linux. .NET components remind me of how libraries are used in Linux. IMO this is a cleaner way of working and it's nice that Microsoft are starting to think this way.
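
    (As a small illustration of the libraries-as-components point, this hedged sketch loads zlib through Python's ctypes and reuses its crc32 function directly; the library lookup assumes a typical Linux system with zlib installed.)

        # Reuse an existing shared library instead of writing a CRC
        # routine from scratch.
        import ctypes
        import ctypes.util

        libz = ctypes.CDLL(ctypes.util.find_library("z"))  # e.g. libz.so.1
        libz.crc32.restype = ctypes.c_ulong

        data = b"hello, reuse"
        # zlib's public API: crc32(start_value, buffer, length)
        print(hex(libz.crc32(0, data, len(data))))
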
