
C with Safety - Cyclone

Paul Smith writes: "New Scientist is carrying a story about a redesigned version of the programming language C, called Cyclone, from AT&T Labs. 'The Cyclone compiler identifies segments of code that could eventually cause such problems using a "type-checking engine". This does not just look for specific strings of code, but analyses the code's purpose and singles out conflicts known to be potentially dangerous.'"
  • AT&T has solved the traveling salesman problem by translating it into an input their program understands...

    wasn't this supposed to be an NP-Complete problem?
    • by MarkCC ( 40181 )
      Grrr... You're making one of my least favorite ignorant twit mistakes.

      Back in grad school, I used to read comp.theory, and at least once a month, we'd have some jerk post to the newsgroup "NP solved!", followed by some stupid, exponential time algorithm for 3-SAT or something. Invariably, the poster would spend thousands of lines defending his supreme genius in being the person who solved an NP complete problem!

      NP complete does not mean unsolvable. It means slow.

      Roughly speaking, NP problems are problems for which there is no non-exponential time solution known, but for which solutions can be tested for correctness in polynomial time. (To translate: exponential time means that the time to compute the solution for a problem of size n is bounded by x^n for some constant x. Polynomial time means that the time for a problem of size n can be bounded by some polynomial of n.)

      The travelling salesman is a classic example of an NP problem. Given a set of cities, and the distance between any two cities, compute the shortest route that visits each city once. It's a trivially stated problem, but as far as anyone knows, it's not possible to write a program that quickly determines the correct shortest route in every case.

      NP complete problems are problems which have the fascinating property that *if* you found a polynomial time solution for that problem, then you would have found a polynomial time solution for all NP problems.

      The travelling salesman is, if I recall correctly, slightly *worse* than NP complete. Again, if I recall correctly, if you have a P-time solution to the TSP, then you provably have a P-time solution to any NP-complete problem; but if you have a P-time solution for an NP-complete problem, that doesn't mean that you have a P-time solution to the TSP. The proof is actually quite interesting, so go get an algorithm textbook, and read it. I'd suggest the Cormen, Leiserson and Rivest text [fatbrain.com], which is my personal favorite.

      There are perfect, well known solutions for all of the classic NP complete problems. They're just exponential time. (For instance, for the travelling salesman: enumerate all possible routes; compute the length of each route; and pick the shortest one.)
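      For concreteness, here's a quick sketch of that brute force in C (my own illustration, nothing clever): try every ordering of cities and keep the shortest tour. It's perfectly correct, just O(n!) time, which is exactly the point.

      #include <stdio.h>

      #define N 5

      static int dist[N][N];  /* dist[i][j] = distance between cities i and j */
      static int best = 1 << 30;

      static void permute(int route[], int k, int len) {
          if (k == N) {                         /* complete tour: add the edge home */
              int total = len + dist[route[N - 1]][route[0]];
              if (total < best) best = total;
              return;
          }
          for (int i = k; i < N; i++) {
              int t = route[k]; route[k] = route[i]; route[i] = t;   /* choose */
              permute(route, k + 1, len + dist[route[k - 1]][route[k]]);
              t = route[k]; route[k] = route[i]; route[i] = t;       /* unchoose */
          }
      }

      int main(void) {
          int route[N] = {0, 1, 2, 3, 4};
          /* fill dist[][] with real distances first; city 0 is fixed as the
             start, so this tries all (N-1)! orderings */
          permute(route, 1, 0);
          printf("shortest tour: %d\n", best);
          return 0;
      }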

      • by zatz ( 37585 )
        Be careful pointing the finger about ignorant mistakes.

        TSP cannot be worse than NP-complete, because it is obviously in NP. Phrased as a decision problem (is there a tour of this graph shorter than length y?) it is trivial to verify a solution in polynomial time. If you can verify in P, you can solve in NP.

        Note that rephrasing as a decision problem doesn't change the order much, because you can just do a binary search with O(log N) steps where each is a decision subproblem. Also note that transforming it into a decision problem is *necessary* to discuss its NP-completeness, because the very concept is only defined for decision problems.
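        In code, that binary search looks something like this (just a sketch; the decision oracle is hypothetical, standing in for whatever solves the yes/no subproblem):

        /* recover the optimal tour length from yes/no answers alone,
           using O(log hi) calls to the decision procedure */
        typedef int (*tsp_oracle)(int bound);  /* "is there a tour of length <= bound?" */

        int shortest_tour_length(tsp_oracle exists_tour, int hi) {
            int lo = 0;                        /* distances assumed integral */
            while (lo < hi) {
                int mid = lo + (hi - lo) / 2;
                if (exists_tour(mid))
                    hi = mid;                  /* some tour fits within mid */
                else
                    lo = mid + 1;              /* no tour that short; raise the floor */
            }
            return lo;
        }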
  • by dave-fu ( 86011 ) on Friday November 16, 2001 @02:15PM (#2575490) Homepage Journal
    Not a flame, but more "modern" languages such as Java and C# have constructs explicitly built to avoid the buffer overflow/pointer gone insane problems.
    For the rest of the world, secure C programming [google.com] is far from a secret.
    • by The_egghead ( 17079 ) on Friday November 16, 2001 @02:22PM (#2575541)
      There's a key difference here though. Java requires a run-time stack to do all of its safety checking. This project aims to do all of its checking at compile time, so it's static, rather than dynamic. This is a _VERY_ hard problem, and is where virtually all of the programming languages research is centered today. However, your instinct that this is not a new concept is correct. Microsoft is actually doing very similar research in the form of a project called Vault [microsoft.com].
      • I heard those groans of disgust!

        Seriously, modern Pascal compilers like Delphi/Kylix are capable of some compile-time checking... Pascal already has strict var type checking, and all you have to do is make sure it's turned on when you compile.

        This also includes bounds checking for arrays. Pointers are handled better than most C compilers, too.

        The key difference here is that it sounds as if Cyclone checks the code for *intent* rather than just checking the types and such. That IS a hard problem. :-)
      • You are wrong.

        Java does not rely on a "run time stack" for its type checking, whatever that means. Java does plenty of checks at compile time (and load time, if you're using dynamic loading/linking).

        Java, like Cyclone, Vault and every other language you'd ever want to use (and many you wouldn't), relies on a combination of static and dynamic checks to ensure safety. Cyclone does move more checks over to the static side than Java does, so it might get higher performance. But no compiler, and certainly not Cyclone's, will be able to eliminate all dynamic checks (for array bounds and null pointers, for example). Vault moves even more over than Cyclone.

        There is a spectrum that describes the amount of dynamic checks that have to be performed for safe execution of a language. It looks a bit like this:
        Vault ... ML ... Cyclone ... Java ... Perl ... Scheme

        (C and C++ aren't on there because they don't have any concept of "safe execution" :-).)
      • Interesting, but the 'Vault' page hasn't been updated in almost a year. I guess Microsoft doesn't think that buffer over-runs are a serious problem.
      • This is a misconception - the only runtime phenomenon that Java requires is array bounds checking. Everything else, including all of its safety rules, is performed in a static verification pass. This is what allows Java to be "jit'd" to native code and run at full speed.

        However because Java has so much more structure by virtue of the intermediate byte code language a runtime profiler can dynamically optimize sections of code based on their behavior, not just their static characteristics.

        Ask yourself - what can I do at compile time that I can't do better with more information at runtime? The answer is nothing... The only trade off is in start up performance and it's just a matter of time before the OSs handle that better.

        Pat Niemeyer,
        Author of Learning Java, O'Reilly & Associates
      • Tagged unions
      • Parametric polymorphism
      • Pattern matching
      • Anonymous structs equivalent by structure
      • Parameterized typedefs


      (right on the web page detailing the language)
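      To see what the first item buys you, here's the hand-rolled C equivalent (my sketch): in C the tag discipline is purely on the honor system, while a checked tagged union won't let you read a variant without testing the tag.

      enum shape_tag { CIRCLE, RECT };

      struct shape {
          enum shape_tag tag;
          union {
              double radius;                 /* valid only when tag == CIRCLE */
              struct { double w, h; } rect;  /* valid only when tag == RECT */
          } u;
      };

      double area(const struct shape *s) {
          switch (s->tag) {                  /* nothing in C forces this check */
          case CIRCLE: return 3.14159265 * s->u.radius * s->u.radius;
          case RECT:   return s->u.rect.w * s->u.rect.h;
          }
          return 0.0;
      }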
    • Lclint (Score:5, Informative)

      by Ed Avis ( 5917 ) <ed@membled.com> on Friday November 16, 2001 @02:39PM (#2575642) Homepage

      A lot of the static checking made possible by Cyclone can be done for ordinary C with lclint [virginia.edu], which lets you add annotations to C source code to express things like 'this pointer may not be null', 'this is the only pointer to the object' and so on. You write these assertions as special comments, for example /*@notnull@*/. These are checked by lclint but (of course) ignored by a C compiler so you compile as normal. (If you weaken the checking done, lclint can also act as a traditional 'lint' program.)
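      For example, something like this (assumed typical usage; see the lclint docs for the full annotation set):

      #include <string.h>
      #include <stdlib.h>

      /* caller must pass a non-null string; caller owns the (possibly
         null) result, and lclint checks both claims at every call site */
      /*@only@*/ /*@null@*/ char *dup_string(/*@notnull@*/ const char *s) {
          char *copy = malloc(strlen(s) + 1);
          if (copy == NULL)
              return NULL;
          strcpy(copy, s);
          return copy;
      }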

      Also C++ provides a lot of the Cyclone features, not all of them, but it certainly has a stronger type system than C. I'd like to see something which combines all three: an lclint-type program that lets you annotate C++ code to provide the extra checks that Cyclone (and lclint) have over C++.

    • Here it is from the User's Manual [cornell.edu]

      There are other safe programming languages, including Java, ML, and Scheme. Cyclone is novel because its syntax, types, and semantics are based closely on C. This makes it easier to interface Cyclone with legacy C code, or port C programs to Cyclone. And writing a new program in Cyclone ``feels'' like programming in C: Cyclone tries to give programmers the same control over data representations, memory management, and performance that C has.
    • Java is great for applications, but you'd never want to start writing device drivers or a virtual memory system in Java. For that you need C, which is basically just a step up from assembly language. Still, people make mistakes, and this will help them.

      Of course, if you're still writing applications in C, you're just asking for it. Cyclone might help, but you probably have other issues anyway.
    • There is a whole host of languages more "modern" than C, Java, C++, C#, Pascal, Ada, Perl, or any other of the essentially von Neumann-style languages out there. I highly recommend that anyone out there who is interested in advanced type-safe languages take a look at SML, O'Caml, Haskell or Clean. Most of these languages have more or less formalized language semantics (as in mathematically precise). Formal descriptions and strong type systems allow the compiler to *prove* (again, in a mathematically precise sense) that a program cannot go wrong at run time.

      Benjamin
  • by mshomphe ( 106567 ) on Friday November 16, 2001 @02:18PM (#2575520) Homepage Journal
    buggy code to tell me when my code is buggy.
  • by Anonymous Coward on Friday November 16, 2001 @02:20PM (#2575532)
    C is *supposed* to be dangerous, damnit.
    • yep

      in C you can access memory directly,

      and so you have to, in order to write to hardware mmapped locations to control devices.

      how do you think low-level manipulation of devices is done in Java/Scheme?

      the point is they use very little of it, and it provides a generic functionality that's easy to audit and keep clean of bugs.

      so kludging stuff onto C to make things typesafe seems, to me at least, silly.
      but remember: Wickedness is a myth invented by good people to account for the curious attractiveness of others

      regards

      john jones
  • It seems to me that much of what this does could be easily implemented in a C library directly or with #define'd replacements of the C library functions in question. The type issues seem to be all that is unique here.
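    Something along these lines, say (my sketch; note the sizeof trick only works when dst is a real array, not a pointer):

    #include <stdio.h>
    #include <string.h>
    #include <stdlib.h>

    static char *checked_strcpy(char *dst, size_t dstsize, const char *src) {
        if (strlen(src) >= dstsize) {
            fprintf(stderr, "strcpy would overflow a %lu-byte buffer\n",
                    (unsigned long)dstsize);
            abort();                  /* halt safely instead of smashing memory */
        }
        return strcpy(dst, src);
    }

    #define STRCPY(dst, src) checked_strcpy((dst), sizeof(dst), (src))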
  • No No No (Score:5, Funny)

    by VFVTHUNTER ( 66253 ) on Friday November 16, 2001 @02:21PM (#2575535) Homepage
    We had C, then C++, then C#. So shouldn't the next logical step be C followed by three vertical lines and three horizontal lines (that'd be C-tic-tac-toe)?
  • by Tsar ( 536185 ) on Friday November 16, 2001 @02:22PM (#2575547) Homepage Journal
    I like the notion of building protection against common, insidious errors, but why did they have to create a new language to accomplish it? I didn't quite understand that point.

    And isn't a cyclone an infinite loop?
    "Our ultimate goal is to have something as humongous as the Linux operating system built in Cyclone," says Morrisett.
    You have to like a scientist who uses the word humongous.
    • I like the notion of building protection against common, insidious errors, but why did they have to create a new language to accomplish it? I didn't quite understand that point.
      The problem lies in the difficulty of reasoning about the semantics (and therefore the correctness) of the program being analyzed. Put simply, C is a disaster for semantic analysis. In newer languages whose design is informed by modern PL research, a goal is often to avoid the sorts of design pitfalls that make analysis difficult.

      Ever had an aggressive optimizer break code, such that you had to use a lower optimization setting? This can be a symptom of weakness in the compiler's ability to statically analyze the program. Not just a garden variety "bug", but rather the optimization is correct only for a subset of valid input source code! I.e. it can be difficult to impossible to prove that a given optimization is safe, aka "semantics preserving".
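      A worked example of that kind of breakage (the standard strict-aliasing illustration, not from any particular compiler bug report): an optimizer may assume an int* and a float* never alias, so it can cache or reorder the accesses below, and the program can behave differently at high optimization levels.

      #include <stdio.h>

      int broken(int *i, float *f) {
          *i = 1;
          *f = 0.0f;        /* if f aliases i, this clobbers *i... */
          return *i;        /* ...but the optimizer may still return 1 */
      }

      int main(void) {
          int x;
          printf("%d\n", broken(&x, (float *)&x));   /* undefined behavior */
          return 0;
      }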

      Many modern PL researcher/designers thus aim to give compiler writers a head start by ensuring that the language design permits increasingly powerful forms of static program analysis. Functional language work in particular has focused heavily on utilizing language and type system design to enable more powerful analysis support. (cf. the various published papers on the Haskell and OCaml languages as a starting point).

  • by kaisyain ( 15013 ) on Friday November 16, 2001 @02:26PM (#2575566)
    Someone created a language that enforces types and does bounds checking! It's news!
  • by Anonymous Brave Guy ( 457657 ) on Friday November 16, 2001 @02:29PM (#2575595)

    I'm a professional software developer, and all for anything that makes my code safer without unduly compromising it. But I can't help thinking that starting from C is probably a mistake.

    C is a fundamentally unsafe language. It has some easy fixes (remove the always-unsafe gets() function from the library, for example). It has some fundamental "flaws" (pointer arithmetic and the use of void*, for example). I quoted "flaws" because, while these features make the language necessarily unsafe, they are also very helpful in the low-level programming that got C to where it is today.

    The underlying problem here has never been with C, it's been with using C for the wrong jobs. Application code, and certainly high-level code where security is essential, just aren't C's strong suits. I can't see how even the geniuses we're talking about can start from such a broken language (in the context we're discussing) and successfully make a non-broken language out of it.

    I would expect a much better solution to be that followed by later C-like languages. C++ retains the low-level control, but other languages (Java, C#, etc) are available to those willing to sacrifice some of that control in exchange for added safety, and consequently may be better tools for different types of project. The biggest problem at the moment is that none of these "safer" languages has yet developed the same raw expressive power of C++. As they evolve, and catch up on the 20-odd year head start, hopefully we'll see programmers given a genuine choice between "safe but somewhat limited" and "somewhat safe but unlimited".

    • by Black Parrot ( 19622 ) on Friday November 16, 2001 @03:28PM (#2575872)
      > The biggest problem at the moment is that none of these "safer" languages has yet developed the same raw expressive power of C++.

      Take a look at Ada [adapower.com]. Extremely safe, extremely powerful, extremely unpopular. Go figure.

      It's object-oriented, it supports generic classes ("packages", in Ada terminology), it has built-in support for multitasking and distributed programming, it lets you (optionally) specify even such details as numeric representations for the ultimate in portability, and it has a set of first-class and well-documented bindings for GTK+ [act-europe.fr].

      There's a free compiler called GNAT, which is built on gcc and will actually be rolled into gcc 3.1 or thereabouts. There's also a Linux-specific [gnuada.org] site for gathering and distributing component packages.

      And pace ESR, it wasn't designed by a committee.
    • ``The underlying problem here has never been with C, it's been with using C for the wrong jobs. [...] The biggest problem at the moment is that none of these "safer" languages has yet developed the same raw expressive power of C++.''

      You seem to have assumed, for the purpose of the above exposition, that implementation languages are chosen by well-informed people, and substantially on the basis of technical merit. That's not always the case. Well, outside your shop in any case. ;-)

      In my opinion, acceptably safe languages that are quite expressive do already exist. I do not believe that the alleged deficiencies of safe languages explains the continued use of "unsafe" languages in domains for which the latter are not a good fit; I believe that, on the average, ill-conceived implementation strategies are more likely at fault. How many projects struggle with inadequate languages as a result of misinformed (or even uninformed) managers' inconsiderate (and uncontestable) decrees? Too many. :-(

      I am happy to learn that smart people are busy inventing the next great programming language, but I think that, collectively, we need to spend less time improving our tools and more time addressing the organizational deficiencies that result in our having to use the wrong tools when we know better.

    • I've been using Ada at a job for 6 months now. It is a "safe" language.

      I didn't like it at first. Now I find I'm liking it more and more. It does a lot that makes it really useful in "very very high" reliability programming.

      It has very strong type checking, à la Java. You can put your own range constraints on the types you create: Y is an int between -360 and 360.
      If you try to make Y bigger or smaller than that range, you get a constraint exception.

      You pay a little in performance for this, though I hear that if you did all that checking manually in another language it would be even slower.
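      For comparison, the manual C equivalent of a range-constrained type looks something like this (my sketch, not anything Ada generates): the check Ada gives you for free has to be written, and remembered, by hand.

      #include <stdio.h>
      #include <stdlib.h>

      typedef int degrees_t;   /* intended range: -360 .. 360 */

      degrees_t set_degrees(int value) {
          if (value < -360 || value > 360) {   /* Ada would raise Constraint_Error */
              fprintf(stderr, "constraint violated: %d\n", value);
              abort();
          }
          return value;
      }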

      It has some other nice features that other programming languages have in various forms, including enumeration types and record types (like a struct), and you can specify the arrangement of the record down to the bit level, i.e. which fields go where. It even has "packages", which are a bit like objects.

      Ada isn't as powerful as C, though, and it lacks a lot of the tools and libraries. It's also hard to find good books on it.

      One joke at work is that Ada is actually more powerful, because you can bind it to C code.

      We hear stories about other projects having problems with C and bigger problems with C++. Ada, although slow to program in, does nicely for systems that require very high reliability.

      There is a GNAT compiler which is free and open source too... Try it.
  • by Anonymous Coward
    I have been beta testing the Cyclone development environment for some time now. For mature Cyclone development, the amount of code output generated is equal to that being dissipated due to bug tracking. The dissipation rate per unit area is code density times the lag coefficient times the CPU speed cubed (see Emanuel 1999 for details). One could either integrate a typical code profile over a range of radii from the project's center to the outer radius encompassing the core, or assume an average CPU speed for the inner core of the system. Doing the latter and using 40 m/s (90 mph) coding on a scale of radius 60 km (40 n.mi.), one gets a code dissipation rate (bug generation rate) of 1.5 x 10^12 Watts. This is equivalent to about half the world-wide script generating capacity - also an amazing amount of bugs being produced!

    Either method is an enormous amount of overhead being generated by Cyclone. However, one can see that the amount of lines of code released in a release (by creating overflows) that actually goes to maintaining the Cyclone System spiraling bugs is a huge ratio of 400 to 1.

    Stick with C++ I think.
  • by DaoudaW ( 533025 )
    The Cyclone compiler will rewrite the code or suggest fixes to avoid potential bugs

    I don't mind suggestions, but I'm not sure I like the idea of having my code rewritten.

    Couldn't the same error-checking be incorporated into a pre-processor rather than developing an entirely new compiler/language?
    • I don't mind suggestions, but I'm not sure I like the idea of having my code rewritten.

      In the early 90's, we were using one of the C compilers of the time (don't remember which, sorry; we quickly dumped it when Borland came out). One of the error messages was "Need semicolon here" with a ^ to show where. My reaction, every time, was "Shit howdy, if you know that, put it in, and make it a warning!"

    • Too Bad (Score:2, Insightful)

      by gaudior ( 113467 )
      If you are using ANY modern compiler, targeted for a modern CPU, your code is getting re-written without you knowing about it. It's getting re-arranged for pipeline efficiency, loops are getting unrolled, common sub-expressions are getting stripped. The notion held by some C programmers that they are smarter than the compiler is quite silly.

      I am not sure of the usefulness of this particular language/compiler/etc, but I like the direction they are going. DWIM (Do What I Mean) programming is becoming more and more possible with this kind of language research. We want programmers to solve problems in the macro world, not be bothered with the minutiae of the language they are using. This has been one of the appeals of Perl over the years.

    • If you're using an optimizing compiler, then your code is being rewritten. Unrolling loops, storing of computed values, register assignment etc.
  • by Pemdas ( 33265 ) on Friday November 16, 2001 @02:34PM (#2575622) Journal
    The Cyclone compiler will rewrite the code or suggest fixes to avoid potential bugs. Even if a bug still occurs, the compiled system will lead the program to halt safely, not crash.

    Am I the only one to whom this sounds like potentially a really bad idea? I mean, think about it, coding along one day:

    #include <stdio.h>

    int main() {
    printf("He

    At this point, small, cute cartoon versions of Kernighan and Ritchie pop onto the screen and say "It looks like you're writing a Hello World program! Click here to check this program for bugs automatically..."

    I'm just shuddering at the thought...

  • It can be done in C, if necessary:

    if (!infile) { perror("input file"); exit(1); }

    The advantage of C is that you're allowed to leave the check out when you judge it unnecessary in that case.
  • I'm sorry, Dave, I can't compile that.

    I know it's a cliché, but really, do we expect it to be as smart as another competent programmer reviewing code?


    • Sure.
      Lots of us have been programming in statically-typed, safe languages for a long time. We do it not because we're poor, weak-minded programmers but because we don't have time to spend tracking down aliasing bugs and memory leaks. Though the compilers are not as "smart" (in a very strong sense) as people, they are much much more patient, and are actually very good at finding or preventing exactly these kinds of boring bugs.
      Most of these languages are very abstract (e.g., SML). Cyclone is actually a project to bring some of these ideals to the systems world, where concerns over data layout and memory usage are more pronounced. They've added a few useful features to C, too (polymorphism! datatypes! pattern matching!)... so I think this is a good thing, even for hardcore C cowboys.
    • Maybe not as smart, but more constant. When C compilers started warning about things like

      if(a=b){

      }

      then we found code which had been reviewed by many different eyes without anyone seeing these mistakes. We're just not good at repetitive tasks, especially if we think we know what should be there. It's the same reason we run a spell checker to catch the same word twice in a row; we're just not good at looking for that sort of mistake.

  • by jd ( 1658 )
    ...is this any better than the Semantic Validating Compiler that Stanford University developed?


    Other than that "new" and "improved" sell products better than "useful" does, probably not.

  • New language? (Score:5, Interesting)

    by LinuxDeckard ( 457253 ) <linuxdeckard.netscape@net> on Friday November 16, 2001 @02:40PM (#2575649) Homepage
    I always let out a bit of a grumble when a new programming language comes out; they seldom add anything truly new to programming. When I read that Cyclone was strikingly similar to C, I was intrigued enough to skim through the docs [http].

    Put bluntly, Cyclone seems to be little more than C for lazy programmers. Fat pointers for those who can't follow the logic of pointer arithmetic, and *`H for those intimidated by malloc(), are not a beneficial service.
    • Yeah, and safety switches on firearms and nuclear weapons are just for "lazy" gunmen and silo missile controllers, traffic lights and airbags for "lazy" motorists?

      There is a ring in the Inferno dedicated to people like you.
  • Seems to me PC-LINT gives you the same contextual checking... but I could be mistaken.
  • by Animats ( 122034 ) on Friday November 16, 2001 @02:45PM (#2575681) Homepage
    Cyclone is a long way from C. It requires garbage collection, has exceptions, and quite a bit of new syntax. Bell Labs has generated quite a few C derivatives. C++ is the only one to catch on, but Cyclone is at least the fifth C derivative to come from there. There was also C+@ (a Smalltalk-like variant) and two others previously discussed on Slashdot.

    I'd like to see Cyclone's kind of safety, but if you're going to require garbage collection and forbid pointer arithmetic, you may as well use Java.

    I've proposed "Strict Mode" for C++ [animats.com], a compatible retrofit to C++ that uses reference counts like Perl, but with some optimizations to get the overhead down.

    A basic decision is whether to have garbage collection. If you have garbage collection, C++ destructors don't fit well. (Java finalizers, called late, during garbage collection, can't be used for things like closing files and windows. Microsoft's C# has destructors, but the semantics are confusing and ugly, and we don't have much mileage yet on how well that will work.)

    Reference counts work reasonably well. There's a problem with not releasing circular structures, but that doesn't keep Perl from being useful. Perl now has "weak" pointers (they won't keep something around, and turn to null when their target goes away), and if you use weak pointers for back pointers, most of the circularity problem goes away. True rings of peer objects are rare, and they're the main case where weak pointers won't solve the problem.
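    In C, the hand-rolled version of that scheme looks roughly like this (my sketch, in the spirit of the proposal above, not its actual code): counted pointers keep objects alive, and "weak" back pointers simply aren't counted, which is what breaks the parent/child cycles.

    #include <stdlib.h>

    typedef struct node {
        int refs;            /* number of counted references to this node */
        struct node *next;   /* owning (counted) pointer */
        struct node *back;   /* weak back pointer: not counted, never freed through */
    } node;

    node *node_retain(node *n) { if (n) n->refs++; return n; }

    void node_release(node *n) {
        if (n && --n->refs == 0) {
            node_release(n->next);   /* drop our counted reference */
            free(n);                 /* any back pointers into us must be weak */
        }
    }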

    If you don't have garbage collection or reference counts, programs obsess on who owns what. A basic problem of C and C++ is that it's essential to track who owns which objects and when they're supposed to be released, yet the language offers no help whatsoever in doing so. This is the fundamental cause of most crashes in C and C++ programs. Almost every core dump, "bus error", or "general protection fault" comes from that problem. So it's worth fixing.

    It's the right time to address this. We're in a period of consolidation, now that the dot-com boom has collapsed. Our task as programmers over the next few years is to make all the stuff that sort of works now work 100%.

    • One day a student came to Moon and said: "I understand how to make a better garbage collector. We must keep a reference count of the pointers to each cons."

      Moon patiently told the student the following story:
      "One day a student came to Moon and said: `I understand how to make a better garbage collector...

      -- Jargon File
      • Reference counting is only one way to implement a garbage collector, and is seen as one of the simplest, but not the best.

        A better way is reference tracing: you trace all the objects which are currently in scope, and follow all their references. Anything which is never reached is obviously collectable.
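        In miniature, the tracing idea looks like this (a toy sketch, not a real collector): mark everything reachable from the roots, then sweep whatever stayed unmarked.

        #include <stdlib.h>

        typedef struct obj {
            int marked;
            struct obj *refs[2];   /* outgoing references (fixed fan-out for brevity) */
            struct obj *all_next;  /* chain of every allocated object, for the sweep */
        } obj;

        static void mark(obj *o) {
            if (o == NULL || o->marked) return;
            o->marked = 1;         /* visit each object once */
            mark(o->refs[0]);      /* follow all its references */
            mark(o->refs[1]);
        }

        static obj *sweep(obj *all) {   /* call after marking every root */
            obj **p = &all;
            while (*p) {
                if ((*p)->marked) { (*p)->marked = 0; p = &(*p)->all_next; }
                else { obj *dead = *p; *p = dead->all_next; free(dead); }
            }
            return all;
        }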

    • by elflord ( 9269 )
      Reference counts work reasonably well. There's a problem with not releasing circular structures, but that doesn't keep Perl from being useful.

      It doesn't prevent Perl from being useful, but no language which uses reference counts is ever going to replace C or C++. The problem with reference counts is that sometimes they cause more problems than they solve. A good example is in GUI programs, where a lot of objects might be mutually aware of each other. That's not to say that reference counts are not useful. Rather, forcing programmers to use reference counting to manage memory whether appropriate or not is problematic.

      If you don't have garbage collection or reference counts, programs obsess on who owns what. A basic problem of C and C++ is that it's essential to track who owns which objects and when they're supposed to be released, yet the language offers no help whatsoever in doing so.

      C++ gives the programmer the flexibility to choose a memory management strategy that suits the problem at hand. Sometimes pool allocation works. Sometimes reference counting works. Sometimes, parent/child management works. It's very simple to implement reference counted classes in C++. It's certainly not necessary to exclusively use an "exclusive ownership" model in C++.

      Almost every core dump, "bus error", or "general protection fault" comes from that problem.

      They come down to a lot of problems -- library incompatibilities, bounds errors, and other things can cause these problems. I think it's naive to assume that using reference counting for everything will just make the problem "go away". Writing reference counted code without memory leaks gets quite difficult when the data structures are more complex.

      The URL you have is interesting, and I think for some types of problems, using an object system where you just reference count everything is probably a good idea. But I question its value as a cure-all.

    • A basic problem of C and C++ is that it's essential to track who owns which objects and when they're supposed to be released, yet the language offers no help whatsoever in doing so.

      C++ provides plenty of support for resource management issues. The standard library includes vector, string, auto_ptr and many other related tools, all of which assist with guaranteeing memory is released properly. The fact that ill-trained C++ programmers continue to use raw arrays and pointers, when they should almost never be used beyond low-level code, is not C++'s fault.

  • Error 0 (Score:2, Funny)

    by VA Software ( 533136 )
    Compiling...
    test.c
    C:\stuff\test.c(3) : 'int main(void) {' : Error 0. Program is in C. This section of code could cause problems.
  • by Embedded Geek ( 532893 ) on Friday November 16, 2001 @02:49PM (#2575701) Homepage
    In my shop, we do everything on a shoestring, kludging together tons of C legacy code from multiple generations of our products. We take an application that ran on a homebrewed executive and stick it on an RTOS, spoofing it so it doesn't know the difference. We grab code written on an 8 bit microcontroller and port it to our 32 bit x86 with minimal testing. Given all this, my first thought at reading the article was to raise three cheers. The idea of making a system already written a lot safer... I can hardly find the words.

    Then I got chewing on it and realized something: when I came on board and suggested running lint on our code, I was shot down by both the rank & file and by management (who each blamed the other). When I suggested a concerted effort to rewrite our code to eliminate or justify (in comments) every warning our compiler spewed on a build, I got a similar reaction.

    Don't get me wrong. I think cyclone still sounds great, especially the pattern matching and polymorphism indicated on its home site [cornell.edu]. If it can gain some momentum, it stands to have a real place (niche?) in dealing with legacy systems. For my shop, though, I fear much of the value would be wasted. Until we change our motto from "There's never time to do it right, but always time to do it over" we're going to continue repeating our mistakes.

  • lint is the name of it. And it was made 20 years ago.
    p.
  • This sounds even more annoying than lint. :-)
  • PC-lint, from Gimpel Software [gimpel.com]. (I'm not associated with this at all, but I read about it in Game Developer once, and it's really interesting.)

    I generally don't like internal type-checking within a language, because it results in slowness, and some loss of power. (Sometimes there are times you want to do things that you normally shouldn't be doing, in order to speed up routines.) A language which prevents "bad programming practice" ends up screwing itself over. However, having an external source-code checking utility that tests for bad programming, while still allowing complete power, would be much more useful, to me at least....

    • I think you must have had bad experiences with safe languages (Java?). Static checking doesn't result in slowness (in fact, it can make compiled code faster in many cases, for instance by enabling alias analysis).

      Static typing and safety also allow for *more* power than a "do anything you like" language. One kind of power I get when I write in a language like this is the ability to enforce invariants without runtime checks. So if I am writing a program with several other people (or by myself across several evenings, except I am drunk some of those evenings), I can arrange my code such that bugs in one part of the program can NEVER affect other parts of the program. Thus, it is easier to figure out who to blame and where the bug is. This is impossible in a language like C, where any code can write over another module's memory, free its data structures more than once, or cast, etc.

      Speeding up routines with hacks is pretty overrated; there are very few places where this is necessary, and even fewer where it is desirable. In those cases, we can always fall back to C or assembly.
  • Cannot cast what I want? Oh, I feel cast-rated!!
  • Microsoft Word's grammar check has suggested to me in the past that "do it for the greater good" should probably be "do it for the greater well ".

    It's sometimes helpful in helping me catch my grammar mistakes. But more often than not, it's a PITA, and the act of wading through its incorrect suggestions is more work than I think it's worth. And that's when it's SO easy to figure out if the suggestion is right or wrong... the sentence is on the screen, standing alone, and I can instantly decide if it's right or not.

    Now, imagine wading through a bunch of suggestions and warnings on your code. Imagine having to figure out the context for the flagged code segments, and having to review the code and all code which references it to see if it's correct or not.

    Sure, if you've got free time or resources to throw at it, using computer heuristics to attempt to help out humans is nice. But you have to realize that at this stage in the game, it often takes a lot of work to vet those results in order to glean any gain.

    • Of course, there is a huge difference between a language that can be described by mathematical logic (well, almost :-), i.e. Cyclone, which is supposedly designed for intent checking, and natural language, which isn't even consistent, much less mathematically consistent and describable. The restricted domain and expression structure of Cyclone may enable it to do a much better job than any rules-based, context-ignoring English grammar checker ever could.

  • Get safety from the VM like Java does; that way you don't have to rewrite all your code. Even Java still has null pointer exceptions at runtime, and it is regarded as very safe.

    i'd say more but i cut my right hand today and typing sucks.
  • Some time during the roaring 80's, Bill Joy made the following two predictions at a Sun Users Group talk:

    a) Computers would increase in speed, to the tune of 2^(year-1984) MIPS. [That would put us at 131,072 MIPS today, and 262,144 MIPS in a few months.]

    b) He predicted the rise of a safe system programming language he called C+++=-- (pronounced "see plus plus, plus equals, minus minus"), which is a safe subset of a C++ superset.

    Java hadn't been invented yet, but Gosling (who was busy inventing NeWS at the time) wrote Oak aka Java several years later, and it fit the description to a tee, but just had a different name or two.

    [I'll never forgive Bill Joy for writing VI and CSH. Ewwww icky yucko!]

    -Don

  • by jdfekete ( 316697 ) on Friday November 16, 2001 @03:34PM (#2575919) Homepage
    Hi,

    In 1996, the Ariane 5 launcher exploded a few seconds after leaving the ground. The faulty program, written in type-safe Ada, was later submitted to a static program analyzer developed by Alain Deutsch at INRIA in France. The analyzer spotted the error right away!
    It was a number going out of range after too many iterations and wrapping back to 0.
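    The class of bug, sketched in C (my illustration; the widely reported cause on Ariane 501 was an unprotected 64-bit float to 16-bit integer conversion): a range analyzer proves the value can exceed what the target type holds and flags the conversion.

    short horizontal_bias(double velocity) {
        return (short)velocity;   /* silently wrong once velocity > 32767 */
    }

    short horizontal_bias_checked(double velocity) {
        if (velocity > 32767.0) return 32767;     /* saturate instead of wrapping */
        if (velocity < -32768.0) return -32768;
        return (short)velocity;
    }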

    The verification technique used was based on abstract interpretation.
    This is just to say that even a strongly type-checked language can fail, and that type checks, whether static or dynamic, are not the only way to catch bugs.

    Alain Deutsch has started a company called Polyspace that sells static verifiers for Ada and C (See www.polyspace.com). The idea is not to rewrite C or Ada but to spot potential bugs inside programs.
    I have no special interest in this company (I know Alain Deutsch), but my point is that improving C does not imply removing the type-unsafe constructs.
    • > It was a number going out of range after too many iterations and wrapping back to 0.

      Which is impossible in Ada - wrap-around semantics only happen if you specifically ask for them. The actual bug, as I've heard it told, was that the code wigged out when the physical environment became impossible for the Ariane 4 (since the code was written and designed solely for the Ariane 4). Nothing could have found this bug without taking into consideration the differing environments of the Ariane 4 and 5, and that alone would have prevented the bug.

      • > The actual bug, as I've heard it told, was that the code wigged out when the physical environment became impossible for the Ariane 4

        That's correct. Something like this happened:

        A: Let's build a new rocket!
        B: Okay!
        A: Let's reuse this "smart part" from the old one!
        B: Okay!
        A: Let's not review the smart part's code, or even test it on a simulator, since it worked flawlessly on the Ariane IV!
        B: Okay!
        AV: Crash!
        A&B: Ooops!

        Hopefully everyone can spot where the plan went awry.

        Here's a short from-memory explanation of what happened; you can find the official report on the Web pretty easily with a search engine:

        The part in question looked at acceleration/velocity/displacement (I forget which), made some decisions about them, and put some appropriate commands on the control bus. Alas, the Ariane V was so much more powerful than the IV that the acceleration/velocity/displacement soon ran up to a number that was physically impossible for the Ariane IV, so the module concluded (correctly, according to its original design) that it was getting garbage in, so it started dumping debug info on the control bus. The engines tried to interpret the debug info as control commands, with predictable results.

        In lots of programs you could branch to some failsafe mode rather than dumping debug info on the bus, though it's not clear what the "failsafe mode" is for a rocket during launch. (If there were such a mode, you would just use that for your control logic to begin with!)

        There's not a language, compiler, static analyzer, or theorem prover on the planet that can catch this kind of problem, though the engineers should have "caught it" during the earliest phases of the design by specifying appropriate reviews/tests for the software and software specs on the reused part.
  • What about function pointers? What "region" do they live in? Say I create a struct with a bunch of function pointers (dur, to emulate OO), and the struct goes out of scope, what about the functions? I guess my question is, are all functions in global scope?
  • A mascot. It needs a little animated tornado, maybe named Cyclonius, to pop up and interact with the user.

    "You appear to be coding with Visual Studio. Please stop!"
  • Here it is, 2001, and we're still typing text in flat ASCII files, remembering all of the arcane syntax and rules of the compiler, then submitting our attempts to it, awaiting its response. Things haven't changed at all in 30+ years; what makes this different?

    We need to apply some of the innovations that have been built for everyone else, such as text with attributes, letting the compiler keep track of certain details, etc. Why do I have to track down every instance of a variable if I decide to change its name? Why can't I simply change the value in the symbol table, and have the compiler spit it out with that new name when it saves it?

    Why not integrate the compiler, editor, and runtime into an efficient kernel of an environment, similar to FORTH, but with the added benefits of type checking?

    It's been a long time, yet nothing has changed... what a waste.

    --Mike--

  • The researchers say C programmers can often create code that will result in a serious bug when the application is fully implemented.

    Clearly what's needed is a new version of English that doesn't permit grammatical errors.

  • by retrosteve ( 77918 ) on Friday November 16, 2001 @05:03PM (#2576397) Homepage Journal
    Back in the days when "speed" meant catching a train with a full head of steam, railroad repair engineers were a brave bunch, and many sported stumps of arms and legs as mute witness to their bravery in repairing moving trains.

    One day, a city slicker with a spotless seersucker suit and a perfectly pointy moustache was reported travelling from station to station, selling his new technology suite. It included remote manipulators for making repairs from a higher level, without having to go under the trains. It also came equipped with "parking brakes" for trains, to prevent them accidentally moving while they were under repair.

    This new "high level" technology was a hit in many towns, where the young repair technicians were unenthusiastic about life with missing limbs. In addition, the new technology came with many interlocking "safeguard" mechanisms to make sure that no fittings were left unsecured when the repair was completed. This saved many a "crash".

    But there remained many towns with older engineers, who had grown up doing things the "fast" way, repairing the trains on the fly (because things went faster that way!), and of course having the scars and stumps to show for it. They were also unenthusiastic about the "safeguards", declaring that they were "smarter than any newfangled machine", and could remember to close the latches and fittings themselves.

    In one of these Ancient Telegraph Towns, one of the older engineers, Cyclone Bob, came up with his answer to the newfangled "high-level machines" -- special steel braces to wear over arms and legs while repairing the moving trains. "In most every case, these braces will protect your precious limbs from the hazards of moving wheels!", enthused Cyclone Bob.

    The older engineers, who, when all was said and done, actually enjoyed mucking about under trains, and who had already paid their dues in missing limbs, were rather proud of the new braces, and wore them proudly. "My trains hardly ever crash now", they would say, "and now I don't always have to lose a leg to prove it!".

    The younger, smarter engineers continued using their "high-level" machines, and were happy that they still had arms so they could snigger up their sleeves.

  • Cyclone is a remarkable achievement, given they started with C...

    MISRA-C is also a good effort, although somewhat built on sand.

    The safety-critical community over here in Europe, and also a few projects in the USA, use SPARK though, which is a high-integrity, annotated subset of Ada. Its static analysis tool is really remarkable - anyone for static proof of exception freedom? (e.g. static proof of no buffer overflow for all input data)

    Eiffel is also very good from a high-integrity point of view, and well worth a look. It amazes me how much effort goes into researching static analysis of languages that are simply not designed for that purpose at all... ah well...

    - Rod Chapman

  • First off, good programming practices will resolve 99% of these problems. They aren't unavoidable; they're just the result of being careless. Of the few that any good programmer will let slip through once in a while, most could probably be caught with an advanced lint-like tool that checks for things in the source code, or for that matter just a little bit of peer code review. I can't see much in the way of difficult-to-avoid problems that require runtime support to adequately detect in plain old C.

    In any case, a programmer's failure to be able to adequately program in C is no excuse for moving to a whole new language, compilers, runtime, libraries, standards, etc. The cost associated with migrating to the new language is excessive. It's like buying $10,000 gold-plated titanium training wheels for your sportsbike to solve your initial problem of being unable to ride the thing without falling over.
  • I wonder if this Cyclone makes programming "safer" by making it more difficult. What I mean by this is that some languages out there don't let you use pointers at all, or perform all sorts of checks on array bounds before each access. I like to call this "broken programming" simply because it isn't right in my opinion.

    A programmer should have all tools available to him, and should choose the best tool for the job when solving any given problem. Taking away tools doesn't make programming safer--it makes programming messier.

    I didn't read the article or the language description or anything, so I don't know if this is the case with Cyclone. But it certainly is with many languages. I thought this is what Lint is for. Lint is a program which performs source-level sanity checks on your code. You write your program in C and/or C++, and whenever you compile, you first run Lint to make sure everything's ok. Sure, it's not perfect, and probably won't find all problems, but it will find quite a few things wrong that you didn't even know about. (There are free and commercial implementations of various source-level things like this.)

    I think that careful programming and use of a tool like Lint can make a better improvement than taking away some of the most powerful tools in programming just because some people don't know how to use them. Oh well.

  • by mj6798 ( 514047 ) on Saturday November 17, 2001 @05:01AM (#2577891)
    I pulled it down and installed it. Running their own benchmarks, it seems 5-10x slower than C on most of them. Also, looking more at the documentation, this is not merely a "safe version of C"; it's a pretty complex language with C-like syntax but many ML-style features.

    Cyclone could be a winner if it gave you C-like performance with safety and minimal changes to your programs. But it doesn't match C performance as it is and I don't think large, existing C programs will port to it easily, despite superficial similarities.

    The way it is, I think you are better off using O'CAML [ocaml.org] or MLton [clairv.com]. They are probably easier to learn and give you better performance. O'CAML, in particular, has already been used for a number of UNIX/Linux utilities. And Java is probably as C-like as Cyclone and runs faster (although programs have a bigger footprint).
