
The End of Native Code? 1173

psycln asks: "An average PC nowadays holds enough power to run complex software programmed in an interpreted language which is handled by runtime virtual machines, or just-in-time compiled. Particular to Windows programmers, the announcement of MS-Windows Vista's system requirements means that future Windows boxes will laugh at the memory/processor requirements of current interpreted/JIT compiled languages (e.g. .NET, Java , Python, and others). Regardless of the negligible performance hit compared to native code, major software houses, as well as a lot of open-source developers, prefer native code for major projects even though interpreted languages are easier to port cross-platform, often have a shorter development time, and are just as powerful as languages that generate native code. What does the Slashdot community think of the current state of interpreted/JIT compiled languages? Is it time to jump in the boat of interpreted/JIT compiled languages? Do programmers feel that they are losing - an arguably needed low-level - control when they do interpreted languages? What would we be losing besides more gray hair?"
This discussion has been archived. No new comments can be posted.


  • Re:What?!?!? (Score:3, Interesting)

    by tomhudson ( 43916 ) <barbara.hudson@[ ... m ['bar' in gap]> on Monday June 12, 2006 @08:29PM (#15520780) Journal

    It's all bs.

    15 years ago I benchmarked assembler vs. C for graphics code - C was 200x slower. There is NO way that any interpreted runtime will even begin to approach the "bare metal", never mind C.

    Most of the benchmarks crowing about the speed of JIT compilers ignore the startup and initialization time, as well as the teardown time at the end.

    I couldn't believe some of the naive assumptions in one published benchmark - they had the Java code print out its own start and end times and said "see, only 4x slower than C"; naive is being polite. Proper benchmarking would mean putting a wrapper around both programs to capture the full start-to-end wall-clock time.

  • As A Developer (Score:3, Interesting)

    by miyako ( 632510 ) <(miyako) (at) (gmail.com)> on Monday June 12, 2006 @08:41PM (#15520845) Homepage Journal
    Of the development I do, about 60% is in non-native code (mostly java) and about 40% is in native code (usually C++). What I have found is this:
    Java is the language I use the most, and it's good for small programs. It's definitely noticeably slower for large applications, but I don't think that's the big reason a lot of developers don't like it. Swing is nice, but the problem with Java and a lot of other "modern" languages is that they try so hard to protect developers from themselves and to enforce a certain development paradigm that the same features that make them really nice for writing small programs end up standing in your way for large and complex application development. Looking at the other side of the issue, C++ is fast, it can be fairly portable if it's written correctly, and it has a huge number of libraries available. C++ will let you shoot yourself in the foot, but that's because it's willing to stand out of the way and say "oh, you really want to do that? OK...". This makes it easy to write bad/buggy programs if you don't know what you're doing, but if you pay attention, have some experience, and have a plan for writing the software, then C++ can be less stressful to develop in.
    Aside from a reasoned argument, I think a lot of developers are just attached to C/C++. I know that I just enjoy coding in C++ more than in Java. Not that Java is bad - it can be fun to code in at times - but the lower-level languages give me more of a feeling of actually creating something on the computer, as opposed to on some runtime environment.
    Finally, one major reason to stick with C++ is that many interpreted languages aren't really as portable as they pretend to be. A language like C++ that really is only mostly portable, and then only if you keep portability in mind, can sometimes be more portable than languages that claim to be perfectly portable and then make you spend weeks trying to debug the program because things are fouling up.
  • by billstewart ( 78916 ) on Monday June 12, 2006 @09:21PM (#15521031) Journal
    LISP was a simple, elegant language that demonstrated that almost any language written after 1961 was unnecessary, except for demonstrations of concepts like Object-Oriented programming that could then be re-implemented into LISP, and that any code written in older languages could be replaced with something better :-)

    BASIC had its problems, warping a generation of programmers (including me), but it was small and light and didn't take long to learn - unless you wanted to find enough tricks to get real work done.

    FORTH was smaller, lighter, and faster. It was overly self-important, considering its reinvention of the subroutine something new and radical, but if you wanted to program toasters or telescopes it was the language to use. PostScript was somewhat of a Forth derivative.

    P-Code was a nice portable little VM you could implement other things on.

    And then there was Java, which grew out of Gosling's experiences with NeWS, a Postscript-based windowing system. If you wonder why you're not using Netscape and maybe not using Java, and why you've probably got Windows underneath your Mozilla, it's because it became obvious to lots of people that Netscape+Java was a sufficiently powerful and easily ported environment that the operating system underneath could become nearly irrelevant - so Microsoft had to go build a non-standards-compliant browser and wonky Java implementation and start working on .NET to kill off the threat. It wasn't that conquering the market for free browsers was a big moneymaker - it was self-defense to make sure that free browsers didn't conquer the OS market, allowing Windows+Intel to be replaced by Linux/BSD/QNX/MacOS/OS9/SunOS/etc.

  • by Anonymous Coward on Monday June 12, 2006 @09:48PM (#15521130)
    If I had the opportunity to change the language, I would make more obvious changes, like remove silent letters.

    Well, what do you think we've been doing over here? Just that.

    We've taken out useless letters, such as colour -> color and catalogue -> catalog. We've simplified superfluous pronunciations, as in aluminium -> aluminum. And we've made the number system more concise and practical, for example thousand million -> billion.

    There's much more work to be done; for example, every word that contains an "ough" needs to be reworked. But at least one country has taken up the initiative and made the first few steps towards a more rational language.

  • Re:What?!?!? (Score:5, Interesting)

    by tomhudson ( 43916 ) <barbara.hudson@[ ... m ['bar' in gap]> on Monday June 12, 2006 @09:56PM (#15521163) Journal

    If you want to see a beautiful programming language, how about one that allows one to express code as data?
    In assembler, it's all code / it's all data. The difference is only a JMP away.

    One of the neat things was the 4K graphics demo contests - try to write the most impressive graphics demo in only 4K of assembler. There was a LOT of code writing code in memory, code using other code that had already run as raw data for designing the next iteration, then using it again as code ... a 4K program that could take you through a 3-dimensional roller coaster ride for 20 minutes, never repeating, all done in real time, on hardware you wouldn't deign to pick out of the scrap heap.

  • by Dcnjoe60 ( 682885 ) on Monday June 12, 2006 @10:29PM (#15521286)
    If we all quit using native languages, then what are we going to use to a) write embedded code, b) write drivers, c) write operating systems, and d) write the interpreters for the languages that replace our native ones?
  • Re:What?!?!? (Score:2, Interesting)

    by Waffle Iron ( 339739 ) on Monday June 12, 2006 @10:45PM (#15521368)
    In most cases, unless you're using specialized vector processing units, there's probably not a massive difference in performance between C code written by someone who really knows what he's doing and assembly code. (A good C coder will have a good understanding of how each line of code will get translated into machine code, and will avoid creating performance problems.) I've been coding in both for more than 20 years now, and whenever I disassemble what a modern optimizing compiler produces, it's almost always very good code.

    The main thing is that the optimizing compiler can tirelessly work on millions of lines of code, doing a good job of fully utilizing the hardware at each point, whereas the human assembly language coder will peter out juggling registers after a few hundred lines of code. Past that point, the human coder will start creating manual layers of abstraction (subroutine calls, assembler macros, etc.) that just don't get optimized at all.

    After you've written your graphics masterpiece and gotten it fully debugged, then you can profile it. If you can then identify some inner loops that you can't fix up by tweaking the C code, then maybe you should write a few dozen lines of inline assembler code. Anything beyond that is likely to be an unportable, counterproductive waste of effort.

  • Re:Its inevitable (Score:5, Interesting)

    by Mornelithe ( 83633 ) on Monday June 12, 2006 @11:01PM (#15521444)
    Well, that sort of compiler is the holy grail of pure functional languages. The idea goes something like this:

    Forget your C/C++/Java/whatever. Side effects and multiple assignment are bad. Program in a pure functional language, such that all functions are referentially transparent---that is, f(x1,x2,...) always returns the same value given the same x1, x2, ..., and has no side effects (print statements, assignment to mutable state, etc.).

    Now, since most of your code is made up of referentially transparent functions, the compiler can automatically split independent pieces of code up, and perform them in parallel without fear that a call to b(x) somehow affects the results of c(y).

    When you absolutely need side effects (for IO, for example), you use something (uniqueness types, monads; I'm guessing) that explicitly orders the code and in this case, would presumably prevent the compiler from parallelizing it.

    Compilers aren't there yet. The things I'm (vaguely) familiar with require specific annotation of potentially parallel paths. Try Occam, for instance. Another example I've read only slightly more about is parallel Haskell, which includes similar annotation primitives (par and seq). However, just because you annotate something as parallel doesn't mean it will be performed in parallel; the compiler/runtime (I'm not sure which) decides what to run in parallel from among the massive potential parallelism in such a program.

    If you're asking how it's possible in Java: it isn't. But then, Java already sucks when it comes to concurrency compared to systems designed for it like, say, Erlang (which, incidentally, is VM interpreted, but still blows the pants off most conventional C/whatever programs within its application domain (massively concurrent/fault-tolerant systems), lending some credence to the point of this article, not that the same things necessarily couldn't be done with native code).
  • Horses for courses (Score:3, Interesting)

    by Nefarious Wheel ( 628136 ) on Monday June 12, 2006 @11:43PM (#15521602) Journal
    It's still horses for courses, mate. Look at the niche markets -- embedded systems for example -- and you'll find opportunities to shave a few cents by using a smaller configuration that would profit from having tighter code.

    Thinking back a few years, IIRC the first Apple Mac had the QuickDraw graphics package written in machine language, didn't it? Not assembler, but instructions made of hand-mapped binary digits. It's the reason those early Mac GUIs were able to extract such amazing graphic performance out of the Motorola 68000.

    You can still buy Zilog Z8's, and embedded applications still exist for them.

  • by Anonymous Coward on Monday June 12, 2006 @11:59PM (#15521676)
    "BTW, for anyone who's interested, it's not "British English", it's just "English". The Scotts, Welsh and Irish each have their own language - some even use them."

    It's British English to differentiate it from US English, not from Welsh/Scots/Irish English.

    Besides, the Scots and Welsh are British. Britain is the old Roman Province of Britannia, a corruption of the pre-Roman name for the island, 'Prydwen', and includes England, Scotland and Wales.

    While we're here, it might be useful to explain all the terms involved:

    'The British Isles' includes Ireland as well as Britain.

    'Great Britain' is in contrast to 'Brittany', now part of France, which became a Romano-British haven following the withdrawal of the Roman Empire from Britain in the 5th century, when large numbers of Romano-Brits moved there to remain within the official bounds of the Empire.

    The 'United Kingdom' is the combination of the kingdom of England and Wales with the Kingdom of Scotland, and includes Northern Ireland. Northern Ireland, therefore, is British politically (because it is part of the UK), but not geographically (because Ireland was never part of the Roman Province of Britannia).

    England, like France, is a post-Roman nation, named for the people who invaded/migrated there causing/following the collapse of the Roman Empire. In England's case it was the Angles, and in France the Franks, who dominated those invasion/migrations when the naming happened.

    Wales is the bit of the old Roman Province that never got invaded by the post-Romans, the name 'Wales' is derived from the Saxon word 'Weleas' meaning 'Foreigner'. It was conquered by the Normans following their invasion of England, and the two were combined to form one Kingdom.

    Scotland used to be Pictland, but was invaded by the Scotti, a gaelic Irish tribe, in the 5th century (the same time that England was being invaded by the Angles).

    Because of this, the original Scottish tongue is a derivative of Irish Gaelic, and closely resembles, but is different from Welsh, which is linked with the dialect of Cornwall and Brittany (for obvious reasons, they were the same Romano-British people until the Romans left and the English pushed them out).

    The English language as we know it was created in the period following the (French-speaking) Norman invasion, when elements of Saxon, Latin and French were combined into one language, a process that took centuries. During those centuries, the nobility spoke French and the commoners spoke Saxon (in England) or Gaelic (in Wales and Cornwall), while Latin was used in the Church, in all writing, and as the trading language between strangers.

    So...we as the English have no rights to complain about 'embrace and extend', we've been doing it for centuries and it's how we forged our identity. Good luck to the Yanks, the Aussies, the Kiwis, the Jamaicans, the Indians, and everyone else who wants to create a new form of the language.

    But losing 'lose' and loosing 'loose' upon unsuspecting traditionalists is upsetting, I'll agree.
  • Re:What?!?!? (Score:2, Interesting)

    by _merlin ( 160982 ) on Tuesday June 13, 2006 @12:01AM (#15521684) Homepage Journal
    That's rubbish. I can get over twice the performance of GCC on palette-based video blit on PowerPC. GCC wastes far too much time performing loads and stores. It can't think like a human. I wouldn't write a whole app in assembly language, but it's worth it for small, performance-critical parts.
  • by Latent Heat ( 558884 ) on Tuesday June 13, 2006 @12:02AM (#15521689)
    A lot of people are dismissive of Java as having failed on client GUI apps. What is it now, 2006, and Java came out around 1996? I know we talk about "Internet time", but major software concepts can take years to evolve, and Windows started out sometime in the 1980's but it wasn't until Windows 95 that it started kicking backsides and taking names. So maybe Java will eventually have its day.

    I am a Pascal programmer from ancient days and have been pretty much a Delphi person on account of my Pascal affinity and other requirements, but I have implemented GUI apps in C++, C#, Java, Matlab, and VB. I am seriously looking at Java/Swing as the next wave of what started as DOS/Turbo Pascal and got reimplemented in Windows/Delphi. Java simply couldn't do in 1997 what I was doing even at that time in Windows, just plain couldn't from the standpoint of features and performance. Java is not-quite-there-yet with the features I use in Windows, but it is much farther along in 2006 than in 1997 and is closing the gap with graphics acceleration and other features. It may surpass Delphi for what I do if it proves to be easier to do multi-threaded apps to take advantage of multi-core.

    While my complex data visualization stuff is a long way off from being done in Java, the sort of simple data visualization stuff that I was doing in 1997 under Windows works quite well under Java, and it works equally well under Linux. If anything will get me to switch to Linux, it will be that I have a collection of graphical data visualization programs for the work I do written in Java that will work equally well under Linux. While I can write a faster program with more features in Windows, the Java implementation is proving good enough for a lot of the stuff that I am doing, and it breaks me loose from Windows as well.

    Sun seems to be in this Java business for the long haul, seemingly spinning its wheels making it available for free and always being a step behind Windows in features. But at some point Java/Swing programs will have accumulated enough performance and features that they are good enough for what people want to do, and they have the added advantage of not being tied to Windows. This idea that something like Java could transcend the OS may yet happen for client GUI apps.

  • by toadlife ( 301863 ) on Tuesday June 13, 2006 @12:10AM (#15521725) Journal
    The first time I ever noticed "lose" spelt incorrectly was in the game Operation Flashpoint, released around 2001. In the mission editor, one of the trigger choices upon activation was "loose". The scripting language also had the functions "getdammage" and "setdammage" - damage was spelt incorrectly too.

    Since creating missions for Operation Flashpoint required hours in the mission editor tweaking triggers and scripts, in the various game forums almost everyone would regularly misspell both "lose" and "damage" when posting messages. Even those who knew the proper spelling would do it. It became somewhat of a joke: everyone knew it was wrong but kept doing it out of habit.

    Operation Flashpoint effectively trained thousands of people to misspell both lose and damage.

    So don't blame us "USians" for it. Blame those Czech developers at BIS and their kick ass battle simulator.
  • by Nefarious Wheel ( 628136 ) on Tuesday June 13, 2006 @12:11AM (#15521729) Journal
    Not entirely true, back then. The assembler of the day (I think it may have been Motorola's own) provided only a subset of the options available in the chip hardware; some of the more esoteric middle-bit operation modifiers weren't covered by mnemonic+qualifier options.
  • by maelstrom ( 638 ) on Tuesday June 13, 2006 @12:24AM (#15521790) Homepage Journal
    Take a look at Go and then ask yourself why there are no good Go AI programs. Chess is easy to "solve" with computers, Go isn't, yet anyway.
  • Re:What else (Score:4, Interesting)

    by eonlabs ( 921625 ) on Tuesday June 13, 2006 @12:38AM (#15521842) Journal
    If your native code is running as slow as interpreted code, I would really recommend getting that looked at. It would seem that people are losing the ability to write clean code, since the crutch of interpreted languages hides so much of the finer grain of computer science. Sure, if you're writing apps that are fine running slow, interpreted doesn't matter. If you're writing higher-end programs like games, I would recommend cross-platform libraries in a native language. I'm currently learning SDL in C/C++ for exactly that reason.
  • Re:Huh? (Score:2, Interesting)

    by dwlovell ( 815091 ) on Tuesday June 13, 2006 @01:01AM (#15521915)
    You wonder what low-level control has to do with non-native or interpreted languages?

    One word: pointers.

    Virtual machines (i.e. runtimes) handle garbage collection and memory management. In low-level languages, you, the programmer, handle it. Don't confuse a VMware virtual machine with a runtime like the Java virtual machine or the .NET framework runtime, which performs JIT compilation and memory management.

    Managed and interpreted languages take a perf hit for using special threads to run a generic (albeit engineered) algorithm to reclaim objects that no longer have references. In low-level languages, you have complete control over when objects are created in and destroyed from memory. So you have more power to optimize memory usage for your application, but also more power to make mistakes regarding memory (i.e. buffer overruns/underruns, bad pointer math).

    So I would say that "low level control" has a great deal to do with "interpreted languages" since virtual machines are what perform the native tasks that you would normally be responsible for.

    Then again, maybe you were just making fun of his grammar and already knew all of this.

  • by killjoe ( 766577 ) on Tuesday June 13, 2006 @01:26AM (#15521989)
    I agree with you but...

    I have started to believe that the proof is in the pudding. I don't know Lisp, but I know some Zope. Zope, much like Lisp, is elegant, innovative, comprehensive, well designed, and capable of almost anything. Just as you probably scratch your head and wonder why people code in PHP or Java when they could code in Lisp, I wonder why people code in PHP or Java when they could have used Zope and Python.

    But I am ready to give that up. I am now under the impression that Zope isn't everything I thought it was. I mean, if Zope is so great, then how come there are only three or four blogs written for it, and not one of them is 1/10th as good as WordPress, which is written in PHP? How come not one ticket tracker written in Zope is 1/10th as good as Eventum, written in PHP?

    I ask those questions rhetorically, though. I know the answer. The answer is that Zope is very hard. You have to be a very smart and very dedicated person to climb the ladder of Zope and attain Zope zen, and there are just not enough people in this world willing to put forth that much effort.

    In the end, it's better to be easy than to be good. Look at how gracefully Ruby balances on that rope. RoR is easy and it's innovative. That's why great software is being written in Rails while the Zope folks are pounding on Zope 3, trying to make it easier for developers to write decent software.

    BTW I am not even going to attempt to learn zope3. I have to break up with zope. Thanks for the great times guys.
  • by Memnos ( 937795 ) on Tuesday June 13, 2006 @02:49AM (#15522220) Journal
    Hmm.. as well. I worked on a team that developed a DB app that was nine PETABYTES and growing constantly. (Our little test database was 60 terabytes.) It will soon be one of the five largest databases in the world, and could extend into the exabyte range (you can guess who it's for). We use Java and ASP.NET on the server and Java and an AJAX solution on the client. We throw shitloads of big boxes at it and we don't give a damn, because it works. Do not get me started on how analytically complex the algorithms are that use that data...
  • by telekon ( 185072 ) <canweriotnow@g m a il.com> on Tuesday June 13, 2006 @04:05AM (#15522419) Homepage Journal
    "America" is a shortened or common term for 'The United States of America,' a country located in North America.

    Sure it is, from the perspective of a resident of the United States of America. My Canadian friends, however, lament such shorthand, because although they are 'Americans,' they would be horrified for such US-centric shorthand to confuse the issue of their nationality, especially when travelling to Europe. My Quebecois friends insist on the term États-Unisiens. (United-Statesians)

    I'd think that computer people could understand the difference. The 'South' in 'South America' is part of the string, it's not a prepended descriptive modifier.

    Despite being counted among both of the aforementioned groups (residents of the United States of America and 'computer people'), this part of the argument strikes me as the most absurd. Why would you invoke the constructs of a set of abstract artificial languages to (erroneously) explicate the workings of natural languages?

    Despite the status of "South America" as a proper noun, and as such a 'bracketed' piece of syntax, one would do well to remember that in natural language, context precedes the identity of discrete sentence fragments; etymology is never put out of play. Context, in this case, can easily be established by glancing at a map. Through careful study and observation, one will soon discover that 'south' is damn well a modifier, placing one bit of the larger set 'america' in geographic relation to the other bit, in this case, specifying it as being 'south' of the 'north' bit.

    Not a modifier, my ass. Natural languages are vastly and inordinately complex, to the degree that they make C++ look like COBOL. But this one is a bit of a no-brainer. Then again, maybe there's a reason nobody expects engineers to be poet laureates.

  • by JulesLt ( 909417 ) on Tuesday June 13, 2006 @04:28AM (#15522476)
    I'm not sure that we've moved that much. I think Gosling and the other originators of Java are still pushing in the wrong direction with GUI; see his remarks on Eclipse / SWT.

    It is not a Java problem per se, but goes right back to the issue of creating cross-platform client apps in the first place. Many of us like to think of the OS as something that provides services - disk access, windowing, etc - that look like they can easily be abstracted - and they can. However, as well as being OS, Windows, OS X, KDE and GNOME are platforms - a set of programming APIs and a philosophy.

    Rather than transcending these differences, Swing is yet another variation. Potentially you could make a Swing app that looked and behaved identically to a Windows app - but it would feel plain wrong on OS X. The reverse is equally true (well, just about - I don't think you can use the top-of-screen menu bar in Swing apps).

    I think SWT may be the better approach - it's not write-once run-anywhere, but you are reducing the amount you need to port. And as said above, you need to consider the philosophical differences between platform HCI anyway.

    Ironically one of the few really successful Java GUI apps I know is a data visualisation tool - it mostly consists of OpenGL calls so it's a bit of a misnomer to say it's Java, but it's back to the point that it's the APIs that count. OpenGL is a nice x-platform API.
  • by IamTheRealMike ( 537420 ) on Tuesday June 13, 2006 @06:12AM (#15522712)

    Well, programming languages are about the man/machine interface. Their design is very much a tradeoff between the needs of the humans that'll be using them and the needs of the computers that'll be running them. Simple example - adding features to a language makes it harder to learn but potentially more efficient.

    The other thing is you have to separate language from implementation. Java isn't just slow because of the design of the language, but many other choices that went into it like using a virtual machine, having a massive class library, making it portable etc. It'd be possible to create a Java-like system that was as fast and as resource efficient as C++.

    Programming languages are like real human languages ... we have a bunch of words, and people can use those words to communicate meaning, but some ways of putting words together will be "more efficient" than others. There's no one correct way to say it's raining, you could say "It is raining" or "Apparently our planet decided to send us the gift of precipitation today" - equivalent yet one is faster to say than the other :)

  • Re:What else (Score:5, Interesting)

    by julesh ( 229690 ) on Tuesday June 13, 2006 @07:14AM (#15522841)
    If your native code is running as slow as interpreted, I would really recommend getting that looked at.

    The question you have to ask, of course, is where is the bottleneck. And the answer is fairly obvious if you analyse the performance of modern applications on a variety of different hardware: IO is the bottleneck in almost every case. There's no other explanation for why my 400MHz desktop (with a nice, fast hard disk) performs as well as or better than my 1.7GHz laptop (with a slow, energy saving hard disk but otherwise similar specs) for many applications (including Firefox, OpenOffice, etc... the kind of things that the average user runs daily) while the laptop wipes the floor with it for others (media players, SketchUp).

    The point is, if you're going to be waiting 50ms for disk access, why bother shaving 2ms of processing time by running in a native compiled language? Nobody will ever notice. And you may find the more modern and high-level design of the interpreted language's library allows you to write faster performing IO code more easily than the simple & low level libraries that are supplied with most compiled languages, at which point you may get better results for the same programming effort for using that language.

    In the end, fast programs are about good design, not language choice. Higher level languages often allow you to spend more time on design and less on implementation. All real-world projects have a limited time scale; ISVs just try to do the best they can with the time they have available, which isn't usually producing something miraculous.
  • Re:simple (Score:3, Interesting)

    by julesh ( 229690 ) on Tuesday June 13, 2006 @11:30AM (#15524330)
    As a Linux and Mac user, I don't want cross-platform scraps thrown to me, I want high-quality applications that integrate well with my desktop.

    What few people seem to have realised is that the best way of achieving cross platform portability is not to throw out the systems you're porting to and implement everything from scratch (the AWT/SWING approach). This just results in applications that feel wrong whichever system you run them on. The answer is to use native widgets in a way that is flexible enough to allow the toolkit to decide look & feel issues rather than the programmer. Sure, it's a little harder to write for (you'll probably find yourself doing stuff like writing an XML definition of how the interface should work rather than directly manipulating controls, which is a little more abstract and maybe a little harder to work with if you're aiming for something other than trivial interactions), but the results are so much better that in most cases it's worth the tiny extra effort involved.
  • by David's Boy Toy ( 856279 ) on Tuesday June 13, 2006 @11:36AM (#15524392)
    The tired old "in C++ you have to worry about memory management" myth. I've written substantial C++ servers where the only 'delete' call was in a thread-safe, reference-counted smart pointer class. 90% of the server code didn't need to be fast. I used std::string and STL containers, no pointers or C-style arrays. Only where speed mattered (gigabytes being moved over TCP) did I use C-style arrays, memcpy, and the like. The program has had very few bugs, and none of them were memory-corruption related.

    Program in C++ correctly and it's as safe as the castrated languages. But you're free to do "dangerous" things when you need to, and it really does matter: it's usually a small percentage of the code where most of the CPU time goes. In an interpreted language you're stuck - if you want to do something the libraries don't provide, you're out of luck. Keep your dangerous code limited and encapsulated, and it becomes quite safe. Small sections of dangerous code can be understood sufficiently to verify that they indeed do what they are supposed to.
