American Programmers are Slackers

Amigan sent us a "story on the CNN website about how, in a worldwide survey of programmers, American programmers are half as productive - based on lines of code generated!" All right, I'm lazy, but in Perl that's OK. I find LOC a shoddy indicator of programmer laziness, but those numbers seem low - I mean, I probably wrote 15,000 lines of code last year, ran Slashdot, and was a full-time student. I gotta put up a poll.
  • by Anonymous Coward
    IBM used to pay their programmers for lines of code. This is the most idiotic thing I've ever heard. Writing more lines of code doesn't make you more productive; it makes you a poor programmer.

    -AC
  • by Anonymous Coward
    I work for a Japanese company with a division in the US. For several years we used to hear comments that the Japanese side's programmers had a higher LOC throughput and fewer bugs per LOC.

    This was all very true based on the LOC counts that each side tracked.

    Finally someone decided to look into how each side determined what 1 LOC was. Turns out Japan pretty much counted any line in a file, including comments, whereas in the US we were using a much stricter definition, closer to counting each ";".

    After recalculating, it turned out the US side had a better bug-to-LOC ratio with slightly fewer LOC (which just means higher-performance code as well).
  • This is bullshit people. Plain, simple, and unadulterated bullshit. I've been out in the RealWorld (TM) since '91, so I'm speaking from experience here...

    LOC is absolutely positively the *WORST* means of evaluating programmer performance. At times, both I and my colleagues have had *NEGATIVE* lines-of-code-per-week values. This stemmed from another department linking LOC to bonuses, encouraging their programmers to use cut-and-paste programming. After all, why call the same routine twice when duplicating that routine will increase your bottom line...

    Needless to say, that department's project became, in a matter of months, completely unmanageable and uncompletable. And there was a great deal of celebration in that department when the client canceled their contract without suing us...

    Hint: If you see this happening at your company, it means: Circulate your Resume NOW!

    Another factor that may have been completely overlooked is what American programmers are doing besides writing code. How much time is being spent in meetings, demos, support, answering the phone, systems administration, debugging third-party software, installing/configuring *ANYTHING* made by Microsoft, etc.? Perhaps more importantly, how much time is being spent in **DESIGN**? With object-oriented approaches, the vast majority (more than half) of your time had better be spent in design. If not, well, you're in for a lot of pain, suffering, and cost overruns...

    Finally, what are working conditions like? Modern offices are *NOT* optimized for writing code! They are optimized for managing underlings!!

    There's a test that I like to give managers. I ask them to count backwards from 100 by 3's. Then, after they're down to about 91, I start calling out random numbers between 1 and 100. It's a rare person who can make it to 70.

    Then consider the cubicle that I was located in a while back. The partitions were transparent to sound. And the fellow on the other side of my partition was trying, very hard, to switch careers and become a sports agent. On my other cubicle wall was the hall corridor, right where it met the elevators for the floor. Across the hall, about 2 meters away, was the kitchenette. Which contained the coffee machine, where everyone went to socialize. Which was difficult for them to do because the copier was also located in the kitchenette, and they had to talk quite loudly to be heard over it. Just to ensure that a quiet moment would never occur, purchasing was located in the next cubicle bay down the corridor...

    I find that the same part of my mind that I use to write code is also used to process speech. But speech has a higher priority interrupt. My productivity fell through the floor...

    Nowadays I work for a different company. And, due to similar problems, I now telecommute from home, where my productivity is much *much* higher...

  • by Anonymous Coward
    The problem with these US-vs.-the-world type polls is that bias is difficult to eliminate. US respondents, especially engineers, tend to be relatively honest. In most other countries, and I know I'm going to get a lot of grief for this, common respondents tend to tell bald-faced lies as naturally as they bribe their local govt officials for favorable treatment. Or the books are cooked by officials who instruct respondents to give "officially correct" answers. Witness the numbers from the Japanese educational system of the 80s. Entire classes of students not counted in the stats if they weren't considered "acceptable".
    In addition, lots of programmers who aren't US nationals work in US software companies. Does their productivity magically decline the instant they hit our shores?
    I am biased, as I have worked on many projects with offshore programmers, and have never seen any value gained from the practice. On the contrary, it has always been an expensive process, resulting in the offshore code being thrown away and reimplemented locally at greater expense, partly because the deadline could not move even though our foreign counterparts wasted most of the schedule.
    Measure twice, cut once.
  • by Anonymous Coward
    Yes, I went back to school this semester and I have noticed a significant difference in LOC and bug counts between me and compsci students, and especially me and non-CS types who are taking intro computer courses.

    In general, 100 lines of my code is worth about 300 lines of the CS student's code, which is worth about 1000 lines of a tyro's code. And not surprisingly, the people who write the shortest programs also write the least buggy programs. Brevity demonstrates an understanding of the problem, and understanding the problem means you are unlikely to introduce bugs.

    Code is a liability, not an asset. The more source code you write, the more bugs you will encounter, and the more problems you will have to fix later. (e.g. Y2K).
  • Besides that, does each line of comments count for a line of code? Because then I can increase my efficiency and decrease my laziness like this.


    /* Program to print Hello World on the screen for all to see and marvel at its wondrous message!

    Programmed by: Joe Doe
    Creation Date: 4/15/1999 (Please note the special Y2K compliant comments) */

    /* C Library include header files. This file is a must for this damn program to work.
    If you don't believe me, just try compiling it without it.
    This header file allows me to use the printf function (see below) */
    #include <stdio.h>

    /* This is the main routine.
    This routine is the first one executed when the program is run.
    This routine takes two arguments, one of an integer value and one of a character value.
    The program doesn't make use of them but I put them there so I could be a more productive programmer */
    int main(int argc, char **argv)
    {

    /* This line prints the message Hello world to the screen.
    Where the user can see it and drool in all its wonder.
    This command was made possible by including the stdio.h C header file (see #include statement above) */
    printf("Hello, world.\n");

    /* Return the number zero to the calling program.
    Like the arguments above, this isn't necessary but I put it there because I didn't want to be lazy! */
    return 0;
    }

    /* Well that's the end of this program.
    I hope you enjoyed reading my source code.
    I also hope the code was documented detailed enough for you to understand the program function.
    If my comments were not detailed enough, please let me know and I will add to them.
    Have a pleasant afternoon. */


    There: 50 lines of code. I am the king of programmers and will now accept my $300,000-an-hour raise.
  • by Anonymous Coward
    You know, it's pretty funny. In my CS algorithms class, we just had an assignment which basically involved coming up with a clever optimization for a dynamic programming problem. If you spotted the obvious optimization, you could reduce the problem from O(n^3) to O(n^2). I came up with an even better optimization, reducing the program complexity to O(n).

    I wrote my program in Perl. The hardcore coder of the class wrote his program in 'heavily optimized' C++, but didn't spot any of the algorithmic optimizations. My program operated on the sample input in a fraction of a second. His program required two days to process the sample input.

    My program was 60 lines long. His program was well over 2000 lines long.

    Needless to say, I consider myself a vastly superior programmer, even though his line count is much higher than mine.

    All I want to know is, what organizations consider LOC important? I would like to avoid them.
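    The poster doesn't say what the assignment actually was, so here is only a toy illustration of the kind of algorithmic win being described - precomputation turning a quadratic scan into a linear one - sketched in C with a made-up problem and made-up names:

    #include <stdio.h>

    #define N 5

    /* Naive: recompute every prefix sum from scratch -- O(n^2). */
    static void prefix_sums_naive(const int a[], long out[], int n)
    {
        int i, j;
        for (i = 0; i < n; i++) {
            long sum = 0;
            for (j = 0; j <= i; j++)
                sum += a[j];
            out[i] = sum;
        }
    }

    /* Better: carry the running total forward -- O(n). */
    static void prefix_sums_linear(const int a[], long out[], int n)
    {
        int i;
        long sum = 0;
        for (i = 0; i < n; i++) {
            sum += a[i];
            out[i] = sum;
        }
    }

    int main(void)
    {
        int a[N] = { 3, 1, 4, 1, 5 };
        long slow[N], fast[N];
        int i;

        prefix_sums_naive(a, slow, N);
        prefix_sums_linear(a, fast, N);
        for (i = 0; i < N; i++)
            printf("%ld %ld\n", slow[i], fast[i]);
        return 0;
    }

    Both functions produce identical output; the second just stops redoing work it has already done. The faster version is also the shorter one, which is exactly why a line count says nothing useful here.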
  • Author unknown, sung to the tune of The Beatles' "Let it Be".

    When I find my code in tons of trouble,
    Friends and colleagues come to me,
    Speaking words of wisdom: "Write in C."

    As the deadline fast approaches,
    And bugs are all that I can see,
    Somewhere, someone whispers
    "Write in C."

    Write in C, write in C,
    Write in C, write in C.
    LISP is dead and buried,
    Write in C.

    I used to write a lot of FORTRAN,
    for science it worked flawlessly,
    Try using it for graphics!
    Write in C.

    If you've just spent nearly 30 hours
    Debugging some assembly,
    Soon you will be glad to
    Write in C.

    Write in C, write in C,
    Write In C, yeah, write in C.
    Only wimps use BASIC,
    Write in C.
    Write in C, write in C,
    Write in C, oh, write in C.
    Pascal won't quite cut it,
    Write in C.

    Guitar Solo

    Write in C, write in C,
    Write in C, yeah, write in C.
    Don't even mention COBOL,
    Write in C.

    And when the screen is fuzzy,
    And the editor is bugging me,
    I'm sick of ones and zeroes,
    Write in C.

    A thousand people swear that T.P.
    Seven is the one for me.
    I hate the word PROCEDURE,
    Write in C.

    Write in C, write in C,
    Write in C, yeah, write in C.
    PL/I is 80's,
    Write in C.

    Write in C, write in C,
    Write in C, oh, write in C.
    The government loves Ada,
    Write in C.
  • I worked on a COBOL-to-C conversion several years ago, using Microport 286, of all things. There were two of us, the other programmer consistently wrote about five pages of code a day, I usually delivered about 1.5 to 2 pages. After about one week there was a confrontation about the disparity. I replied that my code compiled, ran, and worked as well as could be expected with stubs substituting for missing functionality. I asked the other programmer if she could make the same claims for her code--she couldn't. Did this make any difference? No, I was outta there.

    About four years later I learned that one of the best C coders in our part of the world came in right behind me and met the same fate--last I heard, he was working for the Deathstar Company in Jersey--now I don't feel so bad, but at the time I was really pissed.

    "Lines Of Code" ignores issues of efficiency, complexity, suitability for purpose, and stability. If LOC were any measure of quality, Windows would be the best code known to man. LOC is a bad metric that should die an excrutiatingly painful death.

  • Paying programmers by the number of lines they produce is only slightly less dangerous than paying firemen by the number of fires they put out.

    The only thing worse would be paying them by the number of bugs they fix.
  • Just before I graduated in 1985 (so what) I heard a statistic: the average professional programmer writes 14 lines of code per day.

    I thought: WOW, I'm going to crush these people. They have no idea. So I got out into the real world and my first job was at Sperry (now Unisys), coding in the OS "kernel" for the series 1100/2200 mainframes -- although it wasn't really a kernel since they had never developed it along the lines of a proper OS.

    I quickly learned that "14 lines per day" wasn't because these people were stupid; it was because of all the other crap that had to accompany those lines. In particular, one 500-line modification I added also required 50 pages of (mostly bureaucratic) definition and design documentation and about four months of politics.

    Sperry circa 1985 is no longer a good example, but I'm sure that programmer's productivity everywhere is still completely stymied by corporate cultures that require signoffs from 7 different executive vice-presidents just to decide what doughnut selection is available in the breakroom. ("Operations is worried that powdered sugar is getting into the air vents, Personnel is angling for lower fat content to save on insurance premiums, and Steve in marketing wants them all chocolate because HE likes chocolate so therefore the rest of us should too.")

  • $ cvs diff -ub -D'1 jan 1998' -D'31 dec 1998' | diffstat | tail -1
    57 files changed, 8940 insertions, 6816 deletions

    So, did I write 8940 lines or (8940-6816) = 2124 ?

    Actually, since some lines were rewritten more than once, I wrote far more than 10k lines during the year and deleted far more than 7k (plus I worked on side projects too). But, of course, diff thinks a change of 1 character is a change of that line.

    Now when I talk about rewriting code I don't just mean getting rid of the bugs. The versions that I check into CVS are already debugged, so (usually) they only contain little bugglets. What I'm talking about here is things like the changes needed to incorporate new features or the redesign of some internal data structure.

    If you consider the 1999 model of a car to be 1% different from the 1998 model do you then say that the engineers were working at a rate that would take them 100 years to design a new car?



    --
  • Right now I'm deploying a tiny Perl program that monitors all the servers at work and sends nastygrams when they go down. It's running on two servers at my location and once I get this little Linux system built, it'll be running on one server downtown. Most of the web-related things here are also in Perl. The only other language I really use is C, to make local modifications to various daemons.
  • I tend to do that while I'm building parts of programs, but I delete them when I'm done. When I killed the 'blocked out' lines from my current project prior to deploying it, I noticed the line count drop by half. At least I tend to over-comment when I comment.
  • So where's the choice for, "Yeah, like I counted them..."
  • I'd have sworn I wrote less than 1000 until I counted just three programs (coincidentally the ones with GPLed source available online) and found it was around 1500 lines. Not lines of C, mind you, but for a' that still lines of GPLed code. _Mac_ code! So that's 1500 lines closer to open source setting the tone for Mac development too :)
  • The ultimate 'toy language': REALbasic for the Mac. Sort of 'Visual Basic minus as much cruft as possible', with a heavy emphasis on interface-building tools: I wrote 1500 lines, but the GUI code I didn't have to write would have been at least 3000 lines if not 5000 :)
    Also, the people at REAL software have openly stated that, given the chance to add compatibility with VB, they won't add compatibility to features that suck :)
    Lastly, I don't count HTML as a language, and at any rate, I wrote a REALbasic program to generate my site from text and some markup, so it's moot. In all seriousness, RB will never be any good for device drivers or calculation engines, but it's at least as good as any scripting language, and by God is it quick to prototype stuff in. Whee! Also it's not really basic, just uses some similar syntax and has the reassuring name- it's quite event-driven, does thread classes, and even has a wizzy sprite engine for fooling around with games :)
  • What's wrong with
    StaticText1.text="Hello World" ?
    I dunno if this is the way VB does it, but it's REALbasic code, and it seems nice and straightforward. You drag the statictext onto the app window and set it up mousishly :)
  • I just munched up a copy of SiteBotSource.txt with 'cut lines including', and got the following ratio:
    Source: 550 lines
    Comments: 235 lines
    This is because I needed Sitebot to work for _me_, and it calls its own routines recursively and does many strange and arcane things (if I didn't need to auto-link to a separate FX folder that's always at root level the whole thing could have been done by hand but nooooo, genius here had to come up with more complicated things to do to^H^Hwith it)
    //Comments rule!
    When coding weird stuff the most bizarre notions are likely to seem obvious, and then be totally inexplicable in a month. I like to be able to stop coding on Sitebot for a month, and then resume without spending hours figuring out what I was thinking....
  • Trusting soul, aren't you!
    I always try to figure out the most evil things I can do to my programs to make them crash and burn, and if it seems possible I'll do the error checking.
    Not if (2+2 = 5)
    but if (application is expecting to be able to get parent folderitem for itself as a folder/directory but surprise! It's at root level out of the folder it shipped in. On a Zip Disk. Which is write protected, with one of the sectors damaged >:) )
    Get it? If you don't try to kill your program in horrible ways, horrible things will eventually happen anyway, and your program will faw down go boom, probably when someone needs it most. I don't care what Linus says- he can and should make the Linux kernel an intricate perfect diamond dependent only on his own code- but when I'm writing applications I assume nothing.
    If your app needs no error checking, you probably aren't letting users run amok enough. Saying 'don't touch anything' isn't okay. They need to get their little paws dirty and feel they can mess with whatever- and if they break your program's dependencies doing so, unless it's truly speed critical and you can't spare the check at all, you need to SAY what they did to break it so they can learn and fix it.
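    A minimal sketch of that kind of defensive check in C - the function, file name, and messages here are hypothetical, and REALbasic's folderitem checks would look different, but the attitude is the same: assume nothing, and when something breaks, say exactly what broke.

    #include <stdio.h>
    #include <string.h>
    #include <errno.h>

    /* Don't assume the file is where it shipped, that the disk is
     * writable, or that the write actually made it out. */
    int save_settings(const char *path, const char *text)
    {
        FILE *fp = fopen(path, "w");
        if (fp == NULL) {
            fprintf(stderr, "can't open %s for writing: %s\n",
                    path, strerror(errno));
            return -1;
        }
        if (fputs(text, fp) == EOF) {
            fprintf(stderr, "write to %s failed: %s\n",
                    path, strerror(errno));
            fclose(fp);
            return -1;
        }
        if (fclose(fp) == EOF) {
            fprintf(stderr, "closing %s failed (disk full or write-protected?): %s\n",
                    path, strerror(errno));
            return -1;
        }
        return 0;
    }

    Every one of those checks is a line a LOC-obsessed shop would count as "productivity" and a deadline-obsessed shop would count as wasted time - right up until the write-protected Zip disk shows up.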
  • I write mostly in PHP and Perl (since a lot of stuff is web based). You should also take a look at Python and SQL. Not to mention C.
  • Posted by The Trailer:

    I would think that generating fewer lines of code would be a mark of excellence in this industry, not laziness. I'd much rather produce an elegant 7700 lines of code than 16,000+ lines of bloatware.

    oh well.
  • Posted by Lulu of the Lotus-Eaters:

    One of the other silly things about counting LOC is that it is often so much more productive to ELIMINATE code than to write new lines.

    I probably wrote some small number of thousands of lines of new code last year. But I am quite certain that I simultaneously managed to delete more lines than that from legacy products/projects. Some of that was in Y2K remediation, other parts were just in regular code maintenance, version creation.

    Bad programmers, or even just good programmers in too much of a hurry, will write almost the same 50 lines over and over rather than stick them into a generalized function/class; or they may simply not know that someone else on the work team (who left years earlier) had written that function and stuck it somewhere in a different program. Also, to a smaller extent, programmers use inefficient expressions of an algorithm that could be expressed non-cryptically (no obfuscated Perl for me) in many fewer lines.

    So during my 1998 contract, I probably wound up getting paid a net NEGATIVE $10 per line of code I wrote :-). I've had a similar experience most places that I have inherited code.
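    A trivial C illustration of the consolidation being described (the report functions here are invented): the cut-and-paste version would repeat the three header lines in every report, while this one hoists them into a helper, so fixing the header later means touching one function instead of fifty.

    #include <stdio.h>

    /* Shared helper: one place to write it, one place to fix it. */
    static void print_header(const char *title)
    {
        printf("==============================\n");
        printf(" %s\n", title);
        printf("==============================\n");
    }

    static void print_sales_report(void)
    {
        print_header("Sales");
        printf("...sales figures here...\n");
    }

    static void print_inventory_report(void)
    {
        print_header("Inventory");
        printf("...inventory figures here...\n");
    }

    int main(void)
    {
        print_sales_report();
        print_inventory_report();
        return 0;
    }

    Measured in LOC, the copy-and-paste version "wins"; measured in anything that matters, it loses.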
  • Posted by Lord Kano-The Gangster Of Love:

    What this article may not be taking into consideration is the language the code is written in. COBOL and Fortran are languages which dictate that coders use more lines to do the same things. In the US, C(++) is the dominant language. The objects in C++ and the ability to reuse those objects eliminate the need to re-invent the wheel and type those lines of code again, and again.

    I keep all of the code that I've written handy, just in case I want to save 4 hours of work rewriting and redebugging a group of objects.

    I'm not currently coding for a living, I do it to learn, and to accomplish simple repetitive tasks. If I had to do it to keep food on my table, I'd probably learn even more tricks to save time and get the job done faster. If these guys could get the same amount of work done while only writing 500 lines of code, I'd say all the better.

    LK
  • Posted by Pseudonet:

    Could anybody tell me what percentage of Microsoft programmers are American - this article could explain a few things.

    From
    A man who writes barely 1000 lines a year
  • by gavinhall ( 33 )
    Posted by Zero G:

    This is so strange. I was talking about this around 12:30 this morning, one hour before the post. The only exception is that I was referring more to (both corporate & private) Windows programmers. I HATE all this visual crap. (DOS rules!) I even run an open source site.
  • Posted by Zero G:

    I program in a lot of different languages.
    I do mostly C, but I use other languages as needed.
  • Posted by Paul Holden:

    The difference between O(n) and O(n^2) is in no way the difference between two days and "a fraction of a second".
  • by gavinhall ( 33 )
    Posted by LOTHAR, of the Hill People:

    of course I don't write many lines of code.
    I do however, reuse a great deal of code. I spend more time designing than programming so I don't have to write large amounts of code, re-write, or patch.
  • Posted by The Merry Misanthrope:

    If it's hard to write, it should be hard to read.
  • Posted by The Merry Misanthrope:

    If you know how many lines of code you've written in the last year, you've got way too much time on your hands. Go be more productive, weenie boys!
  • Posted by Reitzel:

    Hmm. I notice that this code has NOT ONE SINGLE COMMENT. Terrible practice, well on the way to Write-Only Programs.

    Some more salient questions about code: How many _totally_incompatible_except_MicroSoft() function calls did you use? How many global variables changed in every function you ever wrote? How many ridiculous fpszqhPrefixToVariableName did you use? And finally, how many of those lines of code were (gulps back bile) VisualBasic...
  • Posted by FascDot Killed My Previous Use:

    ...but not unproductive.

    Laziness is a VIRTUE in a programmer. "Man, I don't want to keep doing this. I'll write a program to take care of it." "It'll take me forever to write that program; there must be a faster way." Etc.
  • As you write this, the over-the-hill threshold has dropped into the upper 30's.

    Razy americans ... very razy

  • by jabbo ( 860 )
    If you're so valuable it should be trivial to get a job elsewhere for a reasonable salary. Decent programmers actually *are* in demand, you just have to force the issue.



  • Okay, so I don't know every programming language that mankind has ever invented. But anyone who can get a CS degree should be able to pick up any language needed in just a few days. It will take months to learn all the tricks, but in a few days you can be productive, and in a week nearly as good as the seasoned veterans (with the same experience as you).

    The important part of programming is algorithms, data structures, and the experience to know when to apply them. The skills from one language translate to another easily.

    I have on several occasions sat down with source code in a language I've never seen before and found the bug in it. EXPERIENCE to know where to look, and the simple structure of all languages, allows me to do this.

    Learning a programming language isn't like learning a foreign language. Once you know a LISP, an assembly language, and a C-like language, you can learn them all in no time. Learning a new one is like a Londoner going to the southern US. The language is different, but it won't take long to converse with someone there.

    To answer the question: I use TCL/TK, C/C++ (C compiled with a C++ compiler, with C++-style function calls. I'm interfacing with hardware, so C++ isn't of much use), and a private OOA thing (4GL?) that gets compiled to C++. They all work, and all are the same; even though the OOA doesn't support arrays, to my annoyance, I can code around that.

  • Oh yeah, Perl, Perl, and more Perl... it's nice and portable, I do mostly CGI stuff, and run Perl scripts on the Mac, NT (ugh!), FreeBSD, Linux... the swiss army chainsaw! the duct tape of the internet!


    Mmmmm... perl....
    (and occasionally UserTalk, and AppleScript)
  • 50k line accounting package, windows though..

    Guess that puts me in the 20k+ group, heh...

    Oh well, now I get to maintain that program, and write cool things in perl, c/c++ and java...

    I like my job :)
  • I wrote a whole bunch of code for classes last year. I had 5 compiler theory programs that each were around 1800 loc. Took me a while, but it all worked.

    -Ben
  • At work, approx 50,000 lines
    At home (personal projects) approx 15,000 lines

    The 50,000 lines were Java, the 15,000 C and C++

    I wish the totals were reversed, but unfortunately I have to code what other people want me to code so that I can feed myself and then code what I want to code on my time off ...
  • Sorry, K&R is the One True Style, and it uses:

    if (...) {
    ...
    }

    That's what I use. And I *always* limit my lines to 80 columns, which may inflate my line count a little bit. Also I tend to comment fairly liberally and use whitespace abundantly when it helps to clarify code. Some of my coworkers write lines 160+ characters long (!!!) and they don't comment at all. So my 50+ KLOC is somewhat inflated compared to theirs.

    However, I feel that neat, readable, well organized, commented code is implicitly more valuable so while my whitespace:line ratio is higher, so is my value:line ...
  • I switched from putting the curly brace at the end of the line after fixing one too many compile problems where I thought there was a brace but there wasn't or where I thought there wasn't a brace but there was. Give me
    if (...)
    {
    ...
    }
    any day.

    I agree with you on the 80 columns bit though, long lines mean too danged much resizing. And cygwin doesn't like to resize width-wise.
  • > it just irrtates me that someone [...] gets paid $60K-$120K a year while I'm [getting] a whopping $25K per year.

    Then find another job. It's not you who should be irritated by them, it's they at you for doing what you do for such a piddling wage and bringing the pay average down. It is not immoral or disloyal to leave a place where you are underpaid and overworked.

    And if you're young, a certain amount of job-switching is good for you. You'll get a better sense of what work environments there are, you'll find that your best pay raises are when you change jobs, and you'll make useful contacts. Also, once you've left school, work is one of the major places to meet people, both for friends and potential mates. And it can be problematic to get involved with the latter while still working for the same company.

  • Heh. You mean 'LoC' should therefore be:

    cat bigprog.cc | indent | wc -l

    (or, as an added bonus, probably more like

    cat obfuscated.c | grep -v ^# | cc -E - | tr -d '\n' | indent -kr | wc -l

    )

    I actually needed to strip carriage returns because indent couldn't figure out how to indent ascii-art-c properly... oh well... :)

    Of course lines of code is an inaccurate, stupid, braindead, unreliable measure of productivity. So are bogomips. We still use them... However, they aren't the holy grail of software metrics or benchmarking or anything. That's why this is just a poll topic, instead of something serious.

    However, if you think it *is* the holy grail, use GNU-style indenting. More lines of code than K&R, so your code is suddenly better. :)
  • by bhurt ( 1081 ) on Thursday April 15, 1999 @01:19PM (#1932071) Homepage
    ... it's downright deadly. Using LoC as a measure of programming productivity:

    - Encourages cut&paste programming ("Look - I just wrote 2000 lines of working code without touching the keyboard!"). The danger of cut&paste programming was ably demonstrated by the Y2K problem - simply fixing the problem in one place doesn't fix the problem; you have to find all the places that code was copied to and fix it there as well. What should have been a 30-minute fix turned into a multi-year code-spelunking expedition.

    - discourages black-box reuse- both of older code and of existing libraries.

    - Discriminates against maintenance programming -
    did Linus write 5K lines of code this year? Not much more than that, in any case. Is he lazy? Of course not.

    - Discriminates against testing and debugging. I'm a middlin'-decent typist (for a programmer), and I can type in one to two thousand lines of code in a single day. Testing and debugging said code can take days or even weeks - during which my code output is exceedingly low to non-existent. Last year there was an entire month of busting my butt where my sum total code output was 2 lines (finding the two lines of code in the device driver to fix was a bitch).

    - Discriminates against time spent on design. Especially design which increases code reuse, and decreases code complexity and size.

    - Discriminates against writing documentation (that's not code).

    In fact, LoC discriminates against all parts of programming _except_ the typing parts.

    Programming is not an assembly-line production, no matter how much some managers wish it were. There are no easy measurements of programmer productivity. And I do not believe the American programmer is "lazy".
  • I'm not sure, but this seems kind of like engineered information to support US companies looking for "relief" from the current IT job crunch.

    It's amazing how companies dig on supply and demand until we start talking about capital/labor markets. =)

    Of course, all metrics have problems, and the L.O.C. thing has been well covered, blah, blah, blah... I just don't like the idea of someone telling me that my 60 hours a week aren't good enough.

    oh well... enough whining... I have to get cracking if I'm going to beat my "nonunion Mexican equivalent" (to quote the Simpsons).
  • Then of course a lot of those are

    { return depth; }

    kinds of functions, inheritance, and templates like hell. And yes, I tend to omit return values.

    It just shows how far people will go to make the computer do what they want yet no corporation will ever pay anyone to code it.
  • For me the clear point from the article and this thread is that management of software development is still failing.

    This is a key reason why free software results in higher quality software with fewer bugs.

    In many companies the behaviour of management actively destroys productivity and quality. Many managers completely fail to understand software development and the people who do it.

    Company management forces software developers to spend time on non-productive work. It rewards behaviour that is not aligned with quality work. It measures the wrong things.

    People like DeMarco have understood this and documented it for years. Have companies listened? No.

    The article in CNN clearly shows a manager who also completely fails to understand how to manage software developers. He needs training urgently.

    In my last job before starting my own company I worked as a hacker and then team leader for a software company in the City of London on banking software.

    That company was run and managed entirely by sales people. They did not understand AT ALL what motivates programmers/analysts and designers.

    These types of managers try to measure software development by number of hours worked and LOC. The results are late, slow and buggy software.

    Compare this to free software, which is often developed at an amazing speed, has fantastic performance, and has many fewer bugs.

    What's the difference? Free software does not have managers!

    Do the world a favour - sack a manager today!

    Dave
  • it's not about measuring SPECIFIC people, it's about estimating a new project in a specific area of the company...

  • e.g.

    #include <stdio.h>

    int main(int argc, char **argv)
    {
    printf("Hello, world.\n");
    return 0;
    }

    How many lines is that?
    7? (total lines in program including blank lines)
    6? (ignoring blank lines)
    4? (are '{' and '}' really 'lines of code'?)
    3? (#include is not really code)
    hundreds? (stdio.h is huge and includes many other header files)
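    Even automating the count forces you to pick one of those definitions. A small, hypothetical C counter that applies three of them to a source file:

    #include <stdio.h>
    #include <string.h>

    /* Count a source file three ways: every line, non-blank lines,
     * and lines containing at least one ';' (a rough statement count). */
    int main(int argc, char **argv)
    {
        char line[4096];
        long total = 0, nonblank = 0, statements = 0;
        FILE *fp;

        if (argc != 2 || (fp = fopen(argv[1], "r")) == NULL) {
            fprintf(stderr, "usage: loc file.c\n");
            return 1;
        }
        while (fgets(line, sizeof line, fp) != NULL) {
            total++;
            if (strspn(line, " \t\r\n") != strlen(line))
                nonblank++;
            if (strchr(line, ';') != NULL)
                statements++;
        }
        fclose(fp);
        printf("%ld total, %ld non-blank, %ld with ';'\n",
               total, nonblank, statements);
        return 0;
    }

    Run it over the hello-world above and you get three different answers from the same file, none of which says anything about whether the program works.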
  • Program A

    Software person 1: does it in 20k lines

    Software person 2: does it in 10k lines

    Which is likely the better program & programmer?

    Tom

    "I would have wrote it shorter, but I did not have the time" - gist of a famous quote
  • The first programming lesson you learn is how to print "Hello World" on the screen five times. The second lesson you learn is how to use a loop to do it in three lines of code.

    And then you learn more and realize the loop is just baggage and makes it slower.

    And then you learn more and realize the compiler figured out the loop was static and unrolled it.
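    A minimal sketch of that progression in C (the unrolling claim is about optimizing compilers in general, not any particular one):

    #include <stdio.h>

    int main(void)
    {
        int i;

        /* Lesson one: write it out five times. */
        printf("Hello World\n");
        printf("Hello World\n");
        printf("Hello World\n");
        printf("Hello World\n");
        printf("Hello World\n");

        /* Lesson two: the loop version -- fewer lines of source. */
        for (i = 0; i < 5; i++)
            printf("Hello World\n");

        /* Lesson three: with optimization on, a compiler that notices
         * the constant trip count is free to unroll the loop right back
         * into five calls, so the "saved" lines cost nothing at runtime
         * and the "extra" lines of the unrolled version bought nothing. */
        return 0;
    }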
  • . . . yeah, those benchmarks suck. The Pentium II is faster than the G3 any day. . .
  • . . . i wish more programmers documented as well as you do. Sometimes I can't read half the crap.
  • well, I'm a C newbie too, and I find that one liner much easier to understand.

    More verbose comments, and no unnecessary carriage returns and indention. THAT'S what your source code needs!
  • by jafac ( 1449 )
    hows this for a metric?

    Stock Options.

    If the programmers write good code, make good products, sell lots of CDs, make the company successful, they get rich.

    Duh.
  • No, but a guy who writes 5,000 lines that work well and are thought out IS better than the guy who just kicks out 10,000 lines of hack.. ;-P
  • I wonder if this is due to the development tools used. I'd like to see how many of those lines were part of Visual C++ programs, where Microsoft, in their supreme wisdom, writes half the code FOR you..
  • Yes, and technically, these are NOT WRITTEN lines of code..

    Not to mention things like MFC.. Heck, one function call can actually serve what 500 lines would do..
  • Oh dear.. I USED that system in the Air Force.. ;-P YOU wrote it?? I now must kill you.. ;-P
  • And I'm willing to bet they actually have a life.. Terrible, isn't it..

    We shall have to double their workload.. They obviously have time..
  • Wow. What an incredibly lame comment.
  • Yes, but unfortunately, the management types that know nothing about programming buy this crap about lines of code.

    If there's one thing I've learned, studies are the most effective way of lying.
  • I agree with Linus Torvalds on the subject of error checking. If it's programmed right, then there's no need for error checking. If error checking is necessary, it's time for a rewrite.

    Leave the BS out and debug instead.
  • Yeah--time to start putting that extraneous debugging code back in, too. Do sorts on data, then throw away the results. Add stuff like "i = i;" too, for good measure. :^)
  • I'm still in the academic world (unfortunately) and have noticed that programmers from other countries (students, actually) tend to write kludgy code with lots of unnecessary code. Rewriting basic functions, that sort of thing.
  • No mention of how the data were collected...for all we know, Meta Group's people could have merely mailed out a survey. There are just too many variables to do this...and I suspect this is what was done, because I seriously doubt that the researchers did line counts themselves.
  • Yes, along with most the programming world. :^P
  • It's a holdover from productivity measurement developed when the United States was part of the Industrial Age. The more you make, the better you are; quality be damned!
  • What a ridiculous sweeping statement. So the guy who writes 1 line of code a year is better than the guy who writes 10,000?
  • You forgot to add:

    AFTER ADVANCING ONE PAGE.

    "In true sound..." -Agents of Good Root
  • It's nice to see a poll that really relates to my life...

    20k++ aww yeah...
    --
  • I am an Englishman who is working in France, for a French company and who works with French programmers daily.

    I would like to say that my colleagues are as competent as those of any other nationality that I have encountered. I would also like to point out that the French education system prides itself on producing very qualified (note - not competent!) "ingenieurs" - this is a word that does not completely translate into "engineer", as it can only be applied to university-educated professionals.

    At least our friend is programming in his native language (I presume) - the keywords are not translated, nor are any of the library functions. What is even worse is that if one uses "MS Dev Studio", neither the help nor the program has been internationalised!

    One could therefore suggest that a Frenchman who can do what our friend does is actually MORE competent, as he is working in a foreign language!
  • I dunno about this poll... Some languages take fewer lines of code to do things... Oh well..
    Most of my code was in Perl... the programming language of the gods...

    ChiefArcher
  • It depends a lot on what you do, but possibly even more on who you work for. A lot of orgs have "pet languages".

    If you work for DOD, you'll probably use Ada. If you don't, you probably won't. Some places do everything in Smalltalk, or LISP/Scheme, or Python.

    I work at a consulting company now, and we use Java, C++, Perl, and VB for pretty much everything. I personally do Java almost exclusively for work, though I know plenty of other languages.

    But I have to say that it's useful to get a broad exposure to languages, even if you rarely use some of them. At school I learned a variety of AI-oriented languages (most dialects of LISP/Scheme) and I'm much better for it, even though I never use them in RL. For one thing, it helps me "think outside the box" (to use a cliche), but it also pads the resume...more important than you expect! :-)

  • I'm not in the business world yet, but these numbers all seem really really low. 7,700 lines a year = 148 lines a week = 3.7 lines an hour. Huh? Who writes 3.7 lines an hour?? Who even writes at such a slow pace as even 10 lines an hour?

    This makes me wonder how they define "programmer." Perhaps "programmer" is just someone who happens to write some code in his or her job.

    I'm just a college freshman now, and on a good day, I'll write 500 lines (a line being defined by wc -l) in a single day. Multiply that by ~250 work days comes out to 125,000 lines in a year...

  • Read my lips:

    PERFORM PERFORM PERFORM
    DECENT DECENT DECENT

    If you went to a descent school you'd be going DOWN! Maybe you're making $25K because you CAN'T FREAKING SPELL! EVER THINK OF THAT?! SPELLING FREAKING COUNTS, PEOPLE! Maybe the consultants just don't look like an IDIOT in written communications!

    Arrrrrggggggghhhh!!!!

  • I hear you brother! I'm currently re-implementing a program that interfaces with a DB in a very simple way. The files total upwards of 14,000 lines (just a wc -l kind of count).

    I estimate that when I'm finished with the project, I'll have roughly one order of magnitude less than that, i.e. 1,400 lines, give or take a thousand or two.
  • How many lines of code did I remove?
  • > Yes. I looked at only a few of my projects from the last year. A rough count gives me 20k. I regularly write more than 1500 lines of code in one day. Watch me be accused of writing bad code now. Oh well.

    Your code is probably fine. Looking at your web page, it looks like a lot of the programs you've written were things you wanted to write, and that they are pretty small. What this means is that you don't have to spend as much time defining requirements (they're in your head, so you know them as well as anybody else), meeting with clients (you're the client), and trying to coordinate with larger teams (small project, one programmer). These are absolutely necessary if one is to deliver large-scale software to others, but they don't produce a single line of code.

    For example, the current web project that I'm on has been going on for two months, and is scheduled to go on for a third. We just finished requirements definition and design, which means no code. I'm not worried, though -- the time spent in defining the problem means that we will waste less time in coding, and probably actually write less code as a result.

    Again, lines of code is not a good measure of productivity, because it doesn't distinguish between code that needed to be written, and code that didn't. It also neglects the important front-end processes that ultimately (unless taken to ridiculous extremes) compress schedules and make each line of code more efficient.
  • at school they teach us c++, although in some of the higher classes (e.g. operating systems) c seems to be more useful.

    i prefer programming in c for recreational stuff as well. i have also had jobs that required c programming. almost everything for my current job is pl/sql based (a pretty horrid crossbreed between ada and sql)

    definitely you should be comfortable with c. not necessarily good, but at least competent. also learn perl. while i have never done much commercial work with it, it is an invaluable tool for quick fixes, and fast development
  • I agree, measuring LOC is foolish; it promotes the practice of writing poor code or bloatware.

    LOC is a meaningless metric; the only folks who use it are the idiots in management who don't understand SFA about code.

    Which is better code?

    main( int argc, char **argv )
    {
    int i, x;

     for( i = 0 ; i < argc ; i++ ) {
      x = ( 80 - strlen( argv[i] ) ) / 2;
      printf( "%*.*s%s\n", x, x, "", argv[i] );
     }
    }

    ...or

    main(
    int argc,
    char **argv
    )

    {
    int i;
    int x;
    int n;

     for(
      i = 0 ;
      i < argc ;
      i = i + 1
     )
     {
      x = strlen( argv[i] );
      x = 80 - x;
      x = x / 2;
      for(
       n = 0 ;
       n < x ;
       n = n + 1
      )
      {
       printf( " " );
      }
      printf( "%s", argv[i] );
      printf( "\n" );
     }
    }

    More lines of code does not make better code. I can crank out more lines of code just by screwing with the whitespace and formatting, as I did in the above example.

  • Ugh... you code in Scheme for *pleasure*? I didn't think those two words could go together...
    gads that language sucks roadkill
  • Mostly Perl. A dab of C here and there and I once took a class in Pascal (talk about the most useless class I've ever taken .. worse than Human Sexuality).
  • Objective-C and Java are the languages to use for object-oriented geeks like me.

    How come almost nobody here seems to be using Objective-C?
  • LOC is no way to rate quality. The same could be said concerning all areas of our workforce. Are America's ditch diggers any worse than the rest of the world's because they shovel less dirt? No, we use cranes, bulldozers and such. Better tools make for less effort and better quality.

    ----------------

    "Great spirits have always encountered violent opposition from mediocre minds." - Albert Einstein
  • I've written 50K lines of code in a little less than four months, in a 2-person team. My coding speed is 500 lines a day in a coding run, and I've seen people who are better than me.

    Admittedly, it takes longer to debug everything, but those numbers above have to be bogus. If Rob can write 15K, slashdot, study and party, then it's not programmers that are the bottleneck, but the managers.
  • this poll reminds me of steve ballmer talking about m$ and the time they had with ibm & os/2 (Triumph of the Nerds, R. Cringely), where the ibm suits ranted about KLOC (thousand lines of code) and how they would pay m$ per KLOC

    it ain't about the amount of lines of code you generate. programming is more akin to writing haiku than, say, telephone books, so those in the high KLOCs are either very fast programmers writing very big systems or telephone-book coders featuring up their next release software... sound familiar?


  • At my prior job, an Indian company was outsourced to re-write the company software for a migration off a Wang system onto an AS/400 with client/server support for PC systems. I had a few problems trying to make revisions to the code that definitely lowered my productivity:

    • variable names were ambiguous (for an English speaker, at least), which made it hard to follow the programs
    • spaghetti code was rampant, GOTOs everywhere. It reminded me of my days on the Commodore 64. At least on the C=64 you had to use GOTO
    • entire sub-routines were routinely disabled by placing a GOTO as the first line of the sub-routine, where the destination was the end of the subroutine. This not only had the effect of inflating the lines of code written, but it also made me less productive by making the program much harder to follow (all the calls to the null-subroutine were still active)

    I suspect another issue is that American companies in general have been computerized longer than companies in other countries. This would imply that more work in the USA would be done on maintenance than on developing new code. Maintenance work tends to produce fewer lines of code due to the time spent analyzing the program before changes can be written and applied.
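    A hypothetical C analogue of those GOTO-disabled subroutines (the original system was an AS/400 conversion, not C): the dead body still counts toward LOC, still turns up in every search, and still has to be read by the next maintainer, but it never runs.

    #include <stdio.h>

    /* Everything between the goto and the label is dead code,
     * yet it inflates the line count and confuses maintenance. */
    static void update_exchange_rates(void)
    {
        goto done;                      /* "disable" the routine */

        printf("fetching rates...\n");
        printf("recalculating ledgers...\n");
        printf("posting adjustments...\n");

    done:
        return;
    }

    int main(void)
    {
        update_exchange_rates();        /* callers remain wired up */
        return 0;
    }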

  • Actually, I'm incredibly lazy. Or to put it more succinctly, I am an incredible slacker (all hail Bob!).

    And my laziness is one of the reasons I'm a great programmer, both in implementation and in GUI design. I hate (absolutely abhor) doing anything twice. Three times is my max. When I have done something, codewise, 3 times (with small variations), I immediately yank it out and make a new method or class or (C++) template.

    "Europeans tend to have a more disciplined, engineered approach to software development. I think we can carry some of that philosophy over to the U.S. without making people feel like they're being boxed-in"

    Software "Engineering" is still something of a misnomer. Software in many respects is still a craft, and discipline is efficiency of design before it is efficiency of effort. If your mind is disciplined, you analyze the problem completely before you code, and your code will be more efficient because of the design (in theory).

    "lean more on using packaged software than we do in the U.S...."

    I take it this means that they count code generated by the code generators in common GUI-building IDEs as "developed code"? Also a major fallacy, as we all know.

    Proud to be a slacker...
    "You lack slack, Jack!"
  • I figure the full text, commented out at the end, will add to my productivity immensely! But seriously, haven't we gotten past the "more lines of code are better" mentality of the olden days? From what I can tell, American programmers aren't better or worse than other programmers all over the world, so why do we need this ridiculous metric to tell us what to do?
  • by DLG ( 14172 ) on Thursday April 15, 1999 @01:02PM (#1932235)
    I don't really know why it would be American versus the rest of the world or whatever, but I have always considered laziness a virtue. Generally it motivates people to produce time/energy savers such as cars, compilers (I know, some of us just love to handcode stuff in whatever our favorite executable format is, but personally I dig how much C and Perl accomplish...) and toilets.

    I cannot even vote for how many lines of code I have written. Probably 100 different projects between 10 and 1000 lines of code, but most of them really involve cutting and pasting. Hell, if I wasn't so lazy I would probably be using more modular code than I already do and thus write even FEWER lines of code.

    I expect, on the other hand, that they didn't actually ask any of the Microsoft OS developers. From what I understand they write kajillions of lines of code, as they have to rewrite the OS over again every time they make a revision.

    Whereas with Linux, generally you take someone else's code and rewrite the important differences.

    So is Open Source's real value the way it supports laziness? I mean, there are all sorts of projects I have done where I have looked for appropriate open-sourced code before hand-coding. Is this wrong? I would say that a real measurement of productivity would be how much a single programmer could accomplish with the fewest number of lines, but since the majority of the lines of code written by US programmers are actual productivity wasters like games, it is somewhat ironic...

    Productivity is measured by how much work an individual can accomplish in a consistent amount of time. The notion that writing more lines of code means getting MORE done in less time is far-fetched.

    Lazy=good or we would still be hunter gatherers.


    DLG
  • Accurate answers:
    Sex: As often as possible. Once a year. Both true.

    LOC: 1000. Also true.

    Oh, wait. I guess that first question didn't actually say "...with someone else?" If we count Unitarian worship, 500 to 1000 per year.

    Pretty pathetic, huh?

    Despite all prior comments, I KNOW you're all laughing at my smaller LOC.

    Bastards.
  • Didn't we just go through this kind of idiotic misreporting and skewing of data with the NT/Linux benchmarks?

    As some of the above posters have mentioned, number of LOC is an extremely foolish way to track programmer productivity. The fewer lines of code used to preform a task the better the end product will preform (generally). I mean if they really wanted just mass LOC why not have everyone slapping programs together with Delphi and we'll all have 6MB executables for our terminal emulators.

    Besides, if you go to a descent school for CompSci, you should have the idea of less is more drilled into your skull. I dunno it just irrtates me that someone who manages 3-4 Access DB's (little one's with maybe 3 users and a couple thousand pieces of data) gets paid $60K-$120K a year while I'm building a PHP/MySQL backend for 20-30 different DB's in my spare time between trouble shooting the main IS dept's networking f*ckups AND dealing with all the user issues, for a whopping $25K per year. It's this same unbalanced kind of crap that can cause sloppy coders (who generate more LOC) to be promoted/given raises over the more effiecient ones. Heck that's one of the reasons that we're losing the only good people in our main IS dept. The MIS higher ups are making idiotic Dilbert-esque demands on the IS folks and the talented ones are getting so frustrated that they're leaving. Which means that the IS dept is gutted and people such as myself (in another dept.'s IS) wind up having to pickup the slack and double/triple our workload. Meanwhile management hires consultants at $130-$300/hour to do crap like setup new PC's. And the crowning jewel? The consultants screwup constantly and I wind up having to drop everything and go do it right.
    *rant* *rant*

  • Speaking as a member of my friendly neighborhood metrics team, LOC can be as good as any other measure. It depends how you use it, of course.

    I agree completely that evaluating someone based on how many lines of code they have typed is really stupid. But I have trouble believing that many functioning companies use LOC in this manner. The correct use of metrics, such as size (indicated by lines of code or function points or whatever), is in generating estimates and tracking actual progress to those estimates, so that the project manager can tell if the project is on schedule. In this sort of application, LOC are generally counted based on the code affected by changes - that could include new code, modified code, or even deleted code. This number is then cross-referenced to historical data / estimated data (depending on use) to allow analysis.

    Using lines of code, or any other metric, to evaluate an individual is strictly forbidden by my organization, and hopefully most others. Once you start evaluating individuals based on metrics, you can throw out all hope of gathering useful information, because everyone will be covering their asses.
  • Those 8,000 lines were almost all Perl. How many lines of C to do the same thing?

    Feel free to multiply by 5
  • It seems stupid to base productivity on the number of lines you code. Just look at almost any M$FT product: they have an incredible line count, but not many of them are very productive. If line count were a measure of how good a program is, then all those bloated M$FT programs would perform really well. When I was in high school, our programming teacher told us to try to find an algorithm that was the best solution with the least amount of code. So if coding fewer lines makes me a lazy American programmer, then so be it.
