Programming IT Technology

What are Your Programming Goals? 350

Crutcher asks: "I've been walking the murky path to one day becoming a Systems Programmer, and I was wondering what other not-yet-gurus like myself saw as their ultimate goals, and why they chose those fields. Do we all want to hack kernels, or do more of us want to be UI Gurus, or Daemonic Masters? It would help if we (the proto-gurus) had a clearer understanding of where we could go, and what it takes to get there." This is an interesting question. I've discovered, however, that the more I learn about coding (including new paradigms and languages), the more my goals have changed. What are your thoughts on this?
This discussion has been archived. No new comments can be posted.

What Kind of Programmer Do You Want to Be?

Comments Filter:
  • by Anonymous Coward
    A GOAL would be a specific, obtainable target, or 'thing', e.g. "I want to build a desktop environment for Linux that works".

    A purpose is a reason for doing things, or an effect that is intended or desired. "Increasing my programming skills" is a purpose - it's an effect that is intended or desired.

    You've listed 6 (3/3) *purposes*, but no specific goal. What's the goal? Are you going to build something, or have a suite of apps under your belt, or start a new community-developed project? Set that GOAL, and orient your purposes around it.

    My advice is to set a GOAL ("Build an application that allows Internet users everywhere to share files freely and release that application to the 'net"), and orient your PURPOSES around accomplishing that GOAL ("Learn multicast programming techniques, encryption, participate in the FreeNet project, etc").

    This might result in better progress, overall. It might seem obvious, but sometimes the most obvious things in life are the things most often ignored.

    In the case of the original article poster, my advice would be to set a specific goal that interests you - some obtainable 'thing' or 'target', and then orient your purposes around it.

    For example, you mentioned that you want to become a Systems Programmer. What are some of the products that a Systems Programmer produces? Operating systems, device drivers, new techniques for memory organization, embedded systems, etc. (It's a big field, Systems Programming.)

    Saying that you "want to become the best Programmer you can be" is an admirable purpose, but it's a goal-less one. Instead, focus on what the actual "result" of that purpose is going to be - maybe you're interested in building a new operating system. That's your goal. It just so happens that in order to build a new operating system, you have to be a pretty shit-hot Systems Programmer ... so, by doing all that, you will have achieved your Goal (Linux!) and purpose (to become a great programmer) at the same time...

    Just a little advice from someone who would prefer to remain Anonymous for this post, but who participates in Slashdot frequently ...
  • by Anonymous Coward
    My main goal as a programmer is to get out of the computer field. Quite frankly, I've been disappointed in the industry, and I've been working in it for 8 years. I would recommend that you do a couple of internships in the real world where you'd be working, and make sure it's for you. I so wish I had done this when starting out. The best thing about the computer industry is that the pay is excellent, so far... but more and more companies (US) are hiring folks from India and other countries because they can pay them $25,000 a year to work. Also, I don't know what your gender is, but the computer industry has been extremely unfriendly to me (a female). If you're a guy, no need to worry about that. Anyway, just make sure it's what you want to be doing, because once you get your degree and have been out working for a while it's pretty tough to go back to school and try to switch, especially if you've got a family depending on you to bring home some cash.
  • by Anonymous Coward
    Given the other comments about setting goals - I do agree, although I do not have one myself.

    Algorithms and theory are the basic building blocks for writing better software. Programming by itself is experience and manual labour. One of my major problems now is time. It is often just not possible to work through a textbook, and of course more often than not there is no-one around to ask.

    A good algorithm can make an application twice as fast as a simple version. At the same time, you might spend weeks optimizing and gain next to nothing.
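    As an illustrative sketch (not from the original post, and with helper names invented for the example): counting comparisons in a naive linear scan versus a binary search over sorted data shows the kind of win a better algorithm buys before any micro-optimization.

```c
#include <assert.h>

/* Linear scan: O(n) comparisons in the worst case. */
static int linear_search(const int *a, int n, int key, int *cmps) {
    for (int i = 0; i < n; i++) {
        (*cmps)++;
        if (a[i] == key) return i;
    }
    return -1;
}

/* Binary search on sorted data: O(log n) comparisons. */
static int binary_search(const int *a, int n, int key, int *cmps) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        (*cmps)++;
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1;
        else              hi = mid - 1;
    }
    return -1;
}
```

    On a million sorted items the linear scan can burn a million comparisons where the binary search needs about twenty; that gap dwarfs anything weeks of tuning the linear version could recover.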

    If you are somewhat brainy, then the day-in, day-out stuff will not take that long and, as I said before, experience does matter, but it is something that takes time and cannot simply be taught.

    What else to say - have fun, and no long-term goal. Try to be better than you are, never be satisfied (as there is always room left for improvement), and do not make compromises. Your customers pay you 100% real money, so they deserve a 100%, rock-solid solution. If they do not pay you because the software is free, then they trust you, and that trust deserves to be honoured, full stop. I am not interested at all in whether you can achieve it or not, but if you are not willing to try - go away. If you are willing - there is nothing but a piece of code one can be proud of.

    Last but not least - do not overdo it. 12- to 14-hour sessions are nice and make you feel like a terrible nerd, but quality, and with it satisfaction, will degrade in the end. So do not burn yourself out by trying to achieve everything; you cannot. Too many new and interesting things are coming up all the time, and yes, I like this job 8)

  • I've learned while doing a variety of work that I love streamlining processes. Every time I make somebody's job a little easier, it's rewarding. My current goal is to simplify the jobs of our operations department as completely as is possible, by improving the software used to configure and monitor our network of servers.

    Your goal is to become more proficient in your craft, and to make your work more efficiently accessible to people. You say that these are values, not goals, but that's semantics at best.

    Goals are important; the problem is that some people have bugs in their goals. They define success as a static condition. As long as you realize that success, and the goals associated with it, are by necessity dynamic, goals are incredibly useful, if only to help stay focused.


    ----------------------------
  • by joey ( 315 )
    The more I think about this, the more I feel my really long-term goal (10 to 15 years) is to find something to apply my skills to, outside the realm of just plain computers.

    It's easy to lose sight of this, but computers are tools to make it easier to do certain types of work. When you find yourself devoting all your time to improving the tools, and never actually using them except as a way to build better tools, something is out of whack. Especially when you consider that software is moving so fast that your improvements will likely be inconsequential in a few years.

    This is why I admire people who use computers -- and program -- as a tool to accomplish some larger goal, be it modeling supernovas or helping people in the third world. My goal is to find something that interests me enough so I can move on from improving my already excellent tools, to actually using them.

    Unfortunately, I haven't had much luck so far, but I'll keep looking...
    --
  • I wrote cute things in basic on an Apple //e because it was fun. My goal was to enjoy myself. Enjoyment led to games, and games led to machine language. An insurmountable obstacle for the limited resources available to me. I had the assembler, but no documentation. Stuck, dead in the water, for years.

    I stopped programming for a while, then I moved on to a WinTel box and discovered Demos. I was awed. I wanted to do polygons. 3D was my goal. But, WTF is this? DOS/Windows doesn't come with any real development tools? I have to PAY for languages? Oh well, I got a job and went shopping for an assembler (because everyone knows all the coolest demos were written in assembly). The salespeople were stunned... "Assembly language? We don't stock that. Here, this box says it includes a mini-assembler"

    I went home with Borland Turbo C++ 3 for DOS. I had to learn C just to create the framework for my assembly programs to run in. By the time I'd learned enough, I discovered that developing for DOS is a pain in the arse. I dunno... It just stopped being fun around the time I figured out that to get anything really cool on the screen required practically writing your own device drivers. Dead in the water again.

    But with a kicker... Someone was willing to pay for my talent. I started writing windows software and was spoiled by frameworks. I lost my interest in writing "cool" stuff. My new goal was to write GOOD stuff.

    At about the same time I got fed up with Windows, I rediscovered Linux. I'd given it a go once, but it didn't seem "ready for prime time". Now, it was almost ready and seemed worth the effort. It was a bit longer before I ventured to write any code on a linux machine.

    As it stands, I do very little linux development and a lot of windows stuff. I'm heavily influenced by the "UNIX way", and I'm constantly aggravated by the need to make Win32 calls to get any real work done.

    I'm impressed with the quality of the libre compilers, but I'm a bit underwhelmed by nearly every other aspect of linux development. At least the documentation never lies to you on a unix system.

    --Threed

    The Slashdot Sig Virus was foiled before it could spread.
  • I'm what you could call a systems programmer: I write a lot of device drivers, write and/or graft IP stacks into embedded OS kernels, etc. Those applications (as opposed to system level stuff) I do write tend not to have GUI interfaces for human input, but rather process input from sensors that detect things like cars going through tollbooths, or move network packets around. There's never much other software between my code and the hardware, and I like it that way.

    How did I get here? Two influences come immediately to mind: my first computer, a Commodore VIC-20, and my high school's VAX 11/750. The Commodore was limited, but you had complete control of the machine, and if you wanted it to do more than just make cute sounds and display characters on a coloured screen you had to learn everything about it. Today's more complicated and powerful machines give you a nice ready-made front end so that graphics and such are comparatively easy to do, and there is speed to spare so you don't have to be as careful with your code. On that simple machine, if you wanted something done you had to talk to the hardware.

    The other big pre-university influence, the VAX, was enormously powerful compared to the PCs of the time, but couldn't do real graphics on the VT100 terminals students used. Flashy games and such weren't really the way to go, but I did want to explore what the system could do. So I got into playing with processes, IPC, learning assembler, etc. I also learned a lot about a very complex and powerful OS and how it worked. By the time I got to university, I found working "inside" systems was a lot more interesting -- and generally easier -- than developing user-oriented stuff.

    That interest spurred me into getting a co-op job with a printer company, where I did real systems work on controller OSs (we called them "executives"). And that experience got me into a job doing different sorts of embedded and realtime work, but almost always at a systems level. I haven't always single-mindedly focused on this sort of thing - I did graphics and databases in school and some amount of GUI and RDBMS work professionally - but it was never as fun or satisfying for me as the systems stuff. With systems work, you still have that sense of really controlling the machine that you just can't get when you have to ask somebody else's OS or GUI APIs to do work on your behalf.

    And that's how it went.
  • It's worse than that: with full knowledge of the problems with gets(), Stroustrup gave us istream::operator>>(char *) with _exactly_ the same problem, allowing a new generation of programmers to program their buffer overruns in an object-oriented language.
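    For the C side of that complaint, a hedged sketch of the failure mode and the fix (read_line is a name invented here, not a library function):

```c
#include <stdio.h>
#include <string.h>
#include <assert.h>

/* gets() reads into `buf` with no idea how big `buf` is -- any line
 * longer than the buffer silently smashes whatever lives next to it.
 * (C99 deprecated gets(); C11 removed it from the standard library.)
 *
 *     char buf[16];
 *     gets(buf);    // classic buffer overrun waiting to happen
 *
 * fgets() takes the buffer size and stops reading in time: */
static void read_line(char *buf, size_t size, FILE *in) {
    if (fgets(buf, (int)size, in) != NULL) {
        buf[strcspn(buf, "\n")] = '\0';  /* strip the trailing newline */
    } else {
        buf[0] = '\0';                   /* EOF or error: empty string */
    }
}
```

    The istream::operator>>(char*) overload has exactly the same shape as the gets() call above: a destination pointer and no length, which is why it earned the same criticism.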
  • To quote the standard:
    [#1] The function called at program startup is named main.
    The implementation declares no prototype for this function.
    It shall be defined with a return type of int and with no
    parameters:

    int main(void) { /* ... */ }

    or with two parameters (referred to here as argc and argv,
    though any names may be used, as they are local to the
    function in which they are declared):

    int main(int argc, char *argv[]) { /* ... */ }

    or equivalent;[8] or in some other implementation-defined
    manner.
    That said, some compilers will allow main() to be defined to return void and still have a defined termination status (FWIW, gcc is *not* one of them. Declaring main() to return void will leave you with undefined behaviour under gcc).

    The point is, declaring main() to return anything other than int on a hosted environment is not portable and, most importantly, not useful. If you disagree, I suggest you follow up at comp.lang.c and see what kind of reaction you get.
  • Everybody always asks me about my goals, future plans, all that stuff. To be honest, I don't have any.

    Since about half way through college, this whole hacking/coding/systems thing has just been one fun game. I wrote code and learned what was in front of me, and it all was fun. I dropped out of school because the start-up let me play with computers more. There are hackers I work with who know more than me, so perhaps it's a goal to learn what they know, but I'm honestly just curious.

    So here's my advice about having goals. Don't. Do what you love, what interests you. There's no need to plan ahead that much if you enjoy what you're doing now. Financially, of course, plan ahead, invest wisely, all that jazz. But as far as a career is concerned, forget planning. Things change so fast, just enjoy what you are doing. If you love to learn and discover things as much as I do, it all will follow.
  • Okay not mine - I actually like it for the challenge - but a lot of people's [latimes.com]. A pretty interesting story nonetheless, so even if you hate my stylish implicit linkage check it out...

    --
  • Mastering programming is only the first goal. There are others that I've run into in my time as a professional coder.

    Become a good public speaker. I've done 3 speeches for different conferences now (2 technical, 1 for a business audience). I'm constantly amazed by how much recognition and advancement you get. Plus, it's fun once you get past the nervousness.

    Become a good man-manager. Different companies have different names for it; I'm a "project lead" at my company. It is a good thing for yourself and your career to be able to manage people well. Management is about more than just setting schedules and budgets. You need to be able to judge the amount of work someone is able to do, fix any interpersonal problems between team members, notice when a person is bored or overworked, advise your superiors on new programs/incentives, etc. The benefits are more pay, better control over your working environment, and (if you're good) a more pleasant time.

    Prophecy. This is a tough one, but working on it helps. The better you are at predicting the future, the more fun and profitable your work as a coder is. If you can tell, for instance, that a new technology is going to take off, you can prepare for it and be ready. If you can tell what your company is going to do, you can be ready. Most often this means leaving a company before the going gets nasty, but often it means starting a new division before the competition (and thus, a better title and pay for you).

    -Dave
  • Comment removed based on user account deletion
  • Comment removed based on user account deletion
  • Get a degree in computer science. While a degree does not a programmer make, it should certainly help give you the theoretical background you need to become a good programmer. Plus it will help you get a good job.
  • I'm not creative enough to be a UI hacker (though I've read the Human Interface Guidelines from Apple and used several different types of GUI/CLIs, and I have a built-in aesthetic of what seems "right" about an interface)

    Sounds like you'd make an excellent UI hacker; that built-in aesthetic is essential for building good UIs. Creativity in UI design is not always a good thing. See the Interface Hall of Shame [iarchitect.com] if you don't believe me.

    29? You're way too young to be bitter and disenchanted. Don't let those hordes of 22-year-old millionaire dot-commies try to persuade you otherwise. I was once in very nearly your position, except that I did finish school (with a History degree, which is not much more useful than having not graduated). Did my time in tech support and took a bunch of classes, and was able to hang up my headset for good and go into development when I was, well, 29. Making that transition was greatly facilitated by finding a position that combined my support and development skills, supporting a commercial API library. This is more or less the standard route from support into development, at least in my experience.

    And the advantage of getting older is that people pay less attention to the Education section of your resume (which belongs at the bottom, incidentally).
  • "I don't know what hardware your EE was blaming the lighting on, but on some systems it really does matter."

    Fair enough. But in software, we have error handling. In hardware, we have shielding. In both, precautions get short-changed as deadlines approach, or projects go over-budget.

    Neither hardware nor software developers are really that much better in this, except that since hardware bugs tend to be so permanent, there are fewer of them (because people care more about them), and the ones that are left are much more annoying (because they won't go away). Pretty much a wash, if you ask me.

    phil

  • Administrative Assistant - a comatose, gum-chewing zombie


    --

  • After that, my goal is to return to and spend seven years in a monastery in the outskirts of Kathmandu [kathmandu.com] studying the Dharma [hindu.org], emulating the Bodhisattva [geocities.com], finally attaining a degree in Comparative Religion [harvard.edu] from Harvard, followed by enlightenment, metaphysical realization and entering Nirvana to host a global video conference with the late Alan Watts [alanwatts.com], the Dalai Lama [mountainmadness.com] and Timothy Leary [leary.com].
  • My goal is to write a universal 'bullshit detector' that can apply the derivation rules of logic from 'Principia Mathematica' to first principles of metaphysics and the grand unification theorem, creating all basic sciences: nuclear-plasma physics, astrophysics, chemistry, biology, medicine and then finally the psychology of human sexual behavior. This ambitious project, on hold awaiting govt funding, entails a self-maintaining massive database of 'truths' derived from said fundamental principles (under human direction as to 'relevancy' to human needs - sort of a trusted derivable encyclopedia), with the cognitive ability to categorize any random statement as TRUE or FALSE in a reasonable amount of time by cross-correlating it against the massive database as logically derivable from self-evident axioms or not, complete with references to prior research, a bibliography and the derivation path (for human checking of the logic) - sort of like an automated library research assistant, or like a web search engine capable of producing only 'true' information, while all unfounded, non-derivable speculative ideas and plain ol' B.S. are weeded out as 'false' or at least 'not yet proven', particularly the system's self-referential 'Gödel' theorems.

    I'd like to implement this with a massive cluster of superconducting RTX-2001s [cmu.edu], or better yet the PSC1000 [ptsc.com], using a FORTH microcode as the propositional logic processor, but I haven't decided on the high level language yet. Even that physical implementation probably needs updating.
  • What do you mean by program invariants and proofs? If you are talking about proofs of correctness, then Lisp should be right up your alley. It's relatively easy to write up a rigorous proof of an algorithm (e.g., quicksort), which is completely impossible in an imperative language like C.

    Oh, and BTW, in Lisp you don't usually have to deal with the sort of C nonsense you pointed out before.

    Which isn't to say C doesn't have a place---it most certainly does, in systems programming---but the fact remains that it's ridiculously hard and time-consuming to do all the idiotic bookkeeping it requires for application programming.

  • Experiment, play, find out for yourself.

    I've changed my mind many times over the years. First I wanted to do games, but I found myself really enjoying learning the intricate details of how things worked inside. I taught myself all about the Atari internals, and was even going to get into the intricacies of some of the programmable signal generators and the tape drive before I discovered the ST.

    The ST had languages like Pascal and C to learn, and by the time I had learned those, I had discovered Unix, and its wealth of things, including sockets. That led to distributed computing, then asynchronous I/O. In the meantime, I taught myself about X and C++, and wrote a thin C++ wrapper around XLib (that some idiot deleted a long time ago to save disk space on a gigantic writeable CD jukebox that wasn't anywhere _close_ to full). I then delved into middleware, and have learned some things about CORBA and Java. I've recently become interested in the Linux kernel...

    Basically, anything that presented itself as an interesting problem is what I learned. That approach has served me well, and I have a fairly broad base of knowledge. There are useful things to learn at almost all levels.

    I will say that experience has taught me that I'm generally not interested in the UI/Application level, because what people want changes so quickly and time-to-delivery is so short that I'm rarely allowed to exercise the attention to detail that I pride myself on in my other work.

  • Comment removed based on user account deletion
  • I want to be good at solving problems. I want to be the guy that people call when all of their IT guys get stumped. I want to see a BIG RED speed dial button on the telephone of every manager with whom I work that says "KANO" on it.

    I am not nearly that good yet, but I'm working on it. I want to be the guy who gets called to testify before congress when some IT related legislation is under consideration. "Yeah, Kano, this is Senator Lott again. We need your input on the new internet regulations package that we're working on. Oh, by the way, my laptop is running great now, thanks."

    That's my goal. I want to learn as much as I possibly can given my potential and need for a real life outside of work.

    LK
  • When I was a young(er) aspiring programmer, I wanted to know how to program the Roland MPU-401 MIDI interface, in intelligent mode. I wanted to do this in DOS, so of course I would be writing the driver for it myself.

    This would have been impossible for me to accomplish if it had not been for a gentleman on Compuserve, who worked for a company called MusicQuest (that happened to make a compatible device).

    He pointed me to where I could find a technical reference manual on the hardware, and answered the questions that I had. I was able to understand the technology in a little while and create a program that actually worked.

    Moral of the story: My goal wouldn't have been achieved if someone didn't help me out.

    So after you find out what your goal is, find someone who knows.
  • I mean, to save the world.
  • When I got out of college, I didn't have a CS degree. Although I had done a little hacking in C, Emacs-lisp, and perl, I didn't think I had the skill required for an entry-level programming job. (Now that I have a closer view of the software industry, I think maybe I was too hard on myself, but I digress. :-) So ... a few years after graduating, I started working for a hospital as a tape transcriptionist.

    I did my transcription on a DOS machine running WordPerfect 4.1. My predecessor had defined some WordPerfect macros for commonly-used medical terms. After a while on the job, I wondered: "With so many medical terms being used here, what is the optimum set of macros to use for my job?"

    And then I thought: "I have a computer. There must be a way to solve this problem with a computer program."

    So I copied a week's worth of typing into one file, brought it down to BU (where I was going to graduate school), and after some hacking with C, emacs, and WordPerfect's macro language, I had a few hundred macros.
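    A sketch of how one might attack the "optimum set of macros" question: tally term frequencies over a sample of typing and rank them, so the most frequent terms become the macro candidates. The names here (tally, rank_terms) are invented for the example, and WordPerfect's macro format is not modeled.

```c
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>

#define MAX_TERMS 10000
#define MAX_WORD  64

struct term { char word[MAX_WORD]; int count; };

static struct term terms[MAX_TERMS];
static int nterms;

/* Bump the count for `word`, adding it on first sight.
 * Linear lookup is fine at this scale. */
static void tally(const char *word) {
    for (int i = 0; i < nterms; i++)
        if (strcmp(terms[i].word, word) == 0) { terms[i].count++; return; }
    if (nterms < MAX_TERMS) {
        strcpy(terms[nterms].word, word);
        terms[nterms].count = 1;
        nterms++;
    }
}

static int by_count_desc(const void *a, const void *b) {
    return ((const struct term *)b)->count - ((const struct term *)a)->count;
}

/* Split the stream into lowercase alphabetic words, tally each,
 * then sort so terms[0] is the best macro candidate. */
static void rank_terms(FILE *in) {
    char word[MAX_WORD];
    int len = 0, c;
    while ((c = fgetc(in)) != EOF) {
        if (isalpha(c)) {
            if (len < MAX_WORD - 1) word[len++] = (char)tolower(c);
        } else if (len > 0) {
            word[len] = '\0'; tally(word); len = 0;
        }
    }
    if (len > 0) { word[len] = '\0'; tally(word); }
    qsort(terms, (size_t)nterms, sizeof terms[0], by_count_desc);
}
```

    Run over a week's worth of transcription, the head of the ranked list is exactly the set of terms worth turning into macros.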

    Now I'm working as a technical writer, and there are several repetitive tasks at my job that make me think: "There must be a way to solve these problems with computer programs ... in fact, there must be a way to write tools that will make many of these problems easier to deal with." Unfortunately, most of these problems take more than a few days of hacking to solve; fortunately, my manager says that after I deal with the projects currently on my plate, I'll have more time to focus on coding tools to help out our group.

    So, in the short run, I want to identify the repetitive information-processing tasks in my life, and treat them as opportunities to capture the repetitive aspects in code, rather than endure them as part of life's drudgery. And to the extent that my employer permits, I want to share that code with other people.

    In the long run, I want to help other people do the same thing with the repetitive tasks that dominate their lives, find out why so many people either ignore or misuse the programming tools that are already available to help them, and use that information to make better tools.
    --
    "But, Mulder, the new millennium doesn't begin until January 2001."

  • Note that the "some other implementation defined manner" thing buys you nothing; there's no promise that such a manner remains defined, and there are no implementations I know of that actually promise anything...

    (And, of course, most of the people doing this are still using C89 compilers, and C89 didn't have the allowance for another format.)
  • Well, C99. C89 doesn't define the return status without an explicit return from main or call to exit(). (In fact, one of the arguments for fixing this was that the old wording was that the termination status was "undefined", but only behavior can be undefined, and the intent was clearly not that the behavior be undefined if you fell off the end...)
  • envp never got blessed. POSIX says

    extern char **environ;

    and says nothing about "envp".
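    A minimal sketch of what that means in practice: walk the POSIX-blessed environ instead of relying on a third main() parameter. The lookup helper is invented for illustration; real code should just call getenv().

```c
#include <stdlib.h>
#include <string.h>
#include <assert.h>

/* POSIX specifies `environ`; the envp third parameter to main()
 * never made it into the C standard. */
extern char **environ;

/* Find NAME in the environment by scanning environ directly --
 * a sketch of roughly what getenv() does for you. */
static const char *lookup(const char *name) {
    size_t len = strlen(name);
    for (char **e = environ; *e != NULL; e++)
        if (strncmp(*e, name, len) == 0 && (*e)[len] == '=')
            return *e + len + 1;   /* value starts after the '=' */
    return NULL;
}
```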
  • "void main is perfectly valid", he says.

    Oh, really?

    Citation, please? And if you tell me it's from a Schildt book, I'm not even going to respond, I'm just going to laugh at you.

    You've seen a quote from the standard in this thread. It's a FAQ for comp.lang.c.

    Also, it is *NOT* an error to fall off the end of main after declaring it type int. That code is *REQUIRED* to compile correctly. A compiler may warn you (and indeed, should!) but *MUST* accept the trivial program:
    int main(void) {}

    In C89, that returns random nonsense; in C99, it's successful.

    (Hint: Before you flame me, read my web page.)
  • Uhm. "int main".

    Not "void main".

    Thank you.

  • I mostly do tools that make my life (read "work") easier, and I also cheat at video games. That's how I got started, and it's still the most fun part of programming.

    Hack for hacking's sake. You'll find a project you want to do that no one has thought of. Don't try to pick a field. Play in them all.
  • Long ago, when I was playing on a Mac SE in China, I had a chart of 68k instructions. So, one day, I decided that Wizardry I needed to give me more hit points, and dammit, I was going to make it give me more hit points. (You started with 8 in Wizardry, and I wanted 12, because I was used to hack. Stupid reason, I know.)

    So, I searched through code segments until I found a pair of move-immediates moving 8 into two adjacent locations, and I changed the 8s to 12s, and it worked. I did a few other binary patches. I once did myself a custom-tuned version of "hack" where all the monsters were 2x-3x tougher than they are in standard hack. Interesting point: It gives you more XP's if you make a monster tougher.

    That's how I learned to program. It never occurred to me that you weren't supposed to interact with binaries that way. ;)
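    That patching trick can be sketched as a byte-pattern search and replace over an in-memory buffer. patch_bytes is a made-up name, and real 68k patching would match the actual move-immediate opcodes, which this sketch does not attempt.

```c
#include <stddef.h>
#include <string.h>
#include <assert.h>

/* Scan `code` for every occurrence of the byte sequence `pat` and
 * overwrite it in place with `rep` (same length).  Returns the number
 * of sites patched -- the same idea as hunting down the pair of 8s
 * and turning them into 12s. */
static int patch_bytes(unsigned char *code, size_t n,
                       const unsigned char *pat,
                       const unsigned char *rep, size_t len) {
    int hits = 0;
    if (len == 0 || len > n) return 0;
    for (size_t i = 0; i + len <= n; i++) {
        if (memcmp(code + i, pat, len) == 0) {
            memcpy(code + i, rep, len);
            hits++;
            i += len - 1;   /* skip past the patched site */
        }
    }
    return hits;
}
```

    The big caveat, then as now: a short pattern can match bytes that only look like the instruction you meant, so you eyeball each site (or use a longer pattern) before saving the binary.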
  • I wouldn't say you can *ignore* C89; you have to expect a lot of systems to be using it for a few years yet.

    Anyway, the issue is, many systems already returned "success" automatically, and it's painfully easy for a compiler to do; gcc already had the code, and it basically makes up for one of the most common problems newbies ran into. It also encourages compilers to shut up about falling off the end of main, so people don't get the mistaken idea that main can return void in portable code.

    (For the record, Schildt is an "observing" member, which means he pays dues and never shows up at meetings, and it's unclear that he "observes" very much.)
  • Osborne once offered me $200 to do a technical edit on the entire _C: The Complete Reference_. I told them it was way too much work for the entire book.

    I have an 8-page fax from Schildt "defending" void main, based entirely on not reading the definition of "undefined behavior".

    :)
  • All too often I find myself doing things and looking for information that I need, knowing how much simpler this could be if I could just tell my computer: "Get X" or "Do Y". This is especially painful to me since I've done every one of these actions before, at least once. Saving and repeating them shouldn't be too difficult, huh?

    This problem bothers me a lot since I like helping people (including myself) -- and most people I know can speak English at a sufficient level to state clearly what they want. All that is needed is somebody or someware (program) to understand them, and to help them.

    So what I want to do is write a quick-and-dirty Natural Language parser.

    gasp from the audience

    I know the problem is considered to be impossible, or at least as difficult as creating an AI. Personally I believe that the two tasks complement each other, and when one is truly solved -- so will the other be. (Mostly because of the 'learning' abilities that either program must have.)

    That's my life's goal (at the moment, and it has been so for the last 5-7 years at least!) I may even accomplish it one day, who knows?

    --

  • I know others have said it, but here it is again: diversify. By that, I don't just mean you should go and learn a bunch of similar languages (*cough*C++*cough*Java*cough*Eiffel*cough*). I mean broaden your horizons. Take some time to study things - languages, paradigms, ideas, fields - completely unrelated to your current field.

    As for languages - well, ESR says in his page that every good programmer should at least get acquainted with Lisp, Perl, Python and C. I don't disagree (except perhaps WRT C). (If you've been taught Lisp/Scheme improperly and, as a result, now hate it, give it another try, using a more free-form approach, and in a good environment - DrScheme [rice.edu] is good, and, besides the regular Windoze, MacOS and Unix releases, there's even a distribution that uses OSKit to make it an actual FreeBSD-compatible stand-alone OS!)

    Other languages I suggest: Haskell and ML (both functional languages with more "traditional" syntaxes than Lisp; Haskell is a pure functional language), Prolog (another excellent idea with a terrible reputation due to being mis-taught), Smalltalk and Self (both pure object languages; Smalltalk is pretty much the father of modern OO, and Self is its prototype-based - i.e., classless - descendant), APL (yes, APL... it's very remarkable!), and various assembly languages, most notably for the PowerPC and the Alpha.

    As for paradigms... well, don't get too attached to them. As you get some experience with various languages, you'll find that paradigms are only "right" as long as they're useful. More specifically, you'll have developed your own sense of the Right Thing in programming, your own view of what programming should be like, and you'll see how the good ideas in each paradigm fit into that. (For example, Brian "water" Rice is doing some very fine work on Slate [tunes.org], a language which, somewhat like BETA, integrates objects, components and functions on a fundamental level.)

    Also, never neglect the fundamentals. I'm talking about the theoretical foundations of computational mathematics (*): partial recursive functions, Turing Machines, etc. (Remember - don't be afraid of the math... the math is your friend.)

    Finally (and in relation to the former paragraph), sit down at whatever library you find which has a copy of it, and study Knuth's Art of Computer Programming. Despite some pitfalls, it remains one of the fundamental texts in the field.

    One last thing: go look at the long-standing Tunes [tunes.org] project (here's an explanation [eleves.ens.fr] for the less enlightened, given that the project's leader has a tendency to verbosity and obscurity when writing). Also interesting is its Languages Review [tunes.org] page.

    (*) I refuse to use the term "computer science". But that's the subject of another rant entirely...
  • Back in the early nineties (1992 or so), before my first programming job, I set as my professional goal: to become a C 'language lawyer' (that is, the person on a programming team who understands the arcana of a given programming language). Seven years later I am working at NASA, with part of my job function being to provide C programming expertise to the other members of my team (I'm the only trained computer scientist on the team; everyone else is a physicist or mathematician). I have found that my goal was surprisingly easy to achieve and now need to formulate some kind of follow-up goal.

    As a professional in any field, you should probably be reevaluating your goals on a periodic basis, as well as keeping an eye on how far you have progressed toward your current goal. You are likely to be working in your chosen field for far longer than it will take to achieve any specific goal, so you need to be thinking beyond the immediately upcoming goal.
  • The important thing is to continue to learn and find new things that provide motivation and interest. Open source is invaluable in providing choices, a community, and a great knowledge base from which to learn. Without it, I probably would have given up programming after 10 years. With passion, one can do almost anything.
  • Consider your motivations and look for the path that will make you happiest.

    If curiosity is your motivation, then the best path to take is that which affords you the most opportunities to learn. If money is your motivation, find the fad-of-the-week and latch onto it with a deathgrip. Wash, rinse, repeat next week. :)

    Always reevaluate where you are and try to determine if you are happy, and whether your current path will lead to more, or less happiness.

    -JF
  • I think that it's important to start with the distinction between kinds of programming.

    There's Systems programming. There's user-space programming ("applications"). There's tool programming. They all overlap in places, but they're different arts.

    The more a programmer knows about the low level details, the closer they are to the metal, the better they'll be able to write good code. This philosophy pushes everyone towards writing assembler code. The more they know about systems programming, the better they'll be able to write good code. This philosophy drives people towards writing C code. But when you're building tools for others to implement user space programming, re-usability sometimes takes precedence over performance. Likewise, when writing applications, maintainability is key. This philosophy would drive people towards writing Java.

    Only by satisfying some of each of the three constraints can one build a truly Krufty Hack.

    The arrival at Elegance and Beauty (the "deep" goals of programming) is balancing these three disparate directions. The grail, of course, is highly-optimized, highly-flexible, extremely portable, outrageously maintainable, and Beautiful code that actually fulfills a needed function.
    -
    bukra fil mish mish
    -
    Monitor the Web, or Track your site!
  • The most important thing is knowing how a program should be written and the most efficient way of writing it while making it maintainable. Some people have the knowledge naturally, but it's not a thing that can be taught.

    I disagree with the statement that this can't be taught.

    Certainly, some people are naturally more endowed with the kinds of skills it takes to organize, plan and implement solutions to whatever problem they're faced with. And it is probably true that some people are no good at these things and never will be. In this sense, CS skills are no different from cooking, carpentry or cat burglary.

    But, you can't tell me that you really believe people can't learn good programming techniques by seeing them done correctly. Honestly, can you? If this were true, CS skills would be in a class apart from almost everything else except perfect pitch.

    One place to see the correct way is on the job, another is in the bedroom, staring at someone else's code while you should be doing your chemistry homework. But there's no reason this can't be done in an (college) academic setting, too.

    -c

  • above: 1337 5|^33K
    elite speak - the obfuscated typed slang of script kiddies

    title: ur n0t 1337

    You're not elite (1337 speak for talented, knowledgeable, skillful, or just good in general).

    1v 0n133 hAx0rd f0r a m0n74, & 1 Kn0 3vr1t41ng a1r3d1

    I've only hackered [been a script kiddie] for a month, and I know everything already.

    1v g0t a11 t43 1337 5kr1p75 & r00t 0|\| 1075 0\/ 80X35 & z11110n g1gz 0v Pr0N

    I've got all the elite scripts, root on lots of boxes, and [a] zillion gigs [gigabytes] of porn.

    g355 175 d1fr1n7 f0r 1337 d00dz 11K3 |\/|3 !!!!

    [I] guess it's different for elite dudes like me!

    Yes, there are really people who talk and think that way. Think of a 13-year-old kid who has just learned the bare rudiments of programming and goes around seeing how well he can defy authority figures by committing random acts of vandalism and getting unpaid access to things that he isn't supposed to be able to get, and the online peer group he fits into with its own customs and slang. They're not generally bad people, or stupid, just young and ignorant and trying to have fun.
  • Semantics is about the meaning of words. A semantic difference is a difference in expressed meaning, not a superficial difference in expression of a meaning.

    By definition, a goal is something that you hope to achieve, not something inherently open-ended and unachievable. To use "goal" as you would "value" is an assault on precise communication in the English language, an effort to make synonyms of two words with distinct meanings in a language already overflowing with useless synonyms.

    You might as well say, "I have a turnip! A turnip that all men will be judged by the potato of their character, not the carrot of their skin." After all, it's only a semantic error, people still "get what you're saying."

    Join me, my fastidious brethren! Save the words! Take the Oath (post it as a reply):

    Oath of the Guardians of the English Language (A.K.A. pedant's oath)
    I, (name here), solemnly swear
    to protect the English language
    from ignoramuses and morons alike;
    to correct improper utterances,
    and poorly written sentences
    (even if it annoys people);
    and to kill, without hesitation,
    anyone who uses "literally"
    to emphasize a metaphor.

  • Yes, I do mean proofs of correctness. But it should be understood that because of the Curry-Howard isomorphism (in short: "programs=proofs", but that's a bit short; I used to have a page on Curry-Howard but it's gone; promise, some day I'll rewrite it, but for the moment that's quite low down on my TODO-list), typing is merely a form of proofs of correctness, but proofs that are checked by the compiler itself. What I want is the ability for the programmer to specify completely general invariance properties on her program (e.g., "all through execution, i will always be a power of two") and to prove it (naturally it is the programmer's task to prove things, otherwise the compiler's job is far too complex), and the compiler will (a)check the proof and (b)possibly optimize using the extra information thus obtained.

    Other than that, I agree that Lisp is one of the very best programming languages in existence. Unfortunately, Scheme (which is the best as far as the core language is concerned, IMHO) is utterly worthless as far as the library is concerned; and Common Lisp (which has a very rich library) isn't nearly as elegant as a Lisp language could be.

  • What I want is the ability for the programmer to specify completely general invariance properties on her program (e.g., "all through execution, i will always be a power of two")

    This specific case is fairly trivial to do in C++. Just define a class with assignment operators that enforce the invariance. You could even make a template class to cover a whole collection of variations. If you were willing to take the memory overhead you could even make a generic class that holds a pointer to an invariant-check function.

    Also, try looking at Eiffel; it has at least some invariant enforcement capacity, though I am not sure if it has exactly what you are looking for.
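    For what it's worth, the C++ suggestion above can be sketched in a few lines. Everything here (the `Checked` class, the `is_power_of_two` predicate) is invented for illustration; it isn't from any real library:

    ```cpp
    #include <cassert>
    #include <iostream>

    // Illustrative only: a predicate plus a wrapper class that re-checks
    // an invariant on every assignment, as the comment above suggests.
    bool is_power_of_two(unsigned x) {
        return x != 0 && (x & (x - 1)) == 0;
    }

    template <bool (*Invariant)(unsigned)>
    class Checked {
        unsigned value_;
    public:
        explicit Checked(unsigned v) : value_(v) { assert(Invariant(value_)); }
        Checked& operator=(unsigned v) {
            assert(Invariant(v));  // the invariant is enforced at run time...
            value_ = v;            // ...not proved at compile time
            return *this;
        }
        operator unsigned() const { return value_; }
    };

    int main() {
        Checked<is_power_of_two> i(8);
        i = 16;                    // OK: still a power of two
        std::cout << i << "\n";    // prints 16
        // i = 3;                  // would abort with a failed assertion
        return 0;
    }
    ```

    Note the key difference from the grandparent's proposal, though: the check here fires at run time, whereas a proof-checking compiler would reject the bad assignment before the program ever runs.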
  • So, basically, it boils down to you wanting all programmers to be perfect and for software to auto-magically work. Well, gosh. I know my job as a software developer would be a lot easier if everyone else was perfect.

    There is basically only one way to prevent users from having problems installing new software and hardware -- don't give them options. The only way for this to work is if everyone sticks to a hard set of standards or a closed, proprietary system. This is one reason hardware is plug-and-play on Macintoshes -- you follow Apple's standards, or it's unsupported and probably doesn't work across OS upgrades.

    What do you get then? People bitching about not having options. It's one or the other. You have to choose which.
  • <em>And I've apparently attracted my own division of trolls.</em>

    No, sir, you are the troll. As acknowledged by one of your first rebuttals to one of the first replies, you fully expected to be marked down. That is prima facie evidence that you posted the original post with the full intention of angering people. <b>That</b> is a troll. People who respond back in an angry fashion are flaming you. You are the troll.

    I hope you enjoyed wasting moderator time with your rapid spewing of ill thought out messages. It's sad to see at this deeply nested of a level a final admission that, yes, hardware people can be at fault.

    A reminder: If hardware people are so perfect, why does the Linux kernel have patches to avoid the Pentium F00F bug?
  • The subject says it all.

    If I can help people who use my code out and have a fun time creating it, I'm happy.
  • by CDanek ( 34285 )
    Though it's scary at times, I'm still a firm believer in the fact that the single most important goal that the human race should be working towards is the invention and cultivation of AI. I could get into the scarier details, the science fiction of AI; couple AI with robotics and have some 'sentient cyborgs' who aren't happy with their creators, but that is only one outcome. The outcome I foresee is a reliance and interdependence.. A joining of man and machine over time.

    The first primary goal of an AI should be that of learning. Learning about its environment, about the entities that have provided for it, etc. The second goal should be interaction, the third understanding, the fourth complete thought & problem solving, and the fifth, sentience.

    There is a lot to think about when thinking about AI. What will happen to humans? What will happen to machines? Does this ultimately mean the end of humanity, or human vs robot wars we have so much popular fiction about? Possibly. But this is how evolution works, and besides, aren't you tired of the lemony-fresh life we have now? What ever happened to just surviving? I don't feel like I have a huge purpose aside from making money, but I think with the advent of computing I now have the tools available to me to make a true change in the course of human history.

    cd
  • 1. ENGINEERING IS A SOCIAL SCIENCE.

    Yep, that's right--it's not a hard science like mathematics or physics except in the most facile and shallow way. Every engineer should have a background in human factors as well as scientific ones. Our creations are ultimately used by human beings, and we cannot afford to neglect them.

    2. ENGINEERING IS AN ART FORM.

    Engineering is not simply "creating stuff that works". While possible, it hardly contributes to the human race as much as aesthetic qualities like grace and elegance do. Compare an I.M. Pei or Frank Lloyd Wright building with a Soviet-era fish cannery--which one do you think adds more to the world's culture, designwise?

    3. SOFTWARE ENGINEERING IS A HIDEOUSLY BROKEN FIELD.

    The axiom is that "if carpenters built homes like we build software, the first woodpecker would destroy civilization". No other form of engineering permits the kind of widespread failures and bass-ackwardness which we do. Most Professional Engineers I know (the people who get to add "P.E." after their name) can't stand the thought of programming being elevated to the level of an engineering discipline, because they see us--correctly, for the most part--as Johnny-come-latelies who are concerned only with making a buck and looking cool, not building things which will stand the test of time.

    4. MY GOALS

    My goals are formed by these principles. Number one, my ultimate and overarching ambition is to make software which is helpful to society and individuals. Second only to that, my goal is to make elegance an integral part of my work. Third, I aim to write software which doesn't suck.

    As long as my job permits me to do all those three things, I'm happy. If I write a piece of software which manages to achieve all those things, I'm happy. Right now I'm polishing up Fishtank-1.0.1 for release (a GPLed, elegant, well-documented crypto library; email me if you want to see a pre-release) and I think it's been guided by those principles. I'm happy with it, and that leads to my final, most important, goal:

    5. BE HAPPY.

    :)
  • For me, I enjoy making something from nothing. Programming is a lot like woodcarving in that you start with an idea and see where it takes you. Sys Admin in_and_of_itself can be pretty tedious, especially if you work in an environment where change is a bad_thing. I love trying the newest, best tools and applications available. I love hacking apps. Get everything running, sit back and monitor, spend free time hacking. With each application that I write I learn a little more and learn how to creatively solve programming problems. Time for a Zen moment: in a nutshell, the path is the goal.
  • I found that, as your standard of living increases, your desired standard of living increases along with it. Soon, you need ever more money before you drop out of the corporate world and thus never do.

    In my experience there comes a point when you are happy. Or at least, for me, I'm happy now. I figure it shouldn't take that much more for me to be able to "retire" to the life of writing free software.

    Of course I was born to parents who started off dirt poor--and whose attitude, now that they have money, is that "the bigger your house, the more junk you have to dust." Same with cars: there does come a point where the stress of worrying if someone will side-swipe your car outweighs your enjoyment of having a nice car.

    So I figure the point where I should be able to cover my expenses for the rest of my life is coming up relatively quickly. But of course there is that AS/400 running Linux that I wouldn't mind buying... :-)
  • With the standard template library and mechanisms for generic programming, C++ is a high level language. Go read some Stroustrup (3rd edition, the language has changed a lot!) some time. Larry Wall once said that the usefulness of a programming language is inversely related to the number of axes the creators of the language have to grind.
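    To make that "high level" claim concrete, here is a routine sketch of the generic style the STL enables (nothing here is quoted from Stroustrup; the `sum` template is just an example):

    ```cpp
    #include <iostream>
    #include <iterator>
    #include <list>
    #include <vector>

    // One algorithm, written once, usable with any container whose iterators
    // meet the InputIterator requirements -- the heart of generic programming.
    // The element type is computed from the iterator via iterator_traits.
    template <typename InputIt>
    typename std::iterator_traits<InputIt>::value_type
    sum(InputIt first, InputIt last) {
        typename std::iterator_traits<InputIt>::value_type total = 0;
        for (; first != last; ++first)
            total += *first;
        return total;
    }

    int main() {
        std::vector<int> v = {1, 2, 3, 4};
        std::list<double> l = {0.25, 0.75};
        std::cout << sum(v.begin(), v.end()) << "\n";  // prints 10
        std::cout << sum(l.begin(), l.end()) << "\n";  // prints 1
        return 0;
    }
    ```

    The same `sum` instantiates over `vector<int>` iterators and `list<double>` iterators alike, with the compiler resolving everything statically -- no casts, no virtual dispatch.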
  • My goal is to die having made an even number of sign errors.
  • I like to program interfaces and output, although I don't mind designing data structures.

    I like Perl programming in general, although I am relatively new to it. Web programming is a nice easy thing to do, and you can get great results with very little effort and a good graphics program. I used to like GUI programming, and I might still get back into it, but Java Swing really irked me with the layout stuff.

    I dislike C, not that I can't program in it, but because it just pisses me off. Starting from scratch in C is okay, but as soon as I hit somebody else's code I scream. Maybe it is because I like my code to be easy to follow rather than optimal...?

  • The main focus of the 'industry' (nearly every industry) is to take current technologies and make them work together to augment human decisions and work. I know it sounds like a buzz-sentence (and it is a sentence, a life sentence!) but the fact remains: every business problem can be solved by one proprietary/expensive solution, or by the merger of 2-5 separate, less expensive solutions.

    The real programmer isn't necessarily knowledgeable in all these technologies, but is good at

    Problem solving
    Learning (quickly)
    Grasping large-system concepts quickly
    Discussing the solution with others at their level

    Once you have the basic concepts of programming down, the speed with which you implement the system is often more important than the speed the final system runs at. You may find yourself writing a VB client which interfaces with a Perl cgi module which then communicates with an ecommerce DB. The next day you have to talk with the marketing manager about the best way to keep in touch with sales reps on the internet. After that its off to manufacturing where they need to change their maintenance and reports DB interface.

    You have to know of new technologies so that when you have a problem to solve you know about the tools that are available. A guru C programmer knows of a few other languages and can apply them, but every problem looks like a nail to be fixed with their C hammer. This isn't bad, but in most cases a person who goes to the hardware store every week and peruses the tools available without expending a lot of time in any particular tool will do better and end up with a more easily maintained finished product.

    -Adam

    Education is what you get from reading the fine print.
    Experience is what you get from not reading it.
  • You're being C-centric for no apparent reason.....
  • If you asked a group of authors "What are your writing goals?", then you'd get responses about the novels they have in their heads and wish to write, or what they want to say in their work, or how they want to write something as great as writers they admire. Now replace those writers with Slashdotters and ask them the same question. You'll hear responses like:

    I want to write lots and lots.

    I want to convince people that English is the best language to write in.

    I want to get so intricately involved in grammar and spelling that I never have to really write anything.

    I want to learn everything I can about writing so I can, you know, just write stuff. Anything. It doesn't matter what it is.

    I want to have grammar that's so perfect that no one can find any mistakes in what I write.

    What's strange to me is how programming is looked at as an end in itself and not a medium for creation. I would have expected raves about implementing great ideas. Huh.

  • Hmmmm, let's see....well, I'm not bright enough to be a mathematical hacker, I'm not creative enough to be a UI hacker (though I've read the Human Interface Guidelines from Apple and used several different types of GUI/CLIs, and I have a built-in aesthetic of what seems "right" about an interface), and I'm not knowledgeable enough to be a device driver/kernel hacker. Plus I don't want to spend the next 10 years writing plug-in code for the Office Assistants.....yeech.

    What I'd ideally like to do, once I get some serious skills built up, is be one of those guys who goes around cleaning up old crufty bits of code, rewriting things to make them smoother/faster/less cluttered, updating bits of code that haven't been touched since the original developer lost interest six months ago, that sort of thing. Is there a category/place for people like that? (Other than "pathetic", that is.)

    ObBitterAdvice/FeelSorryForSelfRant: Kids! Thinking about leaving school to get experience in the "real world"? For chrissakes, DON'T! You'll end up trapped in a dismal malaise of low-paying, unthinking jobs with no hope of advancement because you don't have the sheepskin, you'll be shunted away from jobs that might build your skill set and enrich your coding abilities, you'll watch your peak coding years drain away while you struggle to balance work, more work, and the one or two classes you can afford to take a semester (which means you'll have your degree roughly the same time your Social Security runs out) and your brain will rot from disuse into a pasty mush that actually wonders why anybody would use a mail program other than Microsoft Outlook. Save yourselves! Don't end up a bitter, disenchanted tech-support worker at 29! (Er, like this guy I know.)

  • I'd agree with this, but with a few additions. I find that as a system programmer my job is to write code that works well enough that nobody actually notices that there is a lot of hardware that they rely on. Making sure that everything remains linked together and that the users have 365 days of uptime is the most important goal of a systems programmer. Mostly I end up writing utilities that make life easier to manage the systems.
  • I ended up as a systems programmer by accident. I drifted into the field when it was noticed I knew more about what held the systems together than anyone else in the company. A lot of systems programming these days seems to be making jobs easier for non-tech staff. I write a lot of user interface stuff that shows the staff in charge of changing backup tapes exactly when there's an error and what to do about it. I write utilities that automatically let me know when the system is having a problem that needs my attention. The greatest goal I have as a systems programmer is to have a long stretch where I can sit back and do not very much at all apart from keeping up to date on new technology.
  • /me capitulates

    I surrender, your dictionary is obviously bigger than mine.
    Now, go do something productive, like fix all of those Gnome bugs.

    Kintanon
  • Hi. I'm really curious about a few things.

    1) What the hell are you talking about? Linux doesn't care at all what monitor you use. Are you meaning to say 'X windows' is wanting that info or are you meaning that "linux" wants 'my monitor's refresh rate capabilities?'

    2) What distro are you using that does all of this?

    3) Why has no one else asked you this?

    If it pisses you off so much, don't use it.


    My bad, everywhere I said Linux in that post replace it with 'Redhat 6, and X-Windows'.

    Linux works fine, X-windows is a piece of shit.

    And Redhat's installer is crap.

    I apologize for not being clearer.

    Kintanon
  • One could argue that they want the software to default to 1600x1200@80 and let the user change it to 640x480@60! No matter what you default to, someone's not going to be satisfied with it. So, instead of guessing what a good default should be, I think it's better to ask the user for information to determine what he or she wants!



    Umm, the default should always be the most likely to work. It is more likely that 100% of the monitors around will run properly at 640x480 than at 1600x1200. That would be good engineering, which I suppose is something you programmers don't go in for....

    How is the hardware designed badly? It does PRECISELY what it was designed to do. There is poorly designed hardware, it needs to be redone as well. But last time I checked my CRT didn't suddenly decide it was pissed at my keyboard and refuse to work.

    Kintanon
  • Heh, you have to royally fuck up to toast a monitor that fast....
    Extra 0 on one of those numbers somewhere?

    Kintanon
  • Hardware that does PRECISELY what it was designed to do and hardware that is badly designed are not mutually exclusive. Why shouldn't a video card be able to determine a monitor's capabilities directly? Why put the user into the loop when a communications link between the video card and the monitor already exists?

    Sure, have the monitor send a brief signal burst every few seconds until it gets a return message from the vid card that acknowledges the monitor's existence and stats. Why isn't that implemented in hardware? Damned if I know. Seems simple enough. But ATM I lack the facilities to do it myself (open source hardware is just a LITTLE BIT more difficult than software), so I have to wait until I can get a decent EE lab or some company listens to me.

    Kintanon
  • Do yourself a favor and switch to SuSE or Mandrake. SuSE has some sort of deal with Xfree86 so their X looks good. Both they and Mandrake use KDE, which a lot of people hate, but it looks good. With Redhat, even if you get the X stuff set up properly it just never looks quite perfect.

    Of course, I'm just spoiled from years of Winduhs, Macs, BeOS, NeXT cubes, etc.



    Too late, I gave up on X-windows. I just use the cmd line for everything.

    Kintanon
  • Part of the reason a lot of software sucks is because almost every time some new hardware company gets a large crazy idea, everything needs to be rewritten or hacked up to keep working (which is why we are still stuck with x86, etc.--hacking up and rewriting would have been too expensive/time consuming even though there is better hardware to run on out there). Video cards are the *perfect* example of this problem.



    I will definitely agree with this. Hardware people are just as likely to go running off after 'Newest, Greatest, Spiffiest' when they should also be concentrating on making the stuff we have now WORK.

    But the original topic was programmers. So that was the subject of my post.
    And BOY did I piss a lot of people off. That was fun, 4 mod points used to keep my original post exactly where it was, 20 or so posts under mine...
    And I've apparently attracted my own division of trolls. >:)

    Kintanon
  • s/bigger/more often used/



    You sure you want to go that route?
    The obvious implication being that you find it more often necessary to delve into your dictionary. My posts are all 100% dictionary free, posted while working, and frequently while only half paying attention. Given those circumstances I'd say my error rate is pretty low.
    Yours should be nonexistent given your apparent dearth of free time.

    Kintanon
  • But I got news for you, pal: hardware ain't the holy grail you think it to be. I got lots of dead hardware sitting around that died for no apparent reason.

    Well, wait, actually that's not true.

    It did die for an apparent reason: the quality of the hardware sucked.



    Did you pay any attention to that little sticker on the back of the device that tells you under what conditions it's rated to operate? No? How surprising.

    Oh, and don't dis my apostrophe usage until you can properly use the contraction 'ain't' in a sentence.

    Now, you give me your dead hardware and I'll tell you why it died. And I ain't just makin' that up.

    (Friendly advice, Ain't is a contraction of 'Am not')

    Kintanon
  • But there is not, and never has been, any way for a VGA/SVGA monitor to report to the video card what brand it is or what sweep values it can accept without melting into a puddle of slag; ergo, X asks YOU. If you don't tell it otherwise, then, sure, you'll be looking at 640x480, the lowest common denominator of screen resolution settings.


    There isn't YET. You forgot that very important word. The hardware fix for that is relatively simple conceptually, a bit harder in implementation though.
    What I have a big problem with is that when X asks me 'What resolution does your monitor run at' and I tell it 1024x768@60 and it comes back and tells me I'm wrong.
    Ok, so which is it? Does X know WTF my monitor is doing or not?

    Kintanon
  • The newest XFree (XFree86 V4.0) does access this information by setting up a virtual real mode processor, and calling the video card BIOS. However, the XFree guys haven't yet tied this into the modeline code to allow X to read the monitor's data & set itself up automatically. However, this is coming soon.


    So, as I had imagined previously, it's back to being a programmer problem. The hardware exists and works just fine. But the software is faulty. So, one of you hot shot programmers go make it work.

    Kintanon
  • Then the community as a whole shouldn't be bragging about how superior their software is to commercial, closed source software unless the piece of software in question ACTUALLY is.
    The open source ideology might be far superior, but that means bugger all if the code doesn't work.

    Kintanon
  • You've come across the bane of programmers all over the world. You're expecting some sort of psychic interface to the computer. Some sort of consciousness that lets it do what you WANT not what you SAY. You'd like the computer to have some sort of eyeballs and brain stem to recognise the box that you're showing it "see, computer? This is my new video card! Make it work!"

    I'm not quite that ambitious. I just want a system that works. If I tell it something about my hardware it should believe me. If I want to run two programs at the same time I should not get an 'Illegal Operation' or GPF. I realize that there is an infinite combination of software running on an infinite combination of hardware to test for, but sometimes it seems like if you leave a program running, all by itself, for 15 minutes, the thing will explode. And that's anything from little open source widgets to $600+ software packages. There needs to be more emphasis on quality, working, functional, robust software.

    Kintanon
  • Can you imagine the fireworks if you hooked up a small 14" VGA monitor to a futuristic 4096x3072 video card, set for 150 Hz of vertical refresh? If the monitor's hardware wasn't damaged, I'd think whatever analog buffers on the VGA inputs at least wouldn't be able to handle that kind of necessary bandwidth.



    Most systems have safeguards for that built in already. If you do manage to set your system to a higher res@refresh than it can handle, it tends to fuzz out (I used to do this all the time back with Win3.1) and go absolutely bonkers. Easy solution: turn the monitor off and change the settings back blind. Or just drop to DOS and do some .ini editing. Yes, if you just let it sit there and run it will damage the monitor, but I'd rather fritz my monitor for 10 seconds than be stuck with a system that refuses to work properly because I'm using a monitor it doesn't recognize....

    Kintanon
  • So it knows what info to send to the video card. You can take the most generic settings; personally I don't want my 21" monitor displaying 640x480 at 60 Hz, but hey, that's me. Is there a GUI that doesn't need to know the specs of a monitor to perform optimally?
    The use of four letter words in no way helps make your point.
    score 3 insightful, Bah! Humbug!


    Heh, I fully expected to be marked down as flamebait....

    Oh, and DOS 6 running Win 3.1 worked JUST FINE without knowing what my monitor was.
    And if it's going to ask me what my monitor is, then it should believe me when I tell it the answer. I've had Linux tell me I gave it the wrong monitor info; if it knows enough to tell me the info I gave it is wrong, why doesn't it give itself the RIGHT info? It's silly.
    Why doesn't it default to 640x480@60 and let me tell it to run at 1600x1200@80? That would be the smart thing to do, instead of dicking around asking me questions it doesn't like the answers to.
    Also, why does it get confused when there are 2 video cards in the machine? Just use the one the monitor is plugged into. Apparently it can tell the difference since it knows enough about my system to tell me my monitor info is wrong....

    I just want the software to work. Isn't that the entire idea behind open source? Stuff that works?
    My hardware works, it performs predictably, why doesn't my software? Come on programmers, get off of your asses and do things RIGHT.

    Kintanon
  • monitor sends a brief signal burst every few seconds until it gets a return message from the vid card
    that acknowledges the monitor's existence and stats. Why isn't that implemented in hardware?

    It is implemented in hardware: it's called DDC, for Display Data Channel, and any modern (plug&play) monitor supports this, as does any modern card. However, the video card makers made each card access this in a different way, and the code to access this is in the video card BIOS and runs only in x86 real mode. Not good when you are running x86 protected mode (Linux/x86) or on something else (Alpha, PPC, MIPS, ....)


    The newest XFree (XFree86 V4.0) does access this information by setting up a virtual real mode processor and calling the video card BIOS. However, the XFree guys haven't yet tied this into the modeline code to allow X to read the monitor's data & set itself up automatically. That support is coming soon, though.

  • Well, I work for a company that does a lot of embedded systems programming. Here's a list of what I do:
    • Software architecture design: what data goes where
    • Device drivers: VxWorks-based hard realtime drivers
    • DSP: filters, symbol recovery, vocoders
    • UI: TCL/TK and X (in an embedded system, I might add)
    • Control systems: servo loops, PID control
    • Networking: TCP/IP - servers, clients, NIC drivers
    • Graphics: 2D mostly, things like oscilloscopes and spectrum analyzers
    • Radio: controlling receivers and generators
    • Communications systems: radio protocols, infrastructure
    • Hardware debugging
    • Configuration control and management
    • Interviewing potential new hires

    This is in Wichita, KS. If you are interested, reply and I'll tell you how to contact our HR dept.
  • I once knew an EE who claimed that his hardware didn't work correctly because of the presence of fluorescent lighting.

    Are you kidding? Fluorescent lights are a HUGE factor in low-noise systems, especially with measurements. Standard practice, in many low-noise environments, is to turn off all lights during noise measurements. Many times these experiments are done in huge faraday cages, with no digital electronics inside, and all AC power sources highly conditioned.

    I don't know what hardware your EE was blaming the lighting on, but on some systems it really does matter.

  • "The difference between Theory and Practice, is greater in practice than it is in theory"

    seriously though, if you want to get into OS development, take the hard, low-level bit-basher courses, the parallel and distributed systems courses and LOTS of theory and algorithms electives.

    there are a lot of options out there besides M$, IBM, and RH. remember, 95% of all computers are embedded, and about half of those require an OS of some sort. in the embedded market, no single company owns more than about 5% (vxworks is about the biggest).

    there are lots of companies that have their own OS ( including the company I work for ), and they all need talented engineers ( ditto ) (send me your resume)

  • I would LOVE to do kernel hacking or daemon writing or device driver writing for a living (not just on the side) but WHERE ARE THOSE JOBS? I can't find any in a city I want to live in. I DON'T want to live on the east coast, west coast, or in Chicago. Everywhere else, all the development going on is just business app and science app writing. I'd love to do more low level coding, but nobody I can find around here is DOING that. *sigh* If any of you have any ideas... let me know.
  • Actually, your mainframe comment was spot on - I have just clocked up one year at my first job out of university, which is working on mainframe system software. There _is_ a lot of money to be made, yet I'm starting a new job in three weeks.

    Why? For (some) of the reasons mentioned above: the mainframe area _is_ on the decline, though it will take a long time to disappear, and as this is my first job I don't want to be "trapped" into one over-specific category. I have learnt an absolute truckload about being a coder, and about work in general, but I feel like I have learnt a large slice of any "general" knowledge I can gain from here.

    And after learning all that, I decided I wanted something else. For now.

    Hope it turns out I guess ;)

    Cheers,
    SuperG

    P.S. MVS (OS/390) yeah done that. Have you heard of VSE/ESA though? FEAR
  • by bluGill ( 862 ) on Tuesday May 23, 2000 @09:03AM (#1053118)

    My goal is to write code that doesn't break. I prefer not to do medical or aircraft work, where someone dies if it breaks, focusing instead on systems that would simply cause another depression if they break. (Which probably kills more people in the long run.)

    I tried writing user interfaces. I hated that work. I work with folks who love that work. I like flipping bits and watching them cross several busses and appear (inverted) at a different processor. I love yanking out processors that are running my code, and having my code automatically move to a different processor. To me that is cool; to others around me it is just a lot of work.

    In other words, we are all different. I'm no better or worse than my peers. We work differently, and like different areas. I can do the work that others around me do, but they like it and I don't, and vice versa.

    You end up having to try things to decide if you like them. Some folks work on KDE, others the userland of FreeBSD, others the Linux kernel. Given enough time one person could do all of the above to the same quality, but they would hate some tasks and love others. And of course as time goes on you grow, so a task you hated before you might like now, or something you used to love now just bores you to tears.

    Don't be inflexible. Someday the inflexible will find that nobody cares about the awesome system he wrote and he can no longer get a job. Because he is unwilling to learn something new, the inflexible person is lost. (The mainframe is a classic example; it is slowly dying. Not because it is bad - though we have learned much about computer design since it was made - but because it is out of style. Do you really want to be on a sinking ship, without knowing how to use the life boat? There is a lot of money in mainframes today, but you better know something else.)

    Don't be too flexible. Sometimes you have to say "I can do this, but it isn't right." Or maybe you are so flexible that you get a new job every few months. You need to see a project through to completion, see what customers really think. (Open source isn't such a problem here, but in business you cannot contribute to two different companies at a time.)

    I like where I am now, and what I'm doing. I could have - perhaps should have - taken an opportunity to transfer to a different department. Only time will tell what would have been best.

  • by Kaa ( 21510 ) on Tuesday May 23, 2000 @09:43AM (#1053119) Homepage
    Why is it that in the year 2000 we're still writing our operating systems and most of our programs in C (or C++, which, although messy, is not truly high-level)?

    Because these are decent languages for the job?

    [grins, ducks, and runs...]

    Why is it that garbage-collection has never truly come out of the closet?

    Because you lose some control over execution of your code, and in a lot of situations that's not desirable at all (real-time systems is a classic example).

    Why is it that Java is compiled in byte-code rather than native code - and why is it so slow?

    The native code of which processor? If you recall, "write once, debu^H^H^H^Hrun anywhere" was Java's big selling point and an explicit design goal. In any case, native Java compilers are starting to appear.

    As to the slowness -- that's exactly because Java is a "more" high-level language. TANSTAAFL.

    Why is it that no programming language that I know of is able to handle program invariants and proofs that are any bit more complicated than ("i is an integer") (nothing like "i is a power of 2" for example).

    Well, you can always write a C++ class with properly defined assignment operators, etc. which will enforce "i is a power of two". Quite trivial, really.
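    Kaa's suggestion can be sketched concretely. The following is a hedged illustration only; the class name PowerOfTwo and the choice of std::invalid_argument are mine, not from any library:

    ```cpp
    #include <stdexcept>

    // A wrapper type whose constructor and assignment enforce the
    // "value is a power of two" invariant at runtime.
    class PowerOfTwo {
    public:
        explicit PowerOfTwo(unsigned v) { set(v); }
        PowerOfTwo& operator=(unsigned v) { set(v); return *this; }
        operator unsigned() const { return value_; }
    private:
        void set(unsigned v) {
            // A nonzero v is a power of two iff exactly one bit is set,
            // i.e. v & (v - 1) clears the lowest set bit and leaves zero.
            if (v == 0 || (v & (v - 1)) != 0)
                throw std::invalid_argument("not a power of two");
            value_ = v;
        }
        unsigned value_;
    };
    ```

    Any attempt to store 12 in a PowerOfTwo throws immediately, so the invariant can never be silently violated - at the cost of a runtime check on every assignment, which is exactly the speed trade-off discussed below.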

    On the other hand, you've been complaining that Java is slow... I don't think you'll like the speed of any high-level language.

    Why is it that anyone who wishes to program anything still has to spend half of his time writing things like if ( retval == -1 ) { perror ("frobnicating"); exit (EXIT_FAILURE); }?

    First, even in C there are such things (evil, I acknowledge) as macros. Second, sometimes you may want to frobnicate and sometimes you may want to discombobulate. Sometime you will just exit, and other times you'll try to recover gracefully. It all depends and that's why we spend time writing code like this.
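    For the curious, the (evil) macro approach Kaa concedes might look like this sketch; CHECK_SYS is an illustrative name, not a standard macro:

    ```cpp
    #include <cstdio>
    #include <cstdlib>

    // Wrap the repetitive "check for -1, perror, exit" pattern once,
    // so each call site shrinks to a single line.
    #define CHECK_SYS(call, msg)            \
        do {                                \
            if ((call) == -1) {             \
                std::perror(msg);           \
                std::exit(EXIT_FAILURE);    \
            }                               \
        } while (0)
    ```

    A call site then reads CHECK_SYS(close(fd), "frobnicating"); - though, as the comment notes, this only helps when "print and die" really is the recovery policy you want at that point.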

    Why is it that buffer overruns still exist?

    As usual, it's a trade-off. You can run your programs with run-time memory checking (very, very useful during debugging). Memory checking, though, like almost everything else, imposes an overhead. Sometimes it's OK, sometimes it's not.

    In any case, if you are programming in C++ and using a decent library, you should have very few memory access problems. Writing in C, in the usual C style, is another matter entirely.
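    That run-time checking trade-off is visible even within one language: in C++, std::vector offers both an unchecked operator[] and a bounds-checked at() that throws instead of overrunning. A small sketch (the helper function is illustrative, not a library API):

    ```cpp
    #include <vector>
    #include <stdexcept>

    // at() pays for a bounds test on every access; operator[] does not.
    // Here the checked form turns a would-be buffer overrun into a
    // recoverable failure instead of silent memory corruption.
    bool checked_read(const std::vector<int>& v, std::size_t i, int* out) {
        try {
            *out = v.at(i);   // bounds-checked access
            return true;
        } catch (const std::out_of_range&) {
            return false;     // out-of-bounds index caught cleanly
        }
    }
    ```

    Hot inner loops often stick with operator[] for speed and rely on debugging tools to catch overruns, which is the "sometimes it's OK, sometimes it's not" overhead mentioned above.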

    To summarize: TANSTAAFL.

    Kaa
  • so that I can afford to spend my time writing free software, while keeping a roof over my head and a car in my garage and food on my table.

    Ultimately I want to spend my time providing cross-platform tools so that the same bit of code can be compiled on the Mac, Windows and Linux and run correctly. But at the moment, the project has to languish so that I can write a web-based appointment scheduling system for a fellow whose having fantasies of getting rich in the New Economy...
  • by AppyPappy ( 64817 ) on Tuesday May 23, 2000 @09:11AM (#1053121)
    Systems Pukes - Arrogant bit-twiddlers who sit around doing work no one understands but everyone needs. Occasionally, they purposely break the system and put it back together to prove their worth.

    Application Programmers - smug resume monkeys who make prostitutes look like nuns. Most of their day is spent looking for another job. The rest of the time they are avoiding meetings and hunting free food. They will leave a company for $1 more an hour but will remain loyal to a department for an $8 pizza.

    Analysts - Programmers who can't code. Their functions include stating the obvious and avoiding responsibility. The chief aim of the analyst is to create a system without a spec, thereby rendering them blameless.

    Project Leaders - the dumping ground of IT. Drooling pirates who will swing into meetings and lash each other with promises to the amusement of management. Then they will stagger back to the staff and announce some impossible deadline that was born to be pushed out to sea. The sworn enemies of programmers everywhere. See also "Euthanasia"

    DBA's - Anal-retentive spellcheckers whose chief purpose is to slow work to a crawl.
  • I gave up the option of programming for a games company to go do a CS degree. It was a bad mistake for me, as the degree started out good but drifted away from what was relevant and cutting edge, and in the end I came out behind where I would have been.

    The most important thing is knowing how a program should be written and the most efficient way of writing it while making it maintainable. Some people have that knowledge naturally, but it's not a thing that can be taught.

    When I'm hiring people I hire based on how quickly they can catch onto ideas and implement them, not on how many degrees they have. One of the best programmers I have came straight out of high school.
  • by taniwha ( 70410 ) on Tuesday May 23, 2000 @09:09AM (#1053123) Homepage Journal
    My goals have changed over the years - I spent much of the 80s doing Unix kernel hacking .... and the 90s designing chips (these days logic design is really programming - honest) - with side trips into compiler design along the way - the 00s? Who knows - probably more programming, I think - I've decided it's more rewarding on a day to day basis.

    The important thing is not to dig yourself a big hole by becoming too specialised - become a generalist - be adaptable - then you can follow your interests as they develop. There will be cool, interesting technology to play with along the way, and if you become too specialised you will eventually find yourself stuck doing the same thing over and over and over again.

    The other thing that I think is a useful (non-programming) skill: become a 'self-starter', i.e. don't need a lot of management to get your work done - let your enthusiasm drive your work - you'll probably get ahead a lot faster and you'll be able to work in smaller, more cutting-edge companies.

  • by G27 Radio ( 78394 ) on Tuesday May 23, 2000 @09:02AM (#1053124)

    For me programming is a break from reality. Naturally having a goal in mind makes it more interesting, but in general I just do it for the hell of it. I mess with all sorts of languages but rarely take the time to become proficient with any particular language--unless it's simple enough to learn before I find something else interesting to learn. While I'm programming it's as if I've left my body and I've entered another dimension. I don't hear people talking, see the monitor, or feel my fingers on the keys. I'm just somewhere else entirely.

    numb
  • My programming goal is to finish the Do Stuff(c) Functions and Libraries.
    A program that will write your programs for you by simply declaring the dostuff function. Basically:
    #include "dostuff.h" //or your language's equivalent
    void main(){ //again your language's equivalent
    dostuff(); //ditto
    }
    Voila! You are done. My only pitfall has been actually writing it. I designed it to write itself recursively.. but so far my hardware doesn't want to comply.. And yes, I wrote the original Segfault story on this, so it _is_ original even if you have read it there before.
  • by jbarnett ( 127033 ) on Tuesday May 23, 2000 @09:04AM (#1053126) Homepage
    The more you know, the more questions you ask.

  • by cford ( 141147 ) on Tuesday May 23, 2000 @09:26AM (#1053127)
    Outstanding expression of your goals. Expressed elegantly in clean, aesthetically-pleasing, efficient statements. In short, I suspect this is the whole ball of wax when it comes to programming. I'm brand new at it, and you've quite eloquently mapped out precisely how I feel about programming, and why I enjoy it. I think I'll hang on to this as a reminder for those frustrating days when I wonder why I chose this field. ( which are few & far between )
  • by Mr_Huber ( 160160 ) on Tuesday May 23, 2000 @09:29AM (#1053128) Homepage
    I've been programming professionally for about five years now. In that time, I've worked for small companies, large companies, evil empires, egomaniacal doctors, aerospace dinosaurs and finally, my own consulting company. In that time, I've coded C, C++, HTML, Perl, even a little Java. I've programmed desktop computers, handhelds, embedded systems and Boeing 777s. My style has moved from ugly hacks to organized hacks to (hopefully) organized programs. Over time, I've developed a single guiding principle.

    Make it simple, easy to understand and maintainable.

    Do not make it cool. Do not show me how well you know C++, Perl or Java. Do not make it subtle. Do not make it fast. Do not make it cute. These things hurt code.

    Make it simple to understand and work on. Make it look like a Model T engine. Comment everything. Refactor for simplicity whenever possible. Speed will come as things develop. If it is too slow, profile the code and fix the slow bits. Follow these steps and the system will be easy to maintain. Customer-ordered changes can be quickly and painlessly accommodated. And, when your contract ends and you have more interesting work to do, you can hand the code to anyone and they can maintain it. Job security sucks, it chains you to one spot, ends your career and keeps you at work away from your wife.

    Mr Huber
    Ex Building 17 Orange Badger


    • Kraaash your enemies...
      see dem driffen before you...
      und hear de lamentations uf dere wimmin...
  • by WillAffleck ( 42386 ) on Tuesday May 23, 2000 @09:04AM (#1053130)
    Actually, my main goal in going into Systems Analysis/Programming was to only have to relearn half of what I knew every two years, instead of all of what I knew (when I did hardware).

    I try not to get too hung up on OSes or programming languages - they're just tools we use to provide frameworks for solutions. Each has its quirks.

    To contribute to society - that's one of my goals. I used to want to code the best game simulations - somehow, that went out the window. Once you've got a bunch of money, that ceases to be such a big deal - so skip that as a motive.

    To design a system that, while not the most efficient or fastest, allows one to get one's job done in an elegant and robust fashion - that's what I like doing. I may not make the best wheel, but my wheels allow you to change the tires while driving and use bigger wheels with different treads. I've found my code being used by other people more than ten years later - because it just keeps working. Elegance, simplicity, robustness.

    The rest is all carp. A thousand years from now, all the things you think are important will be, at best, a joke.

  • by TheDullBlade ( 28998 ) on Tuesday May 23, 2000 @09:19AM (#1053131)

    I don't have goals in programming, except for specific projects. Goals are limiting, and only suited to the short term in such a dynamic field.

    I have values: it is good to become a better programmer, it is good to learn more mathematics (the better to create accurate computer models of problems), it is good to learn more about how people interact with machines and what they want from them, it is good to learn more about my own mind and how I learn and how I work... I could go on for pages describing what I value.

    The important thing is that general "goals as a programmer" are not worth having. Whole fields may become irrelevant by the time you master them, so you must always be learning. The programmer must be a generalist in principle, even though he spends a year or twelve with his main focus on one particular area, because any specialized field will eventually be made trivial by an advance in the state of the art (and the obvious reasons to delve into a particular area are that there's work to be done, you find the work interesting or profitable, and you think you can do it). Today's divisions into "interface design", "systems programming", "language design", "web scripting", "graphics engine programming", "hardware design", etc. (let alone narrow subdivisions like "Windows programmer" or "Playstation developer") are all arbitrary categories describing certain temporary conditions that must be dealt with. Twenty years from now, there will be a whole new set of programming tasks (though some of the names might be the same, the problems will be entirely different ones), and you'll probably still be programming. Prepare for it, by preparing for anything.

    'Intelligence is life's response of "I can do anything!" to a universe that threatens with everything.'

    -Frank Herbert in "Destination: Void" (quoted from memory; not exact)

  • by David A. Madore ( 30444 ) on Tuesday May 23, 2000 @09:12AM (#1053132) Homepage

    My goal would be to make high-level languages (functional if possible) really usable - and really used.

    Why is it that in the year 2000 we're still writing our operating systems and most of our programs in C (or C++, which, although messy, is not truly high-level)? Why is it that garbage-collection has never truly come out of the closet? Why is it that Java is compiled in byte-code rather than native code - and why is it so slow? Why is it that no programming language that I know of is able to handle program invariants and proofs that are any bit more complicated than ("i is an integer") (nothing like "i is a power of 2" for example). Why is it that anyone who wishes to program anything still has to spend half of his time writing things like if ( retval == -1 ) { perror ("frobnicating"); exit (EXIT_FAILURE); }? Why is it that buffer overruns still exist?

  • My goal as a programmer is to continually increase my fluency in

    1. Dividing complex tasks into manageable pieces
    2. Expressing those pieces elegantly in clean, aesthetically-pleasing, efficient code that expresses both the problem and its solution
    3. Using the new gained knowledge to increase my understanding of the world's problem sets so I get better at step 1

    I hope to get there by three main methods:

    1. Write a lot of code; write, use, trash, re-write better
    2. Study the masters; read Usenet; ignore the flames and trolls and learn from the gurus
    3. Keep a humble and respectful attitude

    I've been at it for 20 years, and am still learning every day. I get just as many thrills from it all today as I did when I started.


    TomatoMan

THEGODDESSOFTHENETHASTWISTINGFINGERSANDHERVOICEISLIKEAJAVELININTHENIGHTDUDE
