Is Microprocessor/Controller Design Dead?

blanchae asks: "I work for a Canadian post-secondary institute, and I have been scouring the web job sites, newspapers, and newsgroups for career ads for microprocessor/controller-based electronic designers at the technology level (2 years of training). We are re-evaluating our curriculum and are looking at the job market as one way of justifying adding specialized training to existing programs. There are lots of career ads for embedded controller designers with university degrees, but not a thing for technology-level microprocessor/controller design. It is very puzzling. So the question is: Is microprocessor/controller design dead? Has it moved offshore? Is it off the radar and mainly in small startup companies (5 to 25 employees) that hire by word of mouth and not through the big corporate media methods?"
  • by bcat24 ( 914105 ) on Thursday June 22, 2006 @11:58PM (#15587248) Homepage Journal
    Maybe it's only mostly dead. Remember, there's a big difference between mostly dead and all dead. Mostly dead is slightly alive. With all dead, well, there's only one thing you can do....
  • You work for a Canadian post-secondary institute and it is very puzzling and you turn to Slashdot? God help our post-secondary institutes.
  • Umm.... (Score:4, Insightful)

    by Andrew Sterian ( 182 ) <andrewsterian@yahoo.com> on Friday June 23, 2006 @12:01AM (#15587260) Homepage
    Don't you think 2 years is perhaps not enough time for someone to become competent at something as complex as microcontroller design? A 2-year degree is generally associated with technicians/technologists, who are not hired for design work.
    • In Australia, there seem to be many cases of technicians being hired for non-design jobs who then progress to doing some code maintenance, then end up in design if they show an aptitude for it.

      (There's others who've hired programmers and try very hard to keep them away from their embedded designs, since your typical CompSci grad thinks a MB of compiled code is compact!)

      So, I'd suggest equipping your students with the sort of skills that will get them a foot in the door of companies doing embedded design,

      • There's others who've hired programmers and try very hard to keep them away from their embedded designs, since your typical CompSci grad thinks a MB of compiled code is compact!

        It is - 700k of that is symbols (we have sucky STL libs). Actually, I don't sweat 1MB of code because we have 1GB ram, so nothing's hurting.

        • I don't sweat 1MB of code because we have 1GB ram, so nothing's hurting.

          Except your cache -- and your instruction pipeline on the frequent cache misses.
          • Heh, we've got a meg of cache too. Sometimes more. But if there's any coherency at all, caching will work just fine on the relevant part of that 1M of code.
          • Except your cache -- and your instruction pipeline on the frequent cache misses.

            My code doesn't need to be fast - it mostly waits for the database, so I don't care about that either. The major performance factors in my application space are data models, decent algorithms, and database speed, in that order. It'd be different if I were messing with physics processing for games :).

            • Yep, this is what I was talking about!

              I think we mean different things by "embedded"...
              You mean an ETX card with an x86 running at 100s of MHz; I mean an 8- or 16-bit microcontroller with 16k of flash and 4k of RAM in it.

              Here in Oz, at least, MCU code is far more likely to be written by an electronics engineer than a programmer. Mind you, the same engineer probably designed the circuit, built the prototype, wrote the documentation, wrote the Windows interface software, designed the case, swept the floor,

          • Symbols never get into the cache unless you're running a debugger. Code bloat is not a problem for caches, because they cache lines of sequential bytes. Logic bloat is a problem for caches: lots of long branches.
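
            (The classic illustration of that line-granularity point, as a generic C sketch - nothing chip-specific, just the textbook sequential-versus-strided walk over the same data:)

            #include <stdio.h>

            #define N 1024
            static int grid[N][N];

            int main(void)
            {
                long sum = 0;

                /* Row-major walk: consecutive ints share a cache line,
                   so most accesses are hits. */
                for (int r = 0; r < N; r++)
                    for (int c = 0; c < N; c++)
                        sum += grid[r][c];

                /* Column-major walk: each access lands N*sizeof(int)
                   bytes away, touching a new line almost every time. */
                for (int c = 0; c < N; c++)
                    for (int r = 0; r < N; r++)
                        sum += grid[r][c];

                printf("%ld\n", sum);
                return 0;
            }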
        • You haven't tried to mess with an 8051 lately, have you?
      • by Anonymous Coward
        (There's others who've hired programmers and try very hard to keep them away from their embedded designs, since your typical CompSci grad thinks a MB of compiled code is compact!)

        They're not CompSci grads, they're "the new CompSci grads", otherwise known as "Java Engineers". That is, they majored in Bloat, so that's all they know.
      • That sounds a heck of a lot like a Software Engineer (and a crappy one at that). A good CompSci or a good Software Engineer is going to have a better idea of what size code is needed on this thing.

        A MB of compiled code is compact...if you're working on an Athlon 64. It's far from compact when you're working on an 8051. That said, it might be that those doing the hiring need to be looking at Computer Engineers, not CompScis or Software Engineers. At my school, the CompScis only take two hardware classes (
    • Re:Umm.... (Score:5, Informative)

      by Anne Thwacks ( 531696 ) on Friday June 23, 2006 @04:15AM (#15588037)
      Correct ... I am hiring someone for microcontroller work this week (UK, not Canada), and after looking through CVs, I consider people with six years of experience to be relative novices, with only limited contact with many of the important issues.

      To do microcontroller work, you need to know EVERYTHING from how to create thread-safe code with no memory management to how to implement mathematical functions found in the second/third year of a maths degree using UNSIGNED arithmetic. You need to know how hardware behaves when it's faulty, and you need to know whether the compiler is faulty, or the hardware, or your code is defective - and get it RIGHT.
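
      For a taste of what that unsigned-arithmetic point means in practice, here is a minimal sketch in generic C99 - the Q8.8 format and the names are hypothetical, chosen purely for illustration - of the kind of fixed-point multiply you end up writing when the part has no FPU:

      #include <stdint.h>

      /* Q8.8 unsigned fixed point: value = raw / 256.
         Hypothetical format, for illustration only. */
      typedef uint16_t q8_8_t;

      /* Multiply two Q8.8 values. Widen to 32 bits so the intermediate
         product cannot overflow, then shift the extra 8 fractional bits
         back out. */
      static q8_8_t q8_8_mul(q8_8_t a, q8_8_t b)
      {
          uint32_t wide = (uint32_t)a * (uint32_t)b;
          return (q8_8_t)(wide >> 8);
      }

      /* Example: 1.5 (raw 384) * 2.25 (raw 576) = 3.375 (raw 864). */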

      And in most cases you probably need good client facing skills, the ability to work bizarre hours, and a willingness to put up with shitty conditions of employment.

      The reward for all this is the chance to laugh at those better dressed and housed than you, in their posh cars with their trophy wives, knowing that it's thanks to you their engine management system has left them stranded on the highway in a $60,000 car!

      • Requiring the ability to work bizarre hours strongly indicates incompetence in program planning and program management. If you did the planning and management properly, you wouldn't HAVE to be working bizarre hours. Read DeMarco's "The Deadline", and think about it.

        Requiring a willingness to endure shitty conditions of employment is even worse. Companies that do this invariably find themselves functioning as unpaid training departments for their competitors.

        And you leave out a very important point: You
  • No. Actually, the microprocessor or controller design business has at least 20 years to survive according to Gordon Moore's projection (Google Moore's Law) and ITRS (http://public.itrs.net) projections. This is true for technology-level designers too. You are probably mistaken: large companies do not seek only 4-year graduates. You must try Intel, TI, IBM, etc. You will see that the business still needs lots of people. I heard that there are 2-year technology graduates at TI from a fri
    • microprocessor or controller design business has at least 20 years to survive according to Gordon Moore's projection

      Well, don't forget which company he founded.... I feel his "law" is nothing more than a mission statement that managers and investors took as their basis for evaluating the industry. Because they (managers/investors) believe in it, it will either happen or Intel will be punished (e.g., has group X maintained the law? Okay, they get a check on their annual progress report). Otherwise it
      • That statement of yours is completely wrong, based on simple logic. Gordon Moore may have founded Intel, but this, at least in the academic world, has nothing to do with scaling factors over time. Actually, what he predicted was what the technology, or companies so to speak, could do without SPENDING MUCH EFFORT.... This is the key point here. If you can scale the transistors without needing too much concern for low-power techniques or interconnect wires, and this was the case for the last 30 yea
        • Yes, but are you saying that his early management/efforts (technical & nontech) didn't help sustain his prediction?

          (How did he benchmark himself?)

          And even after others took over his duties, I don't see why they would be eager to break the "tradition" nor argue why such a departure should be accepted.

          Perhaps they did sustain his prediction with little effort. But I'm arguing that he could have made a somewhat different prediction and it would have been sustained in the same way.
          • The thing that is constantly misleading you is that you are not fully aware of what Moore's Law is about. I respect your comments and arguments, but sometimes factual accuracy is more important than pure logic. I am going to try to explain some basics of digital design and the semiconductor business, and I am pretty sure that your major is not electronics. I am by no means trying to degrade your knowledge or rationale, only trying to clarify some points. Therefore please do not get offended. My
  • not dead ... (Score:5, Insightful)

    by neomage86 ( 690331 ) on Friday June 23, 2006 @12:34AM (#15587363)
    just far too hard for anyone with a two-year degree (and for most people with bachelor's degrees)

    At the bare minimum, to be able to design even a relatively simple chip you need the following classes:
    1.5 years physics (mechanics, E&M/waves, and quantum)
    3 years math (calc 1, calc 2, multivariable calc, diff eq, linear algebra, stats)
    3 years electronics (intro to electronics, digital logic, basic design i.e. intro to HDL, analog signal processing, solid state devices, advanced design)
    1 year CS (CS I/II)

    Anyone capable of covering that much material, in addition to general school requirements, in two years destroyed their college admission exams and already has a good scholarship to a 4 year school (where they can get the degree in 2 years if they really want).
    • Re:not dead ... (Score:5, Interesting)

      by Omega Hacker ( 6676 ) <omegaNO@SPAMomegacs.net> on Friday June 23, 2006 @01:51AM (#15587646)
      Um, I'm pretty sure the OP is talking about using existing microcontrollers (e.g. PIC, AVR, lesser ARMs, etc.) in projects, not designing new processors...

      I've done commercial projects of such a nature myself, with only a tiny bit of formal training. Such things are trivially within the grasp of a 2-year degree holder with appropriate training.
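
      For the curious, the flavor of work in question looks more like this than like chip design - a minimal AVR sketch, assuming avr-libc, an LED on PB0, and F_CPU defined for the delay routines (the board details are hypothetical):

      #include <avr/io.h>
      #include <util/delay.h>  /* needs F_CPU, e.g. -DF_CPU=8000000UL */

      int main(void)
      {
          DDRB |= _BV(DDB0);           /* make PB0 an output */
          for (;;) {
              PORTB ^= _BV(PB0);       /* toggle the LED */
              _delay_ms(500);          /* crude half-second delay */
          }
      }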

    • by Anonymous Coward
      "At the bare minimum, to be able design even a relatively simple chip you need the following classes: "

      You two are talking about two different levels (the OP mentions controllers). A computer engineer is the one designing the microprocessor.* Then there's the person who takes both ICs and discrete components and lays them out onto a substrate (a PCB, usually).

      *several someones, actually.

      As far as microcontrollers? Not dead by any means.
    • What college admission exams? I know Caltech offers an exam for transfers (IIRC). However, most schools in the US don't have admission exams. Are you in Europe? The SATs only need a little high school math and relatively simple verbal skills. Also, most schools ignore SAT scores for transfers. I'm really not trying to be picky. I'm just curious.
  • Attention: (Score:3, Funny)

    by Anonymous Coward on Friday June 23, 2006 @12:40AM (#15587385)
    DeVry lied to you about your earning potential.
  • A college does not teach enough theory to do anything useful in embedded electronics. A 2-year trade program doesn't have the time, and the students don't have the aptitude, to cover the needed algorithms; if they did, they'd go to university.

    • Correct. I'm doing software design for embedded systems. I have a Masters in electrical engineering, and I eat my colleagues with Bachelors for lunch. The thing is, they can't think in systems. For them, good code is what does the thing it has to do. For me, good code is what (A) does what it has to do, (B) in the least possible time, (C) until the end of the world, and (D) does not create bigger problems than the ones it solves. No wonder most of the bachelor guys get lost in a year or two because of too much stress. If th
      • "For them good code is what does the thing it has to do."

        I would take that as axiomatic (overlooking the poor wording).

        "For me good code is what A, does it has to do B, in the least possible time, C until the end of the world, and D, does not make bigger problems, as the ones it solves."

        Criterion B may violate criterion A in some real-time applications. Criterion C is impossible to meet. If criterion A is met and criterion D is violated, it must be a requirements problem.
        • We both know that what you're doing here is nitpicking, so I won't respond to all your comments, just the last one. Yeah, it's mostly a requirements problem. You see, the requirements are mostly written by marketing people, or people with very little overview of the task involved. To get the device accepted by the regulatory people, and to get the filled-out check from the customer, there are sometimes trade-offs involved, and last-minute requirements changes. You can say that you don't implement something, because
    • This is only partially correct. Two years isn't quite long enough, but you don't have to go to university. I went to college for a 4-year technologist course and found I had better hands-on experience and one-on-one training with the professors compared to university students I met on co-op terms. University may set you up better to teach, but not as well to jump directly into the job market.
      • I went to college for a 4-year technologist course and found I had better hands-on experience and one-on-one training with the professors compared to university students I met on co-op terms. University may set you up better to teach, but not as well to jump directly into the job market.

        You do know those co-op students are working as part of their education, right? At Waterloo the first co-op term comes after 4 or 8 months of school, so you're comparing your 4-year diploma to someone 8 months into a 4
        • Interesting site and strange logic. Years ago, my first year in college, I took a philosophy class to cover a liberal arts credit. It was an interesting class. I'm glad I took it. I do wish I could remember it better.

          Quite a bit of the class talked about artificial intelligence in computers and learning systems. One of the examples used a simple algorithm that your link reminds me of. I just wish I could remember it. I'll have to dig out the class book now and look it up. Anyway, the example was showing h
          • Correct, just interesting how it came to the conclusion.

            The site is a lot of fun to play with because of things like this. It's always right and usually in a totally non-obvious and silly way. It happens because of incomplete data, like not knowing that Birds or Turkeys are animals. It will never be complete but its database is growing.
    • Oh, I don't know about that. If you throw out all the idiotic liberal arts requirements and some of the irrelevant 'core' classes, you're left with a 2-year program. If you have strong math skills and generally understand DC circuits, it's not that difficult. I work with an 8-bit PIC as part of the project I'm working on. I do all of the programming, and we have an engineer that we contract with to do the primary design. I don't need to be able to design the circuits from scratch, but I do need to unders
  • in everything from phones, gaming consoles, bluetooth mice/keyboards, etc...

    It's just that everyone and their brother runs a Java.net.OOP/PHP webshop, and the signal-to-noise ratio of REAL jobs is too low.

    Tom
  • in like microprocessors or something, and in industrial electronics, and in general electronics. It's been so long since I took those courses. I wish I hadn't wasted all that time on the high-level courses. They don't transfer to any universities. The certificate is completely useless. I've never seen a job posting for someone with a two-year degree in any kind of electronics. It sucks. I have to work inside sales while I take 30959 years to finish my bachelor's.
  • by Anonymous Coward
    Based on your description, you are thinking of jobs designing boards using microcontrollers, right?

    In this case, I would say try small companies that deal in things such as military, medical, or even elevator manufacturing.

    However, if you are talking about designing chips, with a 2-year degree you are better off teaching mask design. This is because it doesn't require a lot of training; it's mostly tedious work. Currently Intel, AMD, or even VIA, just to name a few, will hire people with mask design experience. Howev
  • They tend to pad requirements if they think they'll get hundreds of responses; they'll still hire the BCIT and Ryerson grads. Most embedded devices are composed of some I/O, a processor, and programmable logic. Despite the other posts here, you really don't need differential calculus to design I/O devices, or quantum physics to make an iPod.
    • by ADRA ( 37398 ) on Friday June 23, 2006 @02:10AM (#15587704)
      Ah, sweet BCIT. I worked on embedded systems development (not exactly chip design, mind you), and I dealt with the challenge adequately. Another friend, same school, same company, ended up being one of their most proficient developers. Just because you don't get the entirety of the education to be entry 'qualified' doesn't mean you're incapable of ever picking it up. Given the chance, many can perform quite well above their current educational level.

      With that said, I think doing it with 2 years of training would be a challenging task. Then again, they don't necessarily hire entry-level chip developers. They could start you off with more remedial jobs and make you work in-house a while.

      The question really is, are there -any- chip manufacturers still around in the Americas? Well, there seem to be a few big houses still around, and if you're really questioning whether to keep the program or not, why not ask these companies personally? Maybe you could even arrange career seminars with soon-to-be graduates?
      • PMC-Sierra designs their own chips. SFU's Engineering school has an intro course to chip design, but it basically covers putting a couple of gates onto silicon. Teaching theory isn't the same thing as teaching technology or good design practice.
      • There's a lot of IC development done here in North America (and even here in Vancouver). Admittedly a lot of the fabrication and the rest is done overseas for cost and, to be honest, poor environmental regulations. Chips are quite a concoction of nasty shit.

        For some reason lots of people also forget FPGA development. At SFU we got into FPGA and ISA/PCI design in our second year (although not anymore, since they started babying the curriculum to "double the opportunity"). We still do our VLSI class in 4th year thou
  • by xenocide2 ( 231786 ) on Friday June 23, 2006 @01:53AM (#15587650) Homepage
    Firstly, you're looking in exactly the wrong order. If you were looking for a research assistant, would you ask a student you know, or would you put an ad in the local college paper? Hopefully, you'd choose a student you know. You have a good idea of who they are, their work ethic, etc. And you won't have to somehow sort through the flood of applications you'd receive from a newspaper posting. So a newspaper is exactly the last place to look for most jobs, unless the employer is hoping to find the cheapest among several qualified applicants. I don't believe Intel actually places ads in papers; some places prefer that you take an interest in their company and seek them out instead. If you want to match your curriculum to employers' needs, I'd hope your "institute" has a few industry connections, since this is often a good avenue to your students actually getting a job. These are the people you need to talk to.

    Secondly, 2 years of training to design microprocessors? What exactly would they be doing that only takes two years to go from a high school education to mastery sufficient to be productive? Programming microcontroller devices, maybe. Designing them in today's market takes knowledge of what's been done in the past, and of ways one might improve on it. The industry is simply too competitive to accept the kinds of mistakes and inefficiencies a novice would make when multiplied by a large-scale production run. A 4-year degree is a good start, nothing more. Many of the largest chip design places have internal education to address academic curriculum shortcomings. These would also be good people to talk to.

    Finally, what do you think qualifies as a distinction between a microcontroller and an embedded system? I'd say not much. 386s are being used more often now, in places where DOS or Linux can do far more than a PIC traditionally does. And if you're seeing so many postings for embedded systems, remember that a number of these projects are likely for US military applications, and non-US citizens, like Canadians, are usually unemployable in that field as a security precaution. If this still seems fruitful, why not adjust your curriculum to match the demand you see right now?
  • One route for you to take might be to build cool hacks, put them on a blog, and promote them via Make.com and similar sites.

    Build a reputation and contacts that way, and it might turn into job offers.
    • Make Magazine was so disappointing. Their "cool hacks" were almost exclusively stuff like "I got this thingamabob called a PIC controller, which somehow controls an output signal based on the input. I don't know how it works, but that's not important right now".

      It's the "American Chopper" of technology - if this is the mindset and attention span of our future engineers, then nothing can forestall the apocalypse. I for one, welcome our new Chinese overlords.
  • Maybe you're looking in the wrong place.

    I advise you to:

    • Check out the semi-governmental research institutes
    • Not limit yourself to your own country
  • by blanchae ( 965013 ) on Friday June 23, 2006 @02:16AM (#15587723) Homepage
    First, thanks for all the comments (even the negative ones). I meant board-level design or circuit design, not chip design. I know that chip design is beyond a 2-year technology program, and so are embedded systems like the ARM.

    In response to other postings, we do have industry contacts, but you must appreciate that when an educational institute comes knocking asking for information, the priority on answering is way down the list of things to do, somewhere around emptying the garbage can...

    I agree that word of mouth is a common method of finding suitable employees; it's what I did when I was looking to hire employees back when I was in industry. The issue is how to track "word of mouth" career offerings.

    Slashdot is on the pulse of technology and seems like a quick and dirty method of acquiring data. How much value is put on the data is dependent on its quality and quantity.

    • The way to track word-of-mouth openings is to go to the local trade shows. In Vancouver you can find events posted on BCtechnology.com. Another method of feedback is to contact your alumni and ask what courses they 'should have' had more info on. I find it sad that people on Slashdot do not know what kind of courses are taught at the Institutes of Technology.
    • In response to other postings, we do have industry contacts, but you must appreciate that when an educational institute comes knocking asking for information, the priority on answering is way down the list of things to do, somewhere around emptying the garbage can...

      Well, frankly, you have very poor relations with your corporate partners and need to work on this. I work in academia and can tell you that any of our faculty, or myself, that picks up the phone and calls one of our vendor contacts will have an answer
    • We do plenty of board-level design using microcontrollers, FPGAs, DSPs, and internally-designed ASICs.

      But... we don't hire 2-year degrees for design positions. Most university graduates we hire have GPAs of 3.8 or better and still start out with a year or two in applications engineering before they transition to R&D, or sales, or marketing, or manufacturing. (It's a good place to work.)
    • Oh, I thought you meant chip design too.

      I doubt it's dead. In fact, I suspect the market for board-level design is huge given the variety of products and design tools out there.

      It's something I'm interested in too, but I suspect a college degree would teach me less than 2 full years with copper-clad boards, design tools, sample board companies, and tonnes of CPU samples from all around (and their programmers).

      Hopefully the curriculum you design is hands-on enough.
    • As a former board designer, I can tell you someone with a 2-year degree will not be getting hired as a board designer fresh out of school by all but a very few places. Frankly, they are not going to have the combination of skills needed. Add in some good industry experience and some personal initiative and they might have a shot. A good board designer will need logic design, low-level programming (assembly and/or C, probably), and transmission line and power analysis skills. I really don't see packing al
  • by GrpA ( 691294 ) on Friday June 23, 2006 @02:39AM (#15587793)
    It's not difficult to work out what happened. I started out on this career path long ago, straight out of high school. Back then, I was designing Z-80 based computer systems... Later, I went on to MCS-48 and MCS-51 based designs, as well as flirting with x86 and 68xx(x) architectures at times.

    I was pretty good at it. My success ratio exceeded 90% throughout my career. And I was a one-man engineering lab... From design (including PLDs) through fabrication, prototyping and production following successful prototyping. Many of my successful projects were valued in the millions of dollars to the companies I worked for, back in the 80's.

    But there are not many people with those sorts of skills, so over time, employers couldn't get the skills cheaply and stopped advertising for those people. They turned to PCs to perform jobs that would normally be performed on micros, or found other ways to do things. Most design work became an offshoot of in-house production teams and never really became a critical business component.

    And when someone did have an engineer with those skills, they tended to undervalue them. I worked for many employers as part of their churn. I replaced a cheap engineer, and they weren't prepared to pay extra for the skills I brought to the position. Not all employers can see the value of someone who can design a modem out of three 20-pin PALs or produce an engine management unit, if all they want is someone to design their latest pinpad.

    The lack of people who could cut perfect code in assembly language and manually route circuits more efficiently than the auto-routing algorithms of the day became less important as circuit design apps got better and processors got fast enough that high-level languages could be used instead of low-level ones.

    So more people came into the industry, but lacked the skills. Employers worked around it by asking less of them, but that diluted the products, and so in turn diluted the value of such engineers to their employers.

    Universities and technical education centres simply couldn't produce the skills in people coming up. This further diluted the available skills resources.

    And no matter that you can get away with this 95% of the time, the other 5% of the time you need the low-level skills. Otherwise your success ratio tends to drop below 50%.

    It got to the point where the average wage earner made around $40K per year, and electronic engineers in my city (a major capital city) would average about $35K.

    So I usually left after a while, chasing salary increases with other companies when the ones I worked for didn't want to pay... until one day I realised I could make more for my family just by doing basic low-level tech work as a PC assembler. So I threw away my old skills and became another tech on the production line.

    Everyone else I knew - people who had designed their own home PCs from the chips up only ten years earlier - did the same... they became miners, postmen, builders. The work was less stressful, fewer hours, and paid better. Some stayed. The lucky ones found companies that looked after them. It was rare.

    But now, with only the diluted skills left in the marketplace, employers had a problem. I would speak to old employers who seemed surprised that their new projects were failing and no engineers were left. They wondered why it took a 386 processor and six weeks of C development to develop and debug a $300-per-unit replacement for a keypad that I had designed for them, from concept to prototype, using $30 of parts and an MCS-51, just six years earlier.

    So business got out of that industry too. No engineering skills means it's not a viable business. So they got into PC software development or similar related industries and just dropped that line of revenue from their business model.

    So: no new low-level skills, no engineers with the skills available to fix the problem, and no positions, because companies let this source of revenue die out.

    And the industry disappeared.
    • by Jasin Natael ( 14968 ) on Friday June 23, 2006 @07:21AM (#15588464)

      I wonder about this, and worry a little bit. These companies certainly need people with those skills, so... would society benefit from a return to some form of indentured servitude? Perhaps if companies had protected their image over the past 30 years, instead of letting hotshot MBAs slit their cash cows' throats and ride them into the ground, screwing all their customers in the process, then the stable companies could be trusted to provide a lifelong career for someone who chooses to learn these skills. I think that in the current environment, there aren't enough jobs to entice someone to get the necessary training. Turnover in skilled disciplines -- from both the employer's and employee's sides -- is way too high to justify the kind of dedication it takes to learn to do these things well.

      It would be nice to have enough faith in the long-term plans of a company that, e.g., when IBM or Ford Motor Group needs someone who can do this, an employee could be sent to school for 2-4 years with a reasonable expectation of some long-term benefits.

      Lately, it seems like you need 10-15 years of experience just to be an asset rather than a liability in some fields. So why would an employer hire a college graduate for a reasonable salary, when the chances are next-to-nothing that this person will work for them long enough to contribute to the company? And who will guarantee that some new MBA won't fire him for some stupid reason? I once lost a job because some middle-manager decided that being "late" to work was defined as punching in more than 3 minutes after your scheduled time, and if you were late more than 8 times a year, you should be fired.

      Back to the core of the topic: it's the question of Freddy Fastfingers, the coder who can churn out functional code super-fast, but for every hour of his work, the company invests 2-3 hours of manpower fixing, explaining, or otherwise ameliorating the effects of solvable flaws in his code. Does he even deserve to have a job? Probably not. The question is, is it reasonable for the company to nurse his career for 10 years until he's learned his way around his field, or should they find a way to do his job with less-skilled labor, using tools that (while overpriced and underperforming) aren't filled with amateurish, glaring bugs?

      Employers can't trust Employees to stick with their company, and Employees can't trust Employers not to fire them. It's a vicious cycle, and it's destroyed much of what made this country a leader in high technology in the first place.

    • As a university student in the UK having just finished my first year of electronic engineering, and thinking about specialising in processor architecture myself, I find your post both immensely insightful and immensely frightening.

      Would you by any chance have any added insight to offer for someone in my position?
    • Thanks for taking the time to post that excellent history of the profession. It's folks like you that keep me reading here.
    • GrpA, there are still some small co's doing interesting things in niche areas. Interested? Shoot me some email!
  • Yes, it is. (Score:5, Informative)

    by Vo0k ( 760020 ) on Friday June 23, 2006 @02:51AM (#15587822) Journal
    I've tasted quite a bit of microcontroller-based embedded design, and from what I saw, only amateurs, and only the most clueless of them, use a separate processor and controller. In the past it made sense. Nowadays the market is saturated with microcontrollers that carry enormous amounts of extra hardware on-chip, and an hour with the soldering iron spent on including a dedicated controller chip in the project can easily be avoided by an hour of browsing the catalogues for a derivative that has that controller on-chip. Price increases are often negligible. Speeds are amazing.

    www.fairchildsemi.com/products/micro/ - SOIC-8 package, the size of an optocoupler - 8 pins; the thingy would fit on the nail of your pinky, whole, with surface-mount pins. 64 bytes of RAM, 1-2K of program EPROM, 64 bytes of data EPROM, clocks, power monitoring, wake-up on any pin, 6 GPIO lines, EEPROM writing, watchdog, serial output generator, sleep mode, idle mode, oscillator, and quite a few other goodies.

    On the other end of the scale: http://www.maxim-ic.com/quick_view2.cfm/qv_pk/4535 [maxim-ic.com] : 75MHz, 64M addressable, Ethernet, 1-Wire, SPI, CAN, 3x RS232, 8x bidirectional 8-bit GPIO, an IP stack plus UDP, TCP, DHCP, ICMP, TFTP, and IGMP in ROM, Wake-on-LAN, watchdog, clocks, and God knows what more.

    Add to that DSPs, which are quite specific but achieve speeds higher than the newest Pentiums and Athlons at their tasks (and often carry some "extra"), add a PC for heavyweight number-crunching and user interaction, and you see:

    Separate controllers are dead. A microcontroller is way better because it allows for just the same on the hardware side, while vastly simplifying the interface side. With your current knowledge you should catch up and learn microcontroller-based design pretty fast.
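
    To make the on-chip-peripherals point concrete: the serial controller that once lived in its own package is now a handful of register writes. A minimal sketch, assuming an AVR from the ATmega48/88/168 family and an 8 MHz clock (both illustrative assumptions, not tied to the parts above):

    #include <avr/io.h>
    #include <stdint.h>

    /* Bring up the on-chip UART at 9600 baud, 8N1.
       UBRR = F_CPU/(16*baud) - 1; with F_CPU = 8 MHz that is 51. */
    static void uart_init(void)
    {
        UBRR0H = 0;
        UBRR0L = 51;
        UCSR0B = _BV(TXEN0) | _BV(RXEN0);    /* enable transmit and receive */
        UCSR0C = _BV(UCSZ01) | _BV(UCSZ00);  /* 8 data bits, 1 stop bit */
    }

    static void uart_putc(uint8_t c)
    {
        while (!(UCSR0A & _BV(UDRE0)))       /* wait for empty data register */
            ;
        UDR0 = c;
    }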
  • by Dark Coder ( 66759 ) on Friday June 23, 2006 @05:44AM (#15588259)
    There are basically three categories of microprocessor design. Armed with an electronics degree, one can decide which area to focus on:

    1. CUSTOM BOARD INTEGRATOR
    2. PROGRAMMERS (both HW and software)
    3. THE MICROPROCESSOR DESIGNER

    All areas entail different stages (and thus different skills). They basically cover requirements, design, coding, integration, testing, and maintenance. It is entirely possible to have a lifelong career within just one of those stages, particularly test and maintenance.

    Even so, each area utilizes a different skill set.

    1. Lowest man on the totem pole (but still well-paid) is the custom board integrator. This involves research and selection of hardware components using interchangeable interfaces (i.e., PCI, PCI-X, LVDS, Rocket I/O, VME, and lesser-known interfaces such as USB, FireWire, parallel, and serial). Testing each HW component (so you don't get bad capacitors) is a non-trivial effort. Most low-budget companies skimp on this component testing. Nevertheless, it entails building up around THE microprocessor.

    2. Midway is the programmer: VHDL, Synoptic, and many other custom hardware programming languages, which tend to be chipset-specific. The bulk of the job market is in this category.

    3. The elite is THE microprocessor designer. Intel, AMD, IBM, Motorola, Hitachi, Fujitsu, Xilinx, and many others make use of M-Designers. Most of them tend to be cultivated from within each company. Much research material has to be digested and assimilated to be able to design at today's complexity. The best and easiest break into this arena is a startup company, successful or not.

    I suggest, for a startup university department, you shoot for #2 as the majority of your curriculum. This ensures that these skill sets are transportable to either #1 or #3, depending on how well they grasp the elementary logic.
    • What about ASICs (Application-Specific ICs)?

      It ain't designing the next Power chip, but I know a few ASIC and other firms that spend their days taking problems and burping out custom solutions for car manufacturers, electronic gadget firms, avionics, etc.

      And just a couple years ago I kept hearing about the analog/digital ASIC hybrid market. Is that at all interesting?

      And then there are those 'brave new world' things like smart dust and nanomachines. This one tends to use skills from chip *manufacturing*, n
  • Not dead yet! (Score:5, Informative)

    by Peter Simpson ( 112887 ) on Friday June 23, 2006 @05:46AM (#15588267)

    [finally, something on Slashdot that I can comment intelligently about]

    I work for a small (6 EE, 10 ME, 20 industrial designers) design firm. Small and large companies come to us for all kinds of design work; some of it is development or improvement of EE designs.

    We're always doing some sort of microprocessor/controller design, as well as CPLD and FPGA programmable logic. Pretty much every job we do incorporates one or more micros. In the past three years, I've used PICs, TI's MSP430, Freescale's MC9S12, Atmels, and probably a couple more. Development is done on PCs, running something like a Metrowerks IDE. Sometimes we use an embedded OS like uC/OS, sometimes not.

    At least from where I sit, microprocessors are still very much relevant. I'm currently working on an embedded controller for a mechanical system -- two motors, limit switches, temperature sensors, and two serial communication ports to other controllers not built by us. There's analog and digital interface design, the micro (a Freescale 9S12), power supply regulators, and more. Lots of fun!
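
    The overall shape of that kind of firmware is usually a simple superloop. A generic sketch - the I/O helpers here are hypothetical stand-ins for the real register-level drivers, not the actual project code:

    #include <stdint.h>

    extern uint8_t read_limit_switches(void);   /* bit 0: upper, bit 1: lower */
    extern void    set_motor(int id, int dir);  /* dir: -1, 0, +1 */
    extern void    kick_watchdog(void);

    int main(void)
    {
        for (;;) {
            uint8_t limits = read_limit_switches();

            if (limits & 0x01)          /* upper limit hit: stop motor 0 */
                set_motor(0, 0);
            if (limits & 0x02)          /* lower limit hit: stop motor 1 */
                set_motor(1, 0);

            kick_watchdog();            /* keep the watchdog from resetting us */
        }
    }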

  • There are literally billions of 8- and 16-bit microcontrollers shipped each year, so there must be jobs out there somewhere! Granted, it only takes one designer to make a product that ships a hundred thousand units - and the companies doing that probably won't hire a two-year certificate guy to design it. But there's also a lot of small-run and custom work, done by smaller engineering shops.

    Go ask your question on the PIClist mailing list; you'll get a lot more coherent answers than the ones here.
  • by Not Invented Here ( 30411 ) on Friday June 23, 2006 @10:13AM (#15589236)
    A lot of these chips are selling, so somebody must be buying them. Have you tried putting your question to the local offices of the chip companies?
  • I went to a Canadian post-secondary institute where we were taught how to design and build a microcontroller system (we used the 68HC11). I haven't seen any jobs advertised since then where I could get a job doing microcontroller design; to be fair, I haven't looked specifically for that, though. I got my current job through word of mouth, and we may be doing some micro work coming up for a project. The one reason I really liked our micro courses was that instead of being told that all pieces are accessed th
  • still important (Score:2, Interesting)

    by fortunatus ( 445210 )
    There is A LOT of embedded processor work out there. I work for a company that is constantly looking for programmers who are accustomed to working with embedded processors. I personally would be glad to hire a talented 2-year tech cert level person as a programmer on my systems.

    Here are some thoughts:

    1) Companies advertise for 4-year engineers and higher; they simply don't see the need to advertise for 2-year certificate level applicants. You need to train your people in networking to get around that. al

  • Micro design (Score:2, Informative)

    by thoriphes ( 984506 )
    I'm a recent college grad, and I took the embedded course they offered here as my main design course. We worked with the Atmel AT91 (ARM) using eCos as the embedded OS. In the course we not only worked on the microprocessor architecture, but also interfaced it with hardware that we designed and built ourselves (not just your run-of-the-mill "read the ambient temperature and do something about it" projects, but stuff like building signal boosters, RF controllers, Ethernet controllers, etc). Needless to say, it wa
  • Microcontroller-based systems are definitely popular these days, so it's clearly not dead. Pick up a DigiKey catalogue and look at just how many different microcontrollers they stock and sell in unit quantities; somebody's got to be using them. And you are finding postings for jobs requiring a university degree, which means there are people working at that level.

    I'd never heard of a tech school program specializing in microcontrollers before, and there's definitely a substantial amount of amateur work in th
  • I'm general manager for an embedded design house.

    Dead? What? In a day and age when everything around you has a uC of some sort in it? Now, are a lot of those consumer products being designed in Elbonia? Of course, but still, there's a lot going on. This year I've worked on projects using small (say, PIC-like) microcontrollers in:
    house arrest system
    sports watch
    in store kiosk (touch screen controller, the brain of the kiosk is an embedded x8
  • I'm currently seeking that kind of job.

    What I've seen is that the circle of embedded coders/designers is small. So if you are a new graduate, there will be a lot of senior people wanting to get the same job as you. So you can never start :-/ or you have to find the place where the senior was working. There are plenty of good jobs, but you need to find 'em, because they don't get posted on job websites for long.

    I'm from Montreal, if you're looking for a microcontroller designer/coder! I'm interested!

    -I'm an ing

  • There aren't a lot of companies producing their own hardware anymore. Long gone are the days of DEC, Sun, HP, SGI, IBM, and others producing their own microprocessors and hardware. With consolidation comes fewer jobs.

    I have noticed that many embedded projects I've worked on leverage FPGAs quite heavily. While companies typically prefer to purchase premade cores, there's still a ton of integration work... memory controllers, DSP functions, and co-processors.
    • Can OS X run on the AMD platform? In 2005, Steve Jobs said that OS X had been processor-independent and cross-platform in design from day one. I wonder whether OS X can run on AMD processors. If yes, we can DIY a Mac. Is it possible?
