AMD Lures IBM Veteran to Lead Chip Design 127

Rob writes "Computer Business Review is reporting that Advanced Micro Devices yesterday said it had hired Jeff VerHeul away from IBM to lead the direction of AMD's future silicon design. VerHeul's most recent post during his 25-year stint at IBM was head of engineering and technology services. Now, he will lead the development of all future AMD computing products, including silicon roadmap design across all AMD's engineering sites worldwide."
This discussion has been archived. No new comments can be posted.

  • by Saven Marek ( 739395 ) on Friday August 19, 2005 @06:54AM (#13354154)
    Who else is waiting for the next slashdot story

    "ex-IBM Engineer sued for violating non compete agreement"
    • I haven't actually checked which specific site VerHeul was previously employed at, but given that both Intel's and AMD's headquarters are located in California, he probably doesn't have anything to worry about.

      My question is: how long are the rest of us going to have to wait before a ban on non-compete clauses filters out to the other 49 states?
    • The headline I've been waiting for is "IBM to buy AMD". Then AMD wouldn't have to worry about joint development deals any more, and IBM would suddenly have a world class x86 chip to sell - and their size, reputation, and chip capacity would probably get the AMD64 some serious market share. Now IBM has someone they trust over at AMD who has the background to evaluate the sensibility of such a deal. All they'd need to do is have lunch six months down the road, and ask the magic question - a simple yes or no c
      • Re:Wrong headline (Score:3, Insightful)

        by Knetzar ( 698216 )
        Why would IBM want x86? They no longer make PCs and they want to be able to get both Intel and AMD processors for their non-power based servers.
        • IBM is trying to make itself into a services-first company, and over the last 10 years has been selling off any commodity hardware manufacturing: first disk drives, then PowerPC, now laptops. Servers may be next, although who would buy them is unclear. Dell could probably find the money if they really wanted to buy IBM's server line, but there would be product-line overlap. Mainframes IBM will always make, but they are not made in the USA anyhow.
          • If they were to sell their server/mainframe line to anyone, I would think it would be Unisys, but honestly I don't see IBM selling it since it fits into the whole IBM solution business model.
          • They sold the PC division because it wasn't making a profit. Right now the remaining major divisions (software, services, and servers) are making a profit, so I think they'll hold on to them. If things change down the road and they start losing money in the server business, then I can see them getting completely out of hardware, but until then, I really think they'll hold onto those big iron machines.
    • This was likely done with IBM's approval. AMD and IBM recently extended their chip development cooperation agreement. I'd sooner expect IBM to buy AMD than sue them but even that's not all that likely.
    • I'll wait for the dupe.
    • "ex-IBM Engineer sued for violating non compete agreement"

      Very unlikely, IMHO. I've worked for IBM for the last 10 years, and I've seen firsthand how IBM handles these sorts of situations -- with kid gloves. Although IBM employees sign an employment contract that includes a non-compete clause, IBM almost never tries to enforce it. For example, I know a former IBM executive who violated his IBM non-compete agreement by going to work for a client, as CEO. That's not at all unusual, of course, though it

      • swilden writes, "and how many companies *aren't* smaller?"

        Now that is an easy question. There are 9 companies that aren't smaller than IBM in the US alone. :)

        In 2005, IBM's Fortune 500 Rank [fortune.com] was 10, meaning 9 companies exceeded them in size. In 2004 they were number 9... they gotta watch out.

        It isn't just IBM that is HUGE. HP, through merger mania is #11 [fortune.com], and tiny little direct sales PC only Dell is even ranked #28 [fortune.com]... Note I say Tiny because Dell had under $50b in revenues, and IBM had almost $100b...
    • For some reason I've read the headline as "AMD lubes ...". Must get eyes checked.
  • by DohnJoe ( 900898 ) on Friday August 19, 2005 @06:55AM (#13354157)
    this must mean that AMD will switch to PowerPC!!!
  • by Kawahee ( 901497 ) on Friday August 19, 2005 @06:58AM (#13354162) Homepage Journal
    Hopefully this will give next-gen AMD chips a fresh design and push them to a significant majority over Intel. I've always personally favoured AMD chips, simply because they're damn good value, and efficient.
    • AMD's designs are already 'fresh.' It looks like a good move, but I bet they could have done MUCH better promoting from within. But it wouldn't have been a news story then...
    • by Anonymous Coward
      Hopefully this will give next-gen AMD chips a fresh design and push them to a significant majority over Intel. I've always personally favoured AMD chips, simply because they're damn good value, and efficient.

      AMD: true dual core -- now.
      Intel: piecemeal dual core

      AMD: mobile 64-bit cpu
      Intel: mobile 32-bit CPU based on the Pentium III (and is not planning to go to 64-bit there)

      Open your eyes, my friend. AMD is already the next generation.
      • Logged-in Master of Bravery has a point.
        The only real reasons AMD doesn't have the bigger market share are:
        1) Intel's been around longer
        2) Intel plays more anti-competitive hardball (see: their pockets are deeper)
        3) Intel's pockets are deeper

        What I find great is that even though Intel's compiler makes AMD code run slower, I still find AMD faster and more reliable than Intel chips.
        -- Proud AthlonXP T-bred 2700+ user, soon to be AMD64 in my ASUS laptop for college (yes, I know I don't *need* it. XP)
    • by MOBE2001 ( 263700 ) on Friday August 19, 2005 @08:13AM (#13354431) Homepage Journal
      Hopefully this will give next-gen AMD chips a fresh design and push them to a significant majority over Intel.

      This will not happen. Intel's marketing prowess is much better than its competition. What would scare Intel (and the others) is a revolutionary new chip that solves a major problem in the industry. Consider that all processor architectures are based on and optimized for the algorithm, a custom started by a guy named Babbage more than 150 years ago. Progress has only been incremental since.

      A really new architecture should abandon the algorithmic model and adopt a non-algorithmic, signal-based synchronous software model. It would revolutionize computing and solve the nastiest problem in the computer industry: software unreliability.

      But we cannot expect big companies like Intel, AMD and IBM to be truly innovative. Their approach is evolutionary, not revolutionary. Hopefully a bright upstart will get the message and make a killing while the behemoths are busy fighting each other for market share. They won't know what hit them until it is too late.

      The message is that there is a solution to the software reliability crisis. The disadvantage is that it will require a radical change in both processor architecture and software construction methodology. But the advantage is too good to ignore: 100% software reliability! Guaranteed!
      • After looking at the COSAed program a couple times and digging deeper, I got a couple things:
        1. Algorithmic processing works, it is standard, it is understood. Things need to be done in order.
        2. I'm a somewhat smart fellow, and I understood about 40% of that COSA OS design. The design is based on smaller chunks of algorithmic code, just like the new Cell processor from IBM. You know, the one powering the PS3 (the Xbox 360 actually uses a related tri-core PowerPC chip, not the Cell).

        Seems to me the silver bullet isn't going to smash through all things in a magic
      • the real silver bullet is of course people seeking quality in their lives, quality in their actions... and quality in their work. If you're just there for the money, who cares, as long as it 'works for you'.

        Also, reliable software is dependent on reliable hardware, so it's not just the software industry that needs to focus on making sure there is enough quality in the jobs they do; it's the responsibility of the people mass-producing hardware too.
      • And if you call now you will also receive an Omniscient Hard Disk Drive Solution(R) from Infinium Labs(TM) which will hold 5 PB of data for eternity absolutely free! This OHDDS can withstand abuse from nuclear attacks, millions of computer viruses and worms, hellfire and brimstone, submersion in server administrator urine, and even volcanic eruptions! Order Now!
      • A really new architecture should abandon the algorithmic model and adopt a non-algorithmic, signal-based synchronous software model. It would revolutionize computing and solve the nastiest problem in the computer industry: software unreliability.

        Of course something that radical would be useless to the desktop/notebook market as a whole. You'd have to start a whole new type of PC to take advantage of it, one that's fundamentally incompatible with everything else out there. Don't get me wrong, I'd love to s
      • Your silver bullet is no real solution. You would still be dealing with algorithms; you are just pushing them somewhere else and calling them by a different name. The signal-based synchronous software model would not (in and of itself) make any improvements to the reliability of software. You state in your paper that hardware flaws are physical flaws rather than design flaws, which is completely untrue. The hardware world has seen more than its fair share of design flaws as well. The reason is that QA
      • Any programming model can be done on a Turing machine, without support from the hardware. There is no need to overcomplicate silicon just to support one programming paradigm over another.
      • That crack rock's gone to your head.

        "... signal-based synchronous software model. It would revolutionize computing and solve the nastiest problem in the computer industry: software unreliability."

        Software would be reliable if we could produce algorithms that are 100% free from unconsidered and unhandled cases. It has nothing to do with the medium of execution, and everything to do with the design. This is why computer engineering needs to be a more popular topic.
      • But we cannot expect big companies like Intel, AMD and IBM to be truly innovative. Their approach is evolutionary, not revolutionary. Hopefully a bright upstart will get the message and make a killing while the behemoths are busy fighting each other for market share. They won't know what hit them until it is too late.

        With a chip-fab nearing $40 billion startup capital, it's hard to imagine anyone from their garages coming up with a competing chip. The days of Tesla and Edison playing in a garage are gone
      • Intel's marketing prowess is much better than its competition

        I think you mean *manufacturing*

        When AMD can pony up 10 billion dollars a year to invest in fab capacity then Intel has something to worry about.
    • by gregorio ( 520049 ) on Friday August 19, 2005 @08:42AM (#13354622)
      I've always personally favoured AMD chips, simply because they're damn good value, and efficient.

      Or maybe because you're the typical geek who hates everything that's big and dominant. Geeks need to love "different" things, made for "special" people or not. Geeks need iPods and Unix computers, because other players and Windows computers are not for special people like you guys.

      If someday AMD beats the crap out of Intel and starts to be the big guy, you might as well start talking about the superiority of Intel products and how it is so unfair that AMD dominates the market. =]

      And my point is...? Well, it's not really smart to be such a big fan of a company/group/etc. I think that we should give our respect to good products, actions and attitudes. Cheerleading for a commercial entity is just pure nonsense. I'm a consumer, I want good products, good actions and good attitudes. The world is about results. It's naive to expect that just because you "like" a group all of their actions are going to fit your views and needs. It's up to their shareholders if AMD is going to succeed in the long term, have giant profits or giant marketshare.

      I'm giving my soul to good results, not for companies, groups or whatever. That's why my current PC holds an AMD processor. Next time I'm buying a computer, I'll just buy whatever is best for me, AMD or not. I'm not "hoping" AMD wins, I'm just hoping the market is filled with good products and plenty of choice.
      • Score 3 insightful? gregorio just went off on a rant that was totally unrelated to the original post:

        >> I've always personally favoured AMD chips, simply because they're damn good value, and efficient.
        > I think that we should give our respect to good products, actions and attitudes.

        Parent never said he was a *fan* of AMD, just that he liked AMD's products. Your entire post, albeit interesting, is a moot point. Which I guess is alright, (+4 interesting, -1 moot)
        I think calling people on fanatical chee
        • I think calling people on fanatical cheerleading of companies that clearly don't deserve it (see: Apple, Microshaft) is probably something we need to call them on

          Right, because things are that black-and-white in the real world, right? Like, everything that Microsoft does is crap and Apple products are made for dumb people?

          but I don't see a problem with cheerleading a company that produces good products, tends to be the "good guy" in relation to their major competitor (see: Intel v. AMD for list of anti-com
      • Geeks need iPods and Unix computers, because other players and Windows computers are not for special people like you guys.

        Au contraire, the iPod has grown far too popular. The true geeks only swim in the iRiver! :D

        - shazow
      • I think that we should give our respect to good products, actions and attitudes.

        Excellent. Everyone agrees on this one.

        Now consider that when the number of competitors in a marketplace decreases, that the remaining businesses don't need to provide as much quality for the price. Ultimately, with a single large dominant player in the marketplace, be it chips, OS, routers, petrol, telephone service or whatever, you end up paying a lot of money for little quality.

        And there are multiple barriers to entry in a

        • Cheerleading may be irrational, but it remains one of the ways to change the marketplace dynamics.

          If irrational fans support niche players like AMD, Apple, Linux, non-Cisco routers and biodiesel fuels, then I know I'll benefit. So I don't complain.
          That doesn't stop us from thinking they're weird, irrational people. I mean, I like monkeys; they're useful (they entertain me at the zoo), even though they're dumb.

          =]
      • Or maybe, just maybe, it's that teeny weeny thing about how AMD has always had the better performance/$, not to mention less heat, smaller size, and a just plain better-designed CPU ever since the Athlon came out over 6 years ago.
      • Or maybe because you're the typical geek who hates everything that's big and dominant.

        ...or not. You end up agreeing with me: I use what's best. That's why I don't use Linux on my home computer for development and personal use, I use Windows XP Pro and MS VS .NET 2003, with Visual Assist X with Internet Explorer 7. My webservers are Linux, Apache 2 and MySQL 4.1. I use AMD and ATi, a MS mouse, as well as other fairly dominant brands for most of my hardware, such as my Pioneer DVD drive.
        • That's why I don't use Linux on my home computer for development and personal use, I use Windows XP Pro and MS VS .NET 2003, with Visual Assist X with Internet Explorer 7. My webservers are Linux, Apache 2 and MySQL 4.1. I use AMD and ATi, a MS mouse, as well as other fairly dominant brands for most of my hardware, such as my Pioneer DVD drive.

          Then I can't see why you "hope" that AMD succeeds or anything like that. You should "hope" that anyone invents good stuff, so you can buy it.

      • Geeks need to love "different" things, made for "special" people or not. Geeks need iPods and Unix computers, because other players and Windows computers are not for special people like you guys.

        Parent modded insightful?? The iPod is the market share leader in portable digital music players by a hefty margin. How does the most popular of a type of device qualify as "different?" Unix is a much more elegant operating system than anything MS has put out to date. Geeks typically have training that make

        • Parent modded insightful?? The iPod is the market share leader in portable digital music players by a hefty margin. How does the most popular of a type of device qualify as "different?"

          You don't need to be truly unique to be "different" (or "special"). There are a lot of groups where being "special" just means being like everyone else in the said group.

          In fact, you don't even need an official group to be "special". Using things perceived as "special" (like a stylish music player) is an important part of

    • ...push them to a significant majority over Intel. I've always personally favoured AMD chips, simply because they're damn good value...

      Of course, if AMD beats out Intel, then we can forget the "good value" part because AMD will also have the pricing power that Intel currently has. Sigh - can't win...

    • I'd rather have both Intel and AMD going at it, instead of either one alone. Monopolies are not kind.
  • IBM (Score:3, Insightful)

    by Renraku ( 518261 ) on Friday August 19, 2005 @06:58AM (#13354163) Homepage
    Don't be hatin' IBM. They've had some really good ideas/innovations in the past and I figured an IBM team member would end up either at AMD or Google.
    • Re:IBM (Score:3, Funny)

      by rholliday ( 754515 )
      Exactly. As I understand it, IBM developed and owns the 64-bit board architecture. Just a small thing.
  • But...why? (Score:4, Insightful)

    by MunchMunch ( 670504 ) on Friday August 19, 2005 @07:00AM (#13354168) Homepage
    I'm certainly not an expert, as I'm sure many replies will point out, but I thought AMD has been out-innovating IBM's PowerPC line for quite some time.

    So isn't this by all signs a step backwards?

    • Re:But...why? (Score:4, Insightful)

      by ucahg ( 898110 ) on Friday August 19, 2005 @07:06AM (#13354180)
      Do you think AMD has been out-innovating IBM because all of IBM's engineers are stupid? Do you think it's the fault of this one man?

      Their strategy is simple: Hire the best they can find.
      • Do you think AMD has been out-innovating IBM because all of IBM's engineers are stupid? Do you think it's the fault of this one man?

        Their strategy is simple: Hire the best they can find.

        Hey now, I understand I didn't say much about my reasons for believing AMD has been out-innovating IBM, but no reason to put little men made of straw in my mouth (and then bash them to their component molecules). As the article states, this guy is responsible for overall direction, and I was assuming that was the topic--

    • Re:But...why? (Score:4, Insightful)

      by ZenShadow ( 101870 ) * on Friday August 19, 2005 @07:06AM (#13354185) Homepage
      PowerPC isn't IBM's only line. How about Cell? If the rumors about Intel's "new direction" prove out, having someone who developed something like the Cell in house could prove to be very fortuitous.

      -S
      • Re:But...why? (Score:5, Interesting)

        by kalidasa ( 577403 ) * on Friday August 19, 2005 @07:14AM (#13354206) Journal
        The Cell's core CPU is a PowerPC processor [arstechnica.com]. And the PPC is a very good chip - the problem is that IBM decided that it should focus on Power5 and Cell, and neglected the G5 (and had some scaling issue, IIRC). The G5 wouldn't sell nearly as many units as Cell does, and the Power5 probably has a high margin (and is for their own server products). Again, IIRC, IBM tried to sell Apple on the Cell (so they could continue to fulfill their obligations to Apple without keeping up the G5), but Apple felt that the Cell wasn't really a good choice for general-purpose computing.
        • Re:But...why? (Score:1, Insightful)

          by Anonymous Coward
          Neglect is a strong word. Basically, IBM wants a volume partner on terms that they agree with. Apple wasn't it; Apple wasn't willing to put up the cash or the numbers for IBM to continue with them. The console market looks to be providing IBM with something that they want. From the looks of the new 970s, they can do everything that anyone else is doing: multicore, low power, etc.

          POWER5, POWER6 and other initiatives are things IBM has to do for IBM, period. The margin is high but it's part of an integ

        • Just for completeness, only the control core in the Cell is PPC-based (and it looks to be a totally new PPC design at that, being in-order and all). The other 8 cores are a new design that has nothing to do with the PPC.

          Besides, the Cell could be built with the control core running any architecture. It's the headaches with the basic technology that I believe AMD would be interested in -- imagine one of those suckers with an Ath64 control core? :-)

          Personally, I want a chip with an Opteron 8xx-based con
    • by Mobile Unit of the G ( 862058 ) on Friday August 19, 2005 @07:19AM (#13354223)
      Athlon wins the prize for brute CPU power, but the real strength of PowerPC is that IBM can design custom chips based on combining PowerPC cores with additional processing elements. This technology is behind Deep Blue, Blue Gene, the PS3, and the Xbox 360.

      This kind of chip is hard to program for, but can deliver unbeatable performance per dollar, square centimeter and watt when software is codesigned with the hardware.

      AMD and Intel are going in this direction with dual-core, but IBM is already way ahead. For instance, BlueGene is based on a special chip that has two PowerPC cores with an incoherent cache (tricky to program but cheap and fast) and adds an enhanced vector processing unit. IBM is a leader in higher-end SoC solutions (really, anything that gets power from the wall plug instead of a battery). Lower-power applications are using MIPS and ARM cores instead...
      • Don't forget that PowerPC is the low-end, budget version of POWER. POWER5 is shiny. POWER5+ is due out soon. POWER4 had dual-core back when AMD was still pondering whether 64-bit was a good idea.
  • This is slightly off topic but I've heard that AMD chips are supposedly better for gaming than their Intel equivalents.

    Is this marketing hype? User hype? Any truth or unsubstantiated personal anecdotes to confirm or deny?
  • by bigtallmofo ( 695287 ) on Friday August 19, 2005 @07:07AM (#13354186)
    In other news...

    Local Ice Cream Shop Scores Big Hiring Scoop

    Rita's Water Ice yesterday announced it had hired Mary Lopez, 15-year old former ice-cream scooper at Little Shop of Ice Cream. Lopez's career at LSIC consisted of serving drinks, hot dogs and various frozen ice cream and custard products. She will now be responsible for Rita's [...]

  • by Serious Simon ( 701084 ) on Friday August 19, 2005 @07:14AM (#13354204)
    Maybe this IBM veteran is attracted to designing a lead chip, but there will be no market for it in Europe. Chips containing lead will be banned next year due to the RoHS directive...
  • Pah... (Score:5, Funny)

    by gowen ( 141411 ) <gwowen@gmail.com> on Friday August 19, 2005 @07:14AM (#13354211) Homepage Journal
    Let's face it, there hasn't been a major breakthrough in chip design since Lay's produced their first prototype of the "crinkle cut".
  • Good, now AMD can get on that Terminator "Liquid Metal" technology.
  • ...who goes to work where? Keep luring each other; shaking things up is always good. Especially in these times, when I hear that China has double the growth rate of the US. There are more things to do than getting into a bitch-fight all the time. Eventually all those cheap chips will be sold on the Chinese flea market. So keep fightin'...
  • SMT (Score:4, Interesting)

    by MrNemesis ( 587188 ) on Friday August 19, 2005 @08:06AM (#13354402) Homepage Journal
    One thing that has been interesting me lately, after reading a series of AnandTech articles on current and near-future processor tech [anandtech.com], is the possible inclusion of SMT (oft marketed as Hyperthreading by Intel) on AMD cores.

    The article mentions the POWER5 chip and its implementation of SMT, how it behaves with multi-core chips (i.e. how it can devote all threads on one core to a single task, with the other core(s) sharing the workload via SMT), and how it's rather more impressive than the HyperThreading[TM] on Intel P4s, although I'm not a microprocessor guru.

    Whilst I can understand AMD's decision not to put SMT in their current processors, with the recent focus on multi-core and multi-threading I think they'd be foolish not to think about it soon, and (as someone not very up on non-x86 chips) it seems IBM's POWER5 is a good base to emulate. Does anyone have any information on SMT implementations in chips other than POWER, like SPARC and Itanium?
    • Sun's Niagara combines chip multiprocessing (CMP) and SMT [sun.com] to do Chip Multithreading.
    • SMT isn't the straight-up performance gimme it is often marketed to be. In some instances SMT is good, and some aspects of SMT are good most of the time, but SMT introduces issues that can sometimes trash performance. For instance, if you have multiple threads acting in a processor core you suddenly have to worry about register names; this is accomplished either by using register renaming or multiple sets of registers, both solutions adding complexity to the decode time of the chip. Another issue is cach
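To make the register-renaming point above concrete, here is a toy Python sketch (an illustration only, with made-up register counts; real renamers also manage free lists, recover from branch mispredicts, and so on): every write to an architectural register is given a fresh physical register, so write-after-write and write-after-read hazards disappear and only true read-after-write dependencies remain.

```python
# Toy register renaming: map each architectural destination register to a
# fresh physical register so that only true (read-after-write) dependencies
# remain. A sketch of the idea only, not how any real x86 core does it.

def rename(instrs, num_arch_regs=4):
    """instrs: list of (dest, src1, src2) architectural register indices.
    Returns a list of (dest, src1, src2) physical register indices."""
    # Initially, architectural register i lives in physical register i.
    rat = list(range(num_arch_regs))   # register alias table
    next_phys = num_arch_regs
    out = []
    for dest, s1, s2 in instrs:
        p1, p2 = rat[s1], rat[s2]      # read sources via current mapping
        rat[dest] = next_phys          # allocate a fresh physical dest
        out.append((next_phys, p1, p2))
        next_phys += 1
    return out

# Instructions 0 and 2 both write r0 (a WAW hazard); after renaming they
# write distinct physical registers and could issue independently.
prog = [(0, 1, 2),   # r0 = r1 + r2
        (3, 0, 1),   # r3 = r0 + r1   (true dependency on instr 0)
        (0, 2, 2)]   # r0 = r2 + r2   (WAW with instr 0 -- removed)
renamed = rename(prog)
print(renamed)
```

Note how the second instruction still reads physical register 4 (the true dependency survives), while the two writes to r0 now target different physical registers.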
      • Thanks for the post; an awful lot of articles I've read don't go into performance issues like that.

        AFAIK, all modern x86 chips already use register renaming, but would the additional overhead of SMT make that slower?

        Re: the cache issues, it seems to be something that Intel have (maybe) tried to solve by dumping ever-increasing amounts of cache on their chips (although of course they already need generous caches because of their relatively high memory latency). Obviously I'm thinking that the AMD64's co
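As a rough illustration of what SMT (or POWER5-style hardware multithreading) buys, here is a toy Python model of a single issue slot shared by hardware threads: whenever one thread stalls on a cache miss, the other thread can keep issuing. All the numbers (miss rate, stall length) are made up for illustration; this sketches the latency-hiding idea, not any real core.

```python
# Toy model of SMT latency hiding (illustrative numbers only):
# one issue slot, instructions take 1 cycle, but every `miss_every`-th
# instruction of a thread stalls for `miss_cycles` on a cache miss.
# With two hardware threads, the core issues from the other thread
# while one is stalled.

def cycles(n_instrs, miss_every, miss_cycles, threads):
    """Cycles to retire n_instrs per thread, one issue per cycle."""
    ready = [0] * threads          # cycle at which each thread may issue
    done = [0] * threads           # instructions retired per thread
    t = 0                          # current cycle
    while min(done) < n_instrs:
        for i in range(threads):
            if done[i] < n_instrs and ready[i] <= t:
                done[i] += 1
                if done[i] % miss_every == 0:
                    ready[i] = t + 1 + miss_cycles   # stall on a miss
                else:
                    ready[i] = t + 1
                break              # only one issue slot per cycle
        t += 1
    return t

single = cycles(100, miss_every=5, miss_cycles=10, threads=1)
smt    = cycles(100, miss_every=5, miss_cycles=10, threads=2)
# Two threads retire twice the instructions in roughly the same time,
# because one thread's compute overlaps the other's stalls.
print(100 / single, 200 / smt)
```

With these (made-up) parameters, per-slot throughput roughly doubles; with low miss rates the second thread mostly just contends for the issue slot, which is the "not a straight-up gimme" point made above.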
  • by edxwelch ( 600979 ) on Friday August 19, 2005 @08:12AM (#13354429)
    They'll need all the help they can get to keep their lead on Intel.
    Intel's 90nm process was a disaster, due to leakage problems.
    According to this article http://www.theinquirer.net/?article=25512 [theinquirer.net], Intel's 65nm process solves some of the leakage problems and is due to be released very soon.
    I get the impression that this will make it on par with AMD's current 90nm process as regards power consumption.
    When the 45nm process comes out, the leakage problem will be completely fixed.

      When the 45nm process comes out, the leakage problem will be completely fixed.

      Yeah, and they'll get the Nobel Prize for that, since power consumption due to leakage increases with the decrease in process size. In fact, it's getting so high in chips currently being designed that the static power consumption is becoming higher than the dynamic power consumption due to signal switching.

      But, Intel will fix it completely with their next process. It'll be easy.
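The static-vs-dynamic point above can be seen with a back-of-envelope calculation using the standard first-order formulas (dynamic switching power P = alpha * C * V^2 * f, static leakage power P = V * I_leak). Every number below is made up purely for illustration; none describes a real process.

```python
# Back-of-envelope comparison of dynamic vs. static (leakage) power.
# All parameter values are illustrative, not measurements of any real chip.

def dynamic_power(alpha, c_farads, v_volts, f_hz):
    # Classic switching-power estimate: P = alpha * C * V^2 * f
    return alpha * c_farads * v_volts ** 2 * f_hz

def static_power(v_volts, i_leak_amps):
    # Leakage power: P = V * I_leak
    return v_volts * i_leak_amps

# Hypothetical "older" process: modest leakage current.
p_dyn_old = dynamic_power(alpha=0.2, c_farads=50e-9, v_volts=1.4, f_hz=2.0e9)
p_stat_old = static_power(v_volts=1.4, i_leak_amps=5.0)

# Hypothetical "smaller" process: V and C shrink and f rises, so dynamic
# power barely grows, but leakage current rises sharply, and static power
# overtakes dynamic power.
p_dyn_new = dynamic_power(alpha=0.2, c_farads=30e-9, v_volts=1.1, f_hz=3.0e9)
p_stat_new = static_power(v_volts=1.1, i_leak_amps=25.0)

print(p_stat_old / p_dyn_old, p_stat_new / p_dyn_new)
```

The shrink cuts V^2 and C, so dynamic power stays in check, but the assumed jump in leakage current flips the static/dynamic ratio past 1, which is the scaling trend the comment describes.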
  • VerHeul's most recent post during his 25-year stint at IBM was head of engineering and technology services.

    I read the first half of this and thought, "wow, this guy has had a weblog for 25 years!"
  • As bad as that is for IBM, it's great to see AMD taking some strides.
  • Cool! (Score:2, Funny)

    by sharkey ( 16670 )
    Opteron and MCA, together at last! What more could anybody want?
  • AMD Lures IBM Veteran to Lead Chip Design

    Change the letters on the sign and put on your ties. Hurry.

  • His new office is a conference room I used to use at AMD's Austin South site.
    It has no windows :(
  • Predicted when IBM hyped the Cell processor a few months ago that you'd do better with an AMD cell processor once AMD hired IBM's managers. Sure enough, you'll do better with an AMD cell processor.

  • Now, he will lead the development of all future AMD computing products, including silicon roadmaps

    Can someone tell me where I might find silicon roadmaps? My roadmaps are all made of paper, and get ruined when I spill coffee on them. Silicon roadmaps would surely solve this problem...
