
AMD Brings Back Athlon K8 Designer as Chief Architect 63

Posted by Unknown Lamer
from the alpha-king-of-architectures dept.
MojoKid writes with exciting news from AMD. From the article: "After more than six months of high-to-mid profile executive departures, AMD has major news to announce on its new executive hire — and he's a welcome addition. Starting today, Jim Keller will serve as a Vice President and the company's Chief Architect for CPU Cores. Keller has spent more than thirty years in the semiconductor business, including a few at AMD. When AMD brought members of DEC's Alpha team aboard in the late 1990s, Keller was one of the CPU architects that came along. Having worked on Alpha's EV5, Jim was lead architect on the first K8 project. Keller moved on and eventually became one of the core members of PA Semi which was bought by Apple in 2007."
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Wednesday August 01, 2012 @12:01PM (#40843377)

    Let's hope he will guide the development team to squeeze out as much per-core performance as possible rather than slap on more and more cores....

    • by Anonymous Coward

      I think you need both: a good, scalable multi-core chip *and* efficient cores.

      • The question is, can he turn the company around? I've seen a noticeable gap in the offerings from AMD in recent months, and given past performance, have wondered WTH has been going on. I say this as someone who is running an FX-8150: where is my upgrade path when Intel bangs out something new?

      • by Iniamyen (2440798)
        Poo on efficiency; I want to be able to use my computer as a space heater, like I did with my Athlon XP!
    • rather than slap on more and more cores....

      Yes, more performance per core is always good, since some important tasks are inherently serial and almost certainly impossible to parallelize.

      However, if your workload is parallelisable, then AMD provide some very good options. If you're buying a dual or quad socket machine, then you're going to get at a minimum 8 cores (probably more like 12, 16, 24 or 32), which means that you expect your workload to be quite parallel (if not, you'd be best off dumping all the mon
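      The serial-vs-parallel tradeoff these posts are arguing about is just Amdahl's law: the speedup from adding cores is capped by the fraction of the workload that stays serial. A minimal Python sketch (function name and the 95%-parallel figure are illustrative, not from the thread):

      ```python
      def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
          """Maximum speedup for a workload where `parallel_fraction` of the
          runtime can be spread evenly across `cores` cores; the rest is serial."""
          serial = 1.0 - parallel_fraction
          return 1.0 / (serial + parallel_fraction / cores)

      # Even a 95%-parallel workload can never exceed 1/0.05 = 20x,
      # no matter how many cores you slap on:
      for cores in (4, 8, 16, 32):
          print(cores, round(amdahl_speedup(0.95, cores), 2))
      ```

      This is why both posters are right: more cores help parallel workloads a lot, but only better per-core performance helps the serial part.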

  • by Trepidity (597) <delirium-slashdotNO@SPAMhackish.org> on Wednesday August 01, 2012 @12:31PM (#40843825)

    Keller didn't just accidentally end up at Apple as the result of the purchase of PA Semi; the consensus is that PA Semi was specifically bought to acquire the team led by Keller, more so than because Apple cared about the company itself (i.e. it was what startups these days like to call an "acquihire"). Keller headed the A4/A5 design at Apple (the system-on-a-chip in the iPhone and iPad), so there's now a noticeable staffing gap if they plan to continue in-house development of their mobile chips.

    • by Ironhandx (1762146) on Wednesday August 01, 2012 @12:57PM (#40844295)

      Besides that, this guy designed the K8, one of the most under-rated chips of all time, and set the direction for AMD that led to the AMD64 chip that had Intel flubbing about looking for an answer for nearly two whole years, all from a company that at the time was struggling to stay afloat.

      AMD's entire cash-on-hand balance can be credited almost directly to this man.

      I'm not what you'd call a fanboy, but I am a fan of AMD products (in particular since they acquired ATI). Let's hope this guy can bring some of the bang back.

      • by GungaDan (195739)

        Wasn't that the K6 and K6-II that preceded the AMD64 and early dual-core Athlons? K8, IIRC, was the disappointment known as "Phenom."

        • by Desler (1608317)

          Phenom is the K10.

        • by Ironhandx (1762146) on Wednesday August 01, 2012 @01:50PM (#40845229)

          Besides people correcting you: the Phenom and Phenom II chips were by no means "bad," excepting a few pricing faux pas where new parts were priced too high out of the gate for their relative performance.

          There are actually very few (nearly none) video games out there that max out any CPU these days. It's all GPU now. The physics engines pretty much do as much as they are going to, and all that's left is to make it shinier.

          I bought my FX4100 purely because of how quiet I can make this thing run. The loudest thing in it is the 800 RPM PSU fan and the computer is overclocked.

          Which actually brings me to a pet peeve about Bulldozer. They've underclocked these chips by a LOT. It's actually ridiculous what they've done to themselves. This computer idles at room temp and hits maybe 40 Celsius on chip temp under load, and it's overclocked about 600 MHz, which yields something like a 25% total increase in performance. In other words it's competitive with an i5 or low-end i7s for $100 or more less. The best part is any idiot can overclock it with one of these new UEFI boards from Asus; they don't even have to know how to install software or navigate a BIOS by keyboard.

          • by mellyra (2676159)

            There are actually very few (nearly none) video games out there that max out any CPU these days. It's all GPU now.

            MMOs are more popular than ever and are very CPU heavy. RIFT has to be one of the worst CPU hogs I know (my 2.8 GHz Phenom II X4 is at the very bottom end of acceptable processors), and even WoW profited from going 64-bit in benchmarks (definitely not due to memory, but 32-bit WoW does not require SSE2 IIRC, so the performance gain is probably a mix of being able to rely on more modern instruction sets and having more registers available).

            • Rift was running at 60 fps on my 3800+ AMD Athlon X2 with a 4890 video card attached, on pretty high settings; all I had to do was turn shadows down. As such, I find your claims incredibly hard to believe, and I keep that board and processor around purely because of folks like you. It may say that on the requirements for the game, but the game itself in no way requires that.

              WoW after updates runs the same.

              EVE Online is BY FAR the biggest CPU hog out there because largeish portions of its GFX engine still run

          • by rrohbeck (944847)

            Yup. They were designed for high clocks and run very well at high clocks. It's purely TDP that keeps AMD from spec'ing them for much higher clocks.

            • See, I wouldn't think TDP would be a problem. If I were willing to let the proc run at 50 Celsius under load, which was previously the norm for AMD, I'd probably have it cruising well north of 4 GHz.

              Is it just that, due to some manufacturing flaws, an unacceptable number of the chips wouldn't be able to sustain those clocks?

              • by rrohbeck (944847)

                The 8-cores are thermally limited. I have an 8150 that has high leakage, i.e. runs HOT. It's a GloFo process issue.

                • They've sold themselves incredibly short on their 4-core chips then. Very few (if any) apps make use of more than 4 cores, and those that do see very limited benefit from increasing the number of cores.

                  They could have had a dazzler on their hands in their 4-core line. I'm saying the business-marketing assholes caused this because "But but 8 cores obviously has to be faster than 4!!!! We need to limit the 4 core so the 8's are faster... mhmmm."

                  It's either that or an incredibly inept design dept. Either of these thin

      • by serviscope_minor (664417) on Wednesday August 01, 2012 @01:22PM (#40844765) Journal

        K8, one of the most under-rated chips of all time

        You misspelled "kept from market dominance by illegal abuse of monopoly by Intel".

      • by Wolfrider (856)

        Well, I certainly hope he can help them out, since I remember the K6 was one of the worst abortions I've ever seen for a CPU chip. Honestly, it was pretty godawful. Nothing against AMD the company tho, but I really wish their Linux drivers were at least up to par with Nvidia's.

        • The K6-2 was actually a better performer than the Pentium II. The K6's problem was that Intel was already up to their shenanigans in those days and sprung SSE on the marketplace while AMD knew nothing about it, but pretty much every other tech company had been told about it and forced to abide by NDAs etc. in exchange for access and some kickbacks from Intel.

          The K6 itself trounced the Pentium on raw performance metrics and stability; it just wouldn't RUN certain programs, and in particular most games of the da

  • by Lord Lode (1290856) on Wednesday August 01, 2012 @12:38PM (#40843953)

    Go design some awesome CPUs and bring back the competition between Intel and AMD!

  • Do it to it!! The faithful still agree AMD > INTEL
    • Disagree. Besides killing AMD in the CPU realm, Intel is pretty much killing them in the open source graphics department too. Intel gets their drivers released *before* the actual hardware. That's good stuff man.
      • Re:SWEET! (Score:4, Funny)

        by halltk1983 (855209) <halltk1983@yahoo.com> on Wednesday August 01, 2012 @04:33PM (#40847823) Homepage Journal
        Well, to be fair, Intel doesn't have to write drivers that work at over 15 fps...
        • Funny, and kind of true. In Windows, AMD blows the crap out of Intel graphics, always. On Linux, well, the Southern Islands chips from AMD still don't work after 6 months. With the pre-SI chips, Intel and AMD swap benchmark victories, with AMD's varying wildly depending on the exact chip. Intel is just more consistent across the board. Supposedly AMD will do a better job of that starting with the next generation, but we'll have to wait and see.
  • All in all, this is a good thing for AMD, and not nearly so good for Apple. The litany of execs leaving AMD of late has caused SemiAccurate to say that nothing good can come of this far too often for our liking lately. This time, all we can say is that a lot of good can and will come of this. Jim Keller is one of the good ones, and you don't leave directorships at Apple without a damn good reason.

    http://semiaccurate.com/2012/08/01/apples-cpu-architect-jim-keller-moves-back-to-amd/ [semiaccurate.com]

    P.S. I liked the subheading

  • The problem now is it'll take 2, 3, or maybe even 4 years for this guy's work to show up. Hopefully he'll help AMD turn the corner, but Intel won't exactly be sitting around waiting for them to catch up.

