AMD Brings Back Athlon K8 Designer as Chief Architect
MojoKid writes with exciting news from AMD. From the article: "After more than six months of high- to mid-profile executive departures, AMD has major news to announce on its new executive hire — and he's a welcome addition. Starting today, Jim Keller will serve as a Vice President and the company's Chief Architect for CPU Cores. Keller has spent more than thirty years in the semiconductor business, including a few at AMD. When AMD brought members of DEC's Alpha team aboard in the late 1990s, Keller was one of the CPU architects who came along. Having worked on Alpha's EV5, Jim was lead architect on the first K8 project. Keller moved on and eventually became one of the core members of PA Semi, which was bought by Apple in 2008."
From "more cores" mantra to "smarter cores" (Score:3, Insightful)
Let's hope he guides the development team to squeeze out as much per-core performance as possible rather than just slapping on more and more cores...
Re: (Score:1)
I think you need both: a good, scalable multi-core chip *and* efficient cores.
Re: (Score:2)
The question is, can he turn the company around? I've seen a noticeable gap in the offerings from AMD in recent months, and given past performance, have wondered WTH has been going on. I say this as someone who is running an FX-8150: where is my upgrade path when Intel bangs out something new?
Re: (Score:2)
rather than just slapping on more and more cores...
Yes, more performance per core is always good, since some important tasks are inherently serial and almost certainly impossible to parallelize.
However, if your workload is parallelisable, then AMD provides some very good options. If you're buying a dual- or quad-socket machine, then you're going to get a minimum of 8 cores (probably more like 12, 16, 24, or 32), which means you expect your workload to be quite parallel (if not, you'd be best off dumping all the money into the fastest individual cores you can get instead).
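To put rough numbers on the serial-fraction point, here is a minimal Amdahl's-law sketch in C; the 90%-parallel workload split is made up purely for illustration:

```c
/* Amdahl's law: speedup = 1 / ((1 - p) + p / n),
 * where p is the parallelisable fraction of the work and n is the
 * core count.  The p = 0.90 split below is a made-up example. */
#include <stdio.h>

static double amdahl(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void) {
    double p = 0.90;                 /* hypothetical: 90% parallelises */
    int cores[] = {1, 4, 8, 16, 32};
    for (int i = 0; i < 5; i++)
        printf("%2d cores -> %.2fx speedup\n", cores[i], amdahl(p, cores[i]));
    /* 32 cores only yields ~7.8x here: the serial 10% dominates,
     * which is why per-core performance still matters. */
    return 0;
}
```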
Re:I'm waiting for the next version (Score:5, Insightful)
Haha, the K10 is long out, and while not bad, it is nowhere near as revolutionary as the K8. Of course, the K8 was competing against the P4, which could be called a dog, or at least an easy target.
If he can repeat what he did with the K8 (and if it was indeed due to his leadership), then AMD again has a chance. If not, they will end up like VIA and all the other Intel competitors, somewhere in a niche market.
Re: (Score:2)
Protip: if you want to make a <Chicago> tag, use &lt;Chicago&gt;
interesting loss from the other side (Score:5, Interesting)
Keller didn't just accidentally end up at Apple as the result of the purchase of PA Semi; the consensus is that PA Semi was specifically bought to acquire the team led by Keller, more so than Apple caring about the company itself (i.e. it was what startups these days like to call an "acquihire"). Keller headed the A4/A5 design at Apple (the system-on-a-chip in the iPhone and iPad), so there's now a noticeable staffing gap if they plan to continue in-house development of their mobile chips.
Re:interesting loss from the other side (Score:4, Interesting)
Besides that, this guy designed the K8, one of the most under-rated chips of all time, and set the direction that led to the AMD64 chip, which had Intel flubbing about looking for an answer for nearly two whole years, and all from a company that at the time was struggling to stay afloat.
AMD's entire cash-on-hand balance can be credited almost directly to this man.
I'm not what you'd call a fanboy, but I am a fan of AMD products (in particular since they acquired ATI). Let's hope this guy can bring some of the bang back.
Re: (Score:1)
Wasn't that the K6 and K6-II that preceded the AMD64 and the early dual-core Athlons? K8, IIRC, was the disappointment known as "Phenom."
Re: (Score:1)
Phenom is the K10.
Re:interesting loss from the other side (Score:4, Insightful)
Besides the corrections people have already made, the Phenom and Phenom II chips were by no means "bad", excepting a few pricing faux pas where new parts were priced too high out of the gate for their relative performance.
There are actually very few (nearly none) video games out there that max out any CPU these days. It's all GPU now. The physics engines pretty much do as much as they're going to, and all that's left is to make it shinier.
I bought my FX-4100 purely because of how quietly I can make this thing run. The loudest thing in it is the 800 RPM PSU fan, and the computer is overclocked.
Which actually brings me to a pet peeve about Bulldozer: they've underclocked these chips by a LOT. It's actually ridiculous what they've done to themselves. This computer idles at room temperature and hits maybe 40 Celsius on the chip sensor under load, and it's overclocked by about 600 MHz, which yields something like a 25% total increase in performance. In other words, it's competitive with an i5 or a low-end i7 for $100 or more less. The best part is that any idiot can overclock it with one of these new UEFI boards from Asus; they don't even have to know how to install software or navigate a BIOS by keyboard.
Re: (Score:1)
There are actually very few (nearly none) video games out there that max out any CPU these days. It's all GPU now.
MMOs are more popular than ever and are very CPU-heavy. RIFT has to be one of the worst CPU hogs I know (my 2.8 GHz Phenom II X4 is at the very bottom end of acceptable processors), and even WoW profited from going 64-bit in benchmarks (definitely not due to memory; 32-bit WoW does not require SSE2, IIRC, so the performance gain is probably a mix of being able to rely on more modern instruction sets and having more registers available).
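For what it's worth, the "more registers / newer instruction sets" point is easy to see from the compiler's side. A sketch, assuming standard GCC flags; the loop is purely illustrative, not anything from WoW:

```c
/* x86-64 guarantees SSE2 as a baseline and doubles the register file
 * (16 general-purpose and 16 XMM registers versus 8 and 8 in 32-bit
 * mode), so the compiler can keep more values live without spilling.
 * Compare the generated assembly:
 *
 *   gcc -O2 -m32 -S sum.c   # i386 baseline: x87 FP, 8 GPRs
 *   gcc -O2 -m64 -S sum.c   # x86-64: SSE2 FP, 16 GPRs, 16 XMM regs
 */
float sum(const float *a, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; i++)   /* vectorisable once SSE2 is assumed */
        s += a[i];
    return s;
}
```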
Re: (Score:2)
RIFT was running at 60 fps on my AMD Athlon X2 3800+ with a 4890 video card attached, on pretty high settings; all I had to do was turn shadows down. As such, I find your claims incredibly hard to believe, and I keep that board and processor around purely because of folks like you. It may say that in the requirements for the game, but the game itself in no way requires it.
WoW, after updates, runs the same.
EVE Online is BY FAR the biggest CPU hog out there, because largish portions of its GFX engine still run on the CPU.
Re: (Score:2)
Yup. They were designed for high clocks and run very well at high clocks. It's purely TDP that keeps AMD from spec'ing them for much higher clocks.
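Back-of-the-envelope, here is why TDP bites: dynamic CMOS power scales with frequency and with the square of voltage, and higher clocks usually need more voltage. A sketch; every constant below is invented for illustration, not an FX-8150 spec:

```c
/* Classic dynamic-power model: P = C * V^2 * f (leakage comes on top).
 * All numbers are made up to show the scaling, nothing more. */
#include <stdio.h>

int main(void) {
    double C = 1.0;       /* effective switched capacitance, arbitrary units */
    double base_v = 1.2;  /* hypothetical stock voltage (V) */
    double base_f = 3.6;  /* hypothetical stock clock (GHz) */
    double base_p = C * base_v * base_v * base_f;

    /* Suppose a 20% clock bump needs a 10% voltage bump to stay stable. */
    double oc_v = base_v * 1.10;
    double oc_f = base_f * 1.20;
    double oc_p = C * oc_v * oc_v * oc_f;

    printf("power rises %.0f%% for a 20%% clock increase\n",
           (oc_p / base_p - 1.0) * 100.0);
    /* ~45% more heat for 20% more clock: an 8-module die blows through
     * its TDP budget long before a 4-core does. */
    return 0;
}
```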
Re: (Score:2)
See, I wouldn't think TDP would be a problem. If I were willing to let the proc run at 50 Celsius under load, which was previously the norm for AMD, I'd probably have it cruising well north of 4 GHz.
Is it just that, due to manufacturing flaws, an unacceptable share of the chips wouldn't be able to sustain those clocks?
Re: (Score:2)
The 8-core parts are thermally limited. I have an 8150 that has high leakage, i.e. it runs HOT. It's a GloFo process issue.
Re: (Score:2)
They've sold themselves incredibly short on their 4-core chips, then. Very few (if any) apps make use of more than 4 cores, and those that do see very limited benefit from increasing the number of cores.
They could have had a dazzler on their hands in their 4-core line. I'm saying the business-marketing assholes caused this, because "but 8 cores obviously has to be faster than 4!!!! We need to limit the 4-core so the 8s are faster... mhmmm."
It's either that or an incredibly inept design department. Either of these things...
Re:interesting loss from the other side (Score:5, Insightful)
You misspelled "kept from market dominance by illegal abuse of monopoly by Intel".
Re:interesting loss from the other side (Score:5, Insightful)
I did. Thank you for fixing that for me.
If it weren't for the pre-existing exclusivity deals and Intel's pushes against it at the time, Intel probably would have been forced to burn through its entire cash-on-hand supply to catch up, rather than just half of it.
Re: (Score:2)
Thank you for injecting a random bit of allegation into the conversation. Out of interest (sincerely), can you back that up?
Re: (Score:2)
I'm actually quite glad that they quit fighting for the "OMG WE HAVE THE BEST $1000 CPU" crown...
Re: (Score:2)
Well, I certainly hope he can help them out, since I remember the K6 as one of the worst abortions I've ever seen in a CPU. Honestly, it was pretty godawful. Nothing against AMD the company, though, but I really wish their Linux drivers were at least on par with Nvidia's.
Re: (Score:2)
The K6-2 was actually a better performer than the Pentium II. The K6's problem was that Intel was already up to its shenanigans in those days and sprang SSE on the marketplace while AMD knew nothing about it, even though pretty much every other tech company had been told about it and forced to abide by NDAs etc. in exchange for access and some kickbacks from Intel.
The K6 itself, on raw performance metrics and stability, trounced the Pentium; it just wouldn't RUN certain programs, and in particular most games of the day.
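That "wouldn't RUN" failure mode is exactly what runtime feature detection avoids: probe the CPU before taking an SSE code path instead of assuming it. A minimal sketch using GCC's __builtin_cpu_supports(); the two transform functions are hypothetical stand-ins:

```c
/* Probe for SSE at runtime and fall back to plain x86 on older parts
 * (e.g. a K6), instead of dying on an illegal-instruction fault. */
#include <stdio.h>

static void transform_sse(float *v, int n)    { (void)v; (void)n; /* SSE path stub */ }
static void transform_scalar(float *v, int n) { (void)v; (void)n; /* scalar fallback stub */ }

void transform(float *v, int n) {
    if (__builtin_cpu_supports("sse"))
        transform_sse(v, n);        /* Pentium III and later */
    else
        transform_scalar(v, n);     /* K6 and other pre-SSE CPUs */
}

int main(void) {
    float v[4] = {0};
    transform(v, 4);
    puts(__builtin_cpu_supports("sse") ? "took SSE path" : "took scalar path");
    return 0;
}
```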
Yay! (Score:3)
Go design some awesome CPUs and bring back the competition between Intel and AMD!
Semi-Accurate comments (Score:2)
http://semiaccurate.com/2012/08/01/apples-cpu-architect-jim-keller-moves-back-to-amd/ [semiaccurate.com]
P.S. I liked the subheading
Now we wait (Score:1)
The problem now is that it'll take 2, 3, or maybe even 4 years for this guy's work to show up. Hopefully he'll help AMD turn the corner, but Intel won't exactly be sitting around waiting for them to catch up.