AMD Layoffs Maul Marketing, PR Departments

Posted by timothy
from the big-ship-turns-slowly dept.
MojoKid writes "AMD's initial layoff announcement yesterday implied that the dismissals would occur across the company's global sales force. While that may still be true, it has become clear that AMD has slashed its PR and Marketing departments in particular. The New Product Review Program (NPRP) has lost most of its staff, and a Graphics Product Manager who played an integral role in rescuing AMD's GPU division after the disaster of R600 also got the axe. Key members of the FirePro product team are gone as well. None of the staff had any idea that the cuts were coming, or that they'd be concentrated so heavily in certain areas. These two departments may not design products, but they create and maintain vital lines of communication between the company, its customers, and the press."

Comments Filter:
  • by Sycraft-fu (314770) on Friday November 04, 2011 @06:32PM (#37953504)

    The reason it was a disaster was the nVidia GeForce 8800. ATi was pretty sure that nVidia was still going to stick with the old style of cards, with separate shaders, for their first DirectX 10 part. That is allowed, though not ideal (the programming interface has to be unified, not the hardware; see the sketch at the end of this comment for why unified hardware tends to win). ATi already had experience with unified shaders from the 360.

    So by all accounts their up-and-coming GPU, while not spectacular, was going to be fine against nVidia. Then out of the blue nVidia drops the 8800; they did a real good job keeping a lid on it. Fully unified architecture that was fast as hell. We are talking often twice as fast as previous-generation stuff, and that was on DirectX 9 titles, never mind what it'd be able to do with the newer APIs.

    So ATi had to delay their release a bit and try to get something to compete better. When the R600 did launch as the Radeon HD 2000 series, it wasn't good competition.

    However ATi recovered very well with the Radeon 4000 and 5000 series. The 4000 series were extremely competitive cards. Good prices, good performance, low power usage, etc. Then the 5000 series were the first DX11 cards on the market by a number of months, and also great performers.
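
    A minimal sketch of the unified-vs-separate shader point above, with invented unit counts and workloads (a toy model, not any real GPU's scheduler):

        # Toy model: why unified shaders beat a fixed vertex/pixel split.
        # All unit counts and workload numbers are invented for illustration.

        def frame_time_separate(vertex_work, pixel_work, vertex_units=8, pixel_units=24):
            """Fixed partition: each unit type can only run its own shader kind."""
            # Both pools run in parallel; the slower pool gates the frame.
            return max(vertex_work / vertex_units, pixel_work / pixel_units)

        def frame_time_unified(vertex_work, pixel_work, units=32):
            """Unified pool: every unit can run either shader type."""
            return (vertex_work + pixel_work) / units

        for name, vw, pw in [("balanced", 240, 720), ("vertex-heavy", 640, 320)]:
            print(f"{name:12s} separate={frame_time_separate(vw, pw):5.1f} "
                  f"unified={frame_time_unified(vw, pw):5.1f}")
        # balanced:     separate=30.0 unified=30.0 (fixed split is fine here)
        # vertex-heavy: separate=80.0 unified=30.0 (idle pixel units waste the frame)

    When the workload matches the fixed split, the two come out even; when it doesn't, the unified pool keeps every unit busy while the fixed design leaves hardware idle.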

  • by Anonymous Coward on Friday November 04, 2011 @07:08PM (#37953748)

    After spending lots of area and design time on the R600 to make this "ring-bus" to get good memory performance, basically someone at ATi f'd up and implemented the R600 ROP without a pipeline: get a batch of pixels, crunch on it, output it, and only then fetch the next batch, instead of a pipelined loop where you fetch batch N+1 while crunching batch N and outputting batch N-1. Although perfectly functional, the perf sucked big time compared to the nVidia 8800, which was available at about the same time and didn't make that kind of silly mistake. (There's a sketch of the difference just after this comment.)

    Through lots of software hacks and their marketing group twisting developer arms (having developers do massively custom AA modes or huge shaders where the abysmal ROP performance didn't matter as much), they managed to salvage the situation from their crappy design mistake. This was highly fortunate, as OEMs that purchase the midrange chips often use game benchmarks to select cards for various price points, and if the game benchmarks had showed, say, 1/3 the perf of a comparable nVidia card, they wouldn't have sold many cards. That would probably have happened if all the benchmarks were ROP limited and they hadn't used lots of MRT hacks to get better perf out of their ROP.

    Since ATI was losing money at that time, it might have been the end of the road for them. They had just made an aborted R500 design (which they eventually salvaged by selling it to MSFT for the Xbox 360) and they were hoping to have a killer product on their hands, laboring under the illusion that nVidia wouldn't show up with a unified-shader DX10 part. The resultant R600 wasn't good for ATI (the slow ROP made for bad benchmark scores, and nVidia's G80 design was unified DX10 despite what the pundits thought at the time), but it saved them long enough to be bought by AMD...

    -Anon
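
    A minimal sketch of the pipelined vs. non-pipelined ROP difference described above (stage costs are invented; real ROPs are far more complicated):

        # Toy model of the ROP behavior described above; stage costs are made up.
        FETCH, CRUNCH, OUTPUT = 1, 1, 1  # hypothetical cycles per stage

        def rop_cycles_serial(batches):
            """Non-pipelined: each batch fully finishes before the next starts."""
            return batches * (FETCH + CRUNCH + OUTPUT)

        def rop_cycles_pipelined(batches):
            """Pipelined: after the fill latency, one batch retires per cycle."""
            return (FETCH + CRUNCH + OUTPUT) + (batches - 1)

        n = 1000
        print(rop_cycles_serial(n))     # 3000 cycles
        print(rop_cycles_pipelined(n))  # 1002 cycles, roughly 3x the throughput

    With three stages, the serial version lands at roughly a third of the pipelined throughput, which lines up with the "1/3 the perf" benchmark gap described above.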

  • Re:Good? (Score:5, Informative)

    by Chris Burke (6130) on Friday November 04, 2011 @07:57PM (#37954154) Homepage

    AMD did start building fabs when the Athlon64 and Opteron were kicking ass all over, and when their projections of market share showed that they would be fab limited -- which for a while, they were.

    The problem is that when they opened up the flood gates on their production capacity, the market share didn't follow. It bumped slightly, but not nearly enough to justify the massive investment in the fabs, wrecking their financials and ultimately forcing them to spin off the fabs as GlobalFoundries. This was due to the backroom deals Intel had with OEMs limiting the number of AMD parts they could sell.

    This is the essence of AMD's lawsuit against Intel and the anti-trust rulings by Japan, South Korea, and the EU.

  • Re:Amazing (Score:5, Informative)

    by gman003 (1693318) on Friday November 04, 2011 @08:49PM (#37954468)

    > From what I understand, Bulldozer isn't designed poorly - the implementation is just lacking. Sounds to me like they pushed a beta product out for quarterly product presence, but the real product isn't far behind...

    Actually, a huge part of Bulldozer's problem is marketing lies. The architecture is very interesting - it's based on a "module" made up of an instruction fetcher/decoder, two integer cores, a shared floating-point core, and two levels of cache. The effect is comparable to Intel's Hyper-Threading, even if the implementation is different. A four-module Bulldozer chip is comparable to a hyper-threaded quad-core Intel chip - it can ALWAYS run four threads at once, and can theoretically reach eight. (There's a sketch of the trade-off at the end of this comment.)

    The problem is, AMD didn't market it that way. They marketed their four-module chips as 8-core, and their two-module chips as quad-core. Which isn't, technically, lying - they do have that many integer cores - but that marketing caused problems when benchmarks came out. People saw "AMD 8-core chip beaten by Intel 4-core chip" and thought "man, those cores must suck BALLS. And since even I know that a lot of programs are still single-threaded, it really makes no sense for me to buy an AMD chip right now."

    It's almost justice, seeing the marketers fired for this. They stretched the truth beyond what the public would believe, and it bit them in the ass.

    The other problem with Bulldozer is pricing - Bulldozer chips, at least right now, are ~$30 more expensive than the comparable Sandy Bridge processor. Sure, you'll quite likely save twice that if you're upgrading, since Bulldozer is mostly compatible with older motherboards while Intel is still thrashing sockets, but that's not going to be the case for everyone.
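
    A minimal sketch of that module trade-off, with the layout hard-coded (a toy model, not AMD's actual scheduling):

        # Toy model: full-speed thread capacity of a 4-module Bulldozer-style chip.
        # Each module has 2 integer cores but only 1 shared floating-point unit.
        MODULES = 4

        def full_speed_threads(workload):
            """How many threads run without contention, by workload type."""
            if workload == "integer":
                return MODULES * 2  # independent integer cores: the "8-core" claim
            if workload == "floating-point":
                return MODULES * 1  # two cores share one FPU: effectively quad-core
            raise ValueError(workload)

        print(full_speed_threads("integer"))         # 8
        print(full_speed_threads("floating-point"))  # 4

    Which is exactly why integer-heavy benchmarks made it look like eight cores while FP-heavy ones made it look like four.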
