AMD Layoffs Maul Marketing, PR Departments
MojoKid writes "AMD's initial layoff announcement yesterday implied that the dismissals would occur across the company's global sales force. While that may still be true, it has become clear that AMD has slashed its PR and Marketing departments in particular. The New Product Review Program (NPRP) has lost most of its staff, and a Graphics Product Manager who played an integral role in rescuing AMD's GPU division after the disaster of R600 also got the axe. Key members of the FirePro product team are also gone. None of the staff had any idea that the cuts were coming, or that they'd be focused so heavily in certain areas. These two departments may not design products, but they create and maintain vital lines of communication between the company, its customers, and the press."
Bye markedroids (Score:1, Insightful)
Honestly they haven't been performing and it's understandable they got the axe. Maybe now AMD can focus on product rather than image.
Re:Bye markedroids (Score:4, Insightful)
Honestly they haven't been performing and it's understandable they got the axe. Maybe now AMD can focus on product rather than image.
In my experience, image sells more often than brand. In particular, image establishes brand, for what it's worth.
These look like the sort of cuts made by a company under particular stress. Not encouraging.
Re: (Score:3)
In my experience, image sells more often than brand. In particular, image establishes brand, for what it's worth.
Yes, but the people doing that for AMD haven't exactly been doing a stellar job over the years... Their marketing messages have been constantly changing, and each version was a muddled mess.
Not saying they deserved to be sacked or anything... Just, marketing is not one of AMD's strengths, and I don't think this will cost them as much as one might think.
These look like the sort of cuts made by a company under particular stress. Not encouraging.
I'm encouraged that they cut heavily on marketing and less on R&D, rather than the reverse. That'd imply they aren't planning on being competitive, eve
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
... I really doubt marketing is REALLY needed that much when you are already selling out of chips, do you?
I'll be watching to see if this has any effect on their sales. If this keeps up, then performance and reputation is what sells (as I've always suspected), not pissing money into the wind on advertising and PR.
Re: (Score:3)
Personally, I hate marketing. I hated commercials when I watched television, I hate adverts in my newspaper, I hate them on the tubez. Today, with this wonderful internet we have, when I want something, I start searching.
My youngest kid decided that a bike would be cool. He thought about a Harley. I told him that A: Harley is overpriced by an order of magnitude, and B: V-twins suck ass. He did some research, he half believed me, but still, the offer of a trade was just too good to pass up.
Now, three mon
Re: (Score:2)
Re: (Score:2)
The people that run the business at AMD are all geeks, and very good ones.
That said, AMD is probably the first company that gets it (next to Apple, for that matter) in the sense that the product you're making should sell itself.
I'm not a fan of Apple at all, but the point here is that I've never bought something because of an emotional response (marketing) but rather because I wanted something for a reason, like: I need something to do x, what is it and where can I get it?
AMD has already done lots of th
Re: (Score:2)
Honestly they haven't been performing and it's understandable they got the axe.
The products haven't been performing lately either - hell, the marketing people did their job too well, and got us thinking that Bulldozer might actually be worth waiting for. Oops.
What is the disaster of R600? (Score:4, Interesting)
This was back when they were a separate company (Score:5, Informative)
The reason it was a disaster was the nVidia GeForce 8800. ATi was pretty sure that nVidia was still going to stick with the old style of cards, with separate shaders, for their first DirectX 10 part. That is allowed, though not ideal (the programming interface has to be unified, not the hardware). ATi already had experience with unified shaders from the 360.
So from all accounts their not-so-great up-and-coming GPU was going to be fine against nVidia. Then out of the blue nVidia drops the 8800; they did a real good job keeping a lid on it. Fully unified architecture that was fast as hell. We are talking often twice as fast as previous-generation stuff, and that was on DirectX 9 titles, never mind what it'd be able to do with the newer APIs.
So ATi had to delay their release a bit and try to get something to compete better. When the R600 did launch as the Radeon 2000 series, it wasn't good competition.
However ATi recovered very well with the Radeon 4000 and 5000 series. The 4000 series were extremely competitive cards. Good prices, good performance, low power usage, etc. Then the 5000 series were the first DX11 cards on the market by a number of months, and also great performers.
Re:This was back when they were a separate company (Score:5, Interesting)
Excellent point. It's also worth pointing out that the 8800 survived for five years as a very viable card. Released in 2006, it's still listed as a minimum requirement for many games today (including Battlefield 3). That's quite a feat considering how fast technology matures in this market. In 2009 the 8800-class cards were still selling north of $120, and while not mind blowing by today's standards, were pretty much the gold standard until mid-2008. It's hard to compete against that kind of technology.
Re: (Score:2)
Re: (Score:2)
> However ATi recovered very well with the Radeon 4000 and 5000 series. The 4000 series were extremely competitive cards. Good prices, good performance, low power usage, etc. Then the 5000 series were the first DX11 cards on the market by a number of months, and also great performers.
Yup. What's "ironic" is that the 6970 is significantly _slower_ than the 5970, since the 6000 series was even more about low power usage.
It will be interesting to see if the 7000 series focuses more on performance or on power efficiency.
Re: (Score:1)
Yup. What's "ironic" is that the 6970 is significantly _slower_ than the 5970
Because one is a dual-GPU part and the other is not. The 6-series equivalent to the 5970 is the 6990, not the 6970, as any sane person would expect.
Re:This was back when they were a seperate company (Score:5, Funny)
The 6 series equivalent to the 5970 is the 6990, not the 6970 like any sane person would expect.
And who's to blame for that? Marketing.
Re: (Score:2)
Still, even counting all high-performance graphics cards, those numbers aren't really the big numbers. Keeping its focus on design rather than marketing, AMD, especially after buying ATI, might be focusing on a CPU with a high-powered embedded GPU, with separate memory caching but combined main-memory usage. The demand is pent up and growing for high-powered portable graphics-based devices: tablets, netbooks, notebooks, even more efficient smaller desktops. That's where the big numbers are. The first to crack a real
That's on account of renumbering BS (Score:2)
For whatever reason, AMD decided that they wanted to mess with their numbering scheme. In the 5000 and 4000 series, the x8xx part was the high-end single-GPU card, the x7xx was the lower range (roughly half as many shaders and so on), and the x9xx part was the dual-GPU part: two actual GPUs on one board.
Well, for the 6000 series, they changed it. The x8xx range is the same as the x7xx range from the 5000 series, and the x9xx the same as the x8xx. So the 6970 is now the highest-end single-GPU card, and is equivalent in the line
Indeed (Score:2)
ATI seems to know how to produce GPUs.
Re: (Score:1)
Ironically, the Datsun 1800 my dad purchased in 1978 (used, for ~2 grand) produces more megagiggles today.
Re: (Score:1)
Re: (Score:2, Interesting)
R600 was a huge, hot, and expensive design. It had to be delayed due to it being impossible to release on the 65nm process that was available at the time, and it barely fit on the 55nm half-node either.
All AMD (ATI) cards released after R600 have been built from the ground up to target the mainstream market, whereas in the past they would create big monolithic dies and then cut them down to fit the lower markets. The enthusiast slots from AMD are now filled by dual-GPU cards.
A parallel would be Intel moving
Re:What is the disaster of R600? (Score:4, Informative)
After spending lots of area and design time on the R600 to make this "ring-bus" to get good memory performance, basically someone at Ati f'd up and accidentally implemented the design of the R600 ROP w/o a pipeline (basically get a batch of pixels, crunch on it, output it, instead of pipelined like get a batch of pixels, crunch on it, and get the next batch of pixels, output the first batch, crunch on the second batch, get the third batch, etc, etc.). Although perfectly functional, the perf sucked big time (compared to the nvidia 8800 which was available about the same time and didn't make that kind of silly mistake).
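The pipelined vs. unpipelined ROP behavior described above can be sketched with a toy throughput model. This is purely illustrative: the three-stage split (fetch/crunch/output) comes from the parent comment, and the one-cycle-per-stage cost is an assumption, not real R600 timing.

```python
# Toy model: an unpipelined unit runs each batch to completion before
# starting the next; a pipelined unit overlaps the stages.

def unpipelined_cycles(batches, stages=3):
    # fetch -> crunch -> output, one batch at a time: stages * batches cycles.
    return stages * batches

def pipelined_cycles(batches, stages=3):
    # After the pipe fills (stages cycles), one batch completes per cycle.
    return stages + (batches - 1)

for n in (1, 10, 1000):
    print(n, unpipelined_cycles(n), pipelined_cycles(n))
```

At steady state the unpipelined unit approaches one-third the throughput of the pipelined one, which lines up with the "1/3 the perf" ballpark mentioned below for ROP-limited benchmarks.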
Through lots of software hacks and their marketing group twisting developer arms (having developers do massively custom AA modes or huge shaders where the abysmal rop performance didn't matter as much), they managed to salvage the situation from their crappy design mistake... This was highly fortunate as OEMs that purchase the midrange chips often use game benchmarks to select cards for various price points and if the game benchmarks showed say 1/3 the perf of a comparable nvidia card, they wouldn't sell many cards. That would have probably happened if all the benchmarks were ROP limited and they didn't use lots of MRT hacks to get better perf out of their ROP.
Since ATI was losing money at that time, it may have been the end of the rope for them. They had just made an aborted R500 design (which they eventually salvaged by selling it to MSFT for Xbox360) and they were hoping to have a killer product on their hands and suffering through the illusion that nvidia wouldn't show up with a unified shader DX10 part. The resultant R600 wasn't good for ATI (bad slow rop made bad benchmark scores and nvidia G80 design was unified dx10 despite what the pundits thought at the time), but saved them long enough to be bought by AMD...
-Anon
Good? (Score:4, Insightful)
Re:Good? (Score:5, Interesting)
TSMC isn't the only fabber.
Rumor is that AMD and ARM may team up. But this means they might be thinking of an ARM/ATI combo chip. Which would be verrrrry interesting. But it would leave AMD's x86 department out in the cold for the future of computing.
It's also a clue as to why AMD dumped the marcom hacks: these are the people who are supposed to tell the bigwigs what the Next Big Thing is going to be, and they have consistently been 1-2 years behind the curve.
The only place AMD has been approaching the bleeding edge is in graphics, where the ATI engineers are merely advancing their skillz as fast as they can. No need to guess where their market is going, since there's always a call for more cores and more clock.
Re: (Score:2)
What is different about the CPU market? More cores, more clock, less power consumption, price (you forgot the latter two). OK, there was the change to 64 bits, and the doubt about ARM vs. x86*, but if they had an entire team just to figure those out, they were really throwing money away.
* ARM vs. x86 is easy. Which platform will be better in cores, clock, power consumption and price? Each will win the markets th
Re: (Score:2)
AMD and its graphics subdivision also rely on GlobalFoundries, which used to be AMD's production arm.
If anything, in the long run, the ARM business might help AMD and nVIDIA even indirectly, a big market for another high end foundry company gives them more potential suppliers.
But I agree, marketing handles itself if you have a good enough product right now. The 6000 series is good, but not revolutionary, but I suppose one could say exactly the same thing about the 500 series from nVIDIA. I wonder if t
Re: (Score:2)
Competitive products aren't the issue. Well, right now they are, but they do have issues with name recognition. PC vendors are not going to integrate AMD products if there isn't demand for them. And I've been shocked that AMD still doesn't bother with advertising the way that Intel does.
There have been periods where AMD chips were better than Intel chips, and yet that hasn't ever been reflected in market share. Right now, the move is the correct one, cut marketing and focus on developing better products, b
Re: (Score:2)
There have been periods where AMD chips were better than Intel chips, and yet that hasn't ever been reflected in market share.
It's kind of hard to massively ramp up market share in a short time when that requires spending billions of dollars on building new fabs to churn out those new chips. If AMD had built new fabs when the Athlon 64 proved to be significantly better than the space-heater P4s, they'd probably have finished them sometime after Intel released the Core line and blew them away.
Re:Good? (Score:5, Informative)
AMD did start building fabs when the Athlon64 and Opteron were kicking ass all over, and when their projections of market share showed that they would be fab limited -- which for a while, they were.
The problem is that when they opened up the flood gates on their production capacity, the market share didn't follow. It bumped slightly, but not nearly enough to justify the massive investment in the fabs, wrecking their financials and ultimately forcing them to spin off the fabs as Global Foundries. This is due to the backroom deals Intel had with OEMs limiting the amount of AMD parts they could sell.
This is the essence of AMD's lawsuit against Intel and the anti-trust rulings by Japan, North Korea, and the EU.
Re: (Score:2)
...anti-trust rulings by Japan, South Korea and the EU, I presume.
Re: (Score:1)
Re: (Score:2)
AMD should have bought a computer manufacturer, or a brand. AMD can't compete against Intel by trying to get into Dells etc. They can compete with Intel by making their own desktop and server lines, targeting them well and making decent profits by cutting out the middleman. The profits would allow them to continue to invest in their engineering.
AMD needs a strategy that allows them to make money off a small market share without sacrificing their ability to invest in their engineering.
Re: (Score:1)
I would mod you informative for your post, and funny for north korea.
If I had the points.
Re: (Score:2)
Your pretty accurate reference to Hector's role aside, that description of events sounds suspect. AMD bought ATI in 2006, when the housing bubble hadn't even burst yet and the financial crisis was way over the horizon. The lack of value of the ATI purchase was factored into the AMD share price long before the market crash -- stock being the main thing they bought ATI with, not cash. There's another thing, too.
You don't pay for a fab with cash, either; they're just too expensive to build and update. It's
Re: (Score:2)
Sadly for AMD, I feel their time has come and gone. For a number of years they were well ahead of Intel, but we all know why they could never press home their dominance. A billion-euro fine, despite being a record, was chump change for Intel, and in reality they profited greatly by shutting AMD out of the market.
Re: (Score:1)
Some nights I miss my old K133 and its solid decoding of 128kbps mp3s.
Re: (Score:2)
AMD still doesn't have the market share in the segments where Intel should really be getting their ass handed to them. Why would any manufacturer pick inferior Intel for a netbook or notebook? Makes you wonder.
Re: (Score:3)
Ultimately, they need both, but right now they need engineering far more than they need marketing. Intel has a significant advantage in that they sell far more chips and can afford to spend more money on bribes and development
Re: (Score:2)
Considering how much higher quality Intel chips have been the last two years, they don't even need to bribe anyone.
Re: (Score:2)
That's true.
The unfortunate truth for Intel, though, is that their chips have historically been fairly overpriced in contrast with comparable offerings from AMD. Of course, AMD is beaten into a pulp by Intel's high end offerings and can't even compete in that market segment, but I can't think of anyone much who'd fork out $1,000+ for a desktop processor unless they had a business-related reason
Re: (Score:2)
I expect Intel to continue dropping the price on their low- to medium-end offerings in order to compete, but I also don't expect to see them drop very far since 1) the low end has tighter profit margins and 2) Intel has volume (in terms of production capabilities) plus market share in their favor--don't expect that combination to allow for much generosity on their behalf.
On the other hand, now that Intel has AMD up against the ropes, Intel might be content to put the squeeze on them, and take a hit (or just break even) on the low-end chips to push them even further out. Intel is certainly pulling in more than enough cash to buffer a bit of a hit in that market.
Re: (Score:2)
Intel needs AMD, otherwise it faces even more antitrust suits and a closer eye from the DoJ and whatever agency in Europe does the same thing. I'd expect that Intel won't make the mistake of squeezing AMD completely out of the market.
Intel definitely could do it, but they'd end up being broken up if they managed to succeed. There's only so much anticompetitive activity that the DoJ can turn a blind eye to.
Re: (Score:2)
They don't need to right now, because they successfully drained AMD of customers long enough for AMD to have funding issues with their R&D. The incompetence of overpaying for ATI didn't help, but the consequences of Intel's anticompetitive activities are going to take years for AMD to overcome.
Vital? (Score:5, Insightful)
These two departments may not design products, but they create and maintain vital lines of communication between the company, its customers, and the press."
Better to cut marketing and the "vital" line of communication to the press, than to cut product development and not have a new product next quarter... because then having lines of communication to the press won't seem so vitally important anymore.
Still it sucks for anyone to lose their jobs.
Re: (Score:2)
It can't be easy to determine where the cuts are going to be made if you've decided to not just do an even across-the-board cut. Most divisions within any sizable company could have good arguments made for not selecting them for cuts.
Unless your company has a "wing" ripe for picking anyway. Something that can be cut out all at once like a tumor without much effect to the rest of the company. You just have to hope you have quality bean-counters working closely with the company directors/visionaries to det
Re: (Score:3, Insightful)
It is easy: you cut the entire company's management wages down to the lowest engineering wage, with no bonuses. That includes the stockholders, CEOs and other "high positions".
It wouldn't surprise me one bit if that earned them a really nice surplus of cash, which in turn could be used to fund massive amounts of R&D.
Of course, no corporation these days wants to sit down and do what needs to be done.
Re: (Score:1)
If that's the way you want it to work, then move to North Korea.
Re: (Score:2)
You cut the entire company's management wages down to the lowest engineering wage, with no bonuses. That includes the stockholders, CEOs and other "high positions".
Or maybe if they cut the engineering wages to that of the lowest janitor's wage, with no bonuses. What do you think that would accomplish?
A big fat surplus of cash? Of course not. All the engineers would find new jobs and quit.
What makes you think management would be any different?
I agree executive compensation is way out of whack, but you can't just c
Re:Vital? yes Vital (Score:2)
AMD is a Fabless chip company now. That means they design chips. They are behind in performance on the x86 side, are about to be behind in low power when Intel uses FinFETs (sorry Tri-Gate). The last thing they need to cut is their core design business - it's what the company
Re: (Score:2)
Hi, a member of the press here (proper press, not a blogger);
So the issue is that the NPRP was responsible for providing support for us when reviewing products at launch. This meant tracking down bugs, letting us know about internally known issues, and getting drivers issued for important bug fixes. The fact of the matter is that pre-launch hardware (particularly for a new architecture) is practically beta testing, and we're the beta testers. The risk AMD takes by not having a well staffed NPRP is that if w
Re: (Score:1)
NPRP is the last line of defense? Cool. Glad they didn't cut the first line of defense, which is the engineer team that is responsible for creating a superior product in the first place.
Re: (Score:1)
that's a stigma that would stick to a product for its entire life
I think that depends entirely on the consumer. Personally, I go for the best (current) performance for the lowest cost. I realize that getting your CPU into a larger share of mainstream computers from Dell etc. may be affected by general perception, but AMD is already losing on that front. Intel has a much larger desktop market share and, anecdotal as it is, most people I talk to favor Intel for seemingly emotional reasons. I believe that the perception of the "enthusiasts", however, is ultimately what sways
Re: (Score:2)
Or to put this another way, the NPRP was the last line of defense against a bad review.
Not having a new product to review in the first place is far worse.
Marketing of tech is almost free. (Score:3, Insightful)
Seriously, with 14 bazillion bloggers fighting to get clicks to their webpages, all you need is one guy with a copy of the datasheet and a twitter account, and you'll have your part's nomenclature showing up on every RSS feed in the world within minutes if not days. And, if you're lucky (or just know where to put the typos), you can get /. to send your favorite blogger enough clicks to buy an iPhone.
Re: (Score:3)
Yes, but those aren't the people that AMD needs to be reaching. To people that know little about computers, Intel is a name brand that has been associated with quality. The problem is that it isn't always true, there are periods where Intel is doing really good work and there's periods where AMD chips are better, but you don't really ever see that in the market share, in large part because for the most part you have to build your own computer if you want AMD parts.
Not quite so much now that AMD does GPUs as
Re: (Score:2)
Okay, so in addition to your twitter guy, hire a guy to call OEMs and say "use our chips and we'll give you free stickers."
And a guy to negotiate for a NASCAR team. Because, fuck, man, this is Amurrca.
Fact is, if AMD had ONE THOUSAND FOUR HUNDRED people doing that job, they were wasting about $140 million a year on dead wood.
Re: (Score:2)
It's not a fact. That's how marketing works, it doesn't matter how incredibly brilliant and affordable your product is if nobody knows it exists. And for most people, AMD products just aren't available when they go to the store. And they don't know about them because there's no marketing and they often aren't carried by the stores.
Re: (Score:2)
Re: (Score:1)
because there's no marketing and they often aren't carried by the stores.
So, just WTF were the 1,400 marketers doing then? Apparently not their job. Maybe that's why they were purged.
Re: (Score:2)
The problem is that it isn't always true, there are periods where Intel is doing really good work and there's periods where AMD chips are better, but you don't really ever see that in the market share, in large part because for the most part you have to build your own computer if you want AMD parts.
Well that, and you have the problem that it's not that easy to ramp up or ramp down CPU production. Building fab capacity is started years in advance so by the time AMD actually brings a processor to market they got a fairly narrow percentage of the market they can supply. Good chips mean high prices, poor chips means low prices but they can't take that much market share in one generation. And if you overextend yourself you risk that Intel pulls a very good processor out of the hat and you're left with way
Re: (Score:2, Insightful)
Seriously, with 14 bazillion bloggers fighting to get clicks to their webpages, all you need is one guy with a copy of the datasheet and a twitter account.
If this were true why is Linux clinging by its fingertips to a bare 1% market share?
You need people who can negotiate OEM system installs, retail placement and sales promotions. Your bazillion bloggers aren't as useful as the one man or woman who knows how to cut the right deal with Walmart.
Re: (Score:1)
Oh shit son, did they just unthaw you?
Linux doesn't have 1% marketshare of anything. It has closer to 10% desktop marketshare and dominates the mobile, server and embedded spaces.
Re: (Score:2)
Oh shit son, did they just unthaw you?
Linux doesn't have 1% marketshare of anything. It has closer to 10% desktop marketshare and dominates the mobile, server and embedded spaces.
Recent surveys by respectable groups like Gartner put Linux at 1.1% share on the desktop.
http://en.wikipedia.org/wiki/Usage_share_of_operating_systems [wikipedia.org]
http://marketshare.hitslink.com/report.aspx?qprid=9&qpcustom=Linux [hitslink.com]
Linux is still hovering around 20% in the server market, hardly what I'd call dominating. Sure, Android is based on a Linux kernel, but it's a stretch to call it Linux. I will believe you on embedded systems though.
Re: (Score:1)
Re: (Score:2)
A simple CPU performance comparison chart would work [cpubenchmark.net]
But how can anyone tell whether one CPU is going to be fast or slow for their purposes? There's all that hyperthreading, cache size, memory size to consider as well.
You'd really need to know how many MHz and Megabytes a particular application requires in order to run smoothly.
My parents have a laptop (1.5 GHz) that is so slow on starting up, that they just basically keep it on all the time, and don't bother shutting down the applications (E-mail, spreadshe
And... (Score:1)
Nothing of value was lost.
Does this remind any one of... (Score:5, Insightful)
Office Space?
These two departments may not design products, but they create and maintain vital lines of communication between the company, its customers, and the press.
Bob Slydell: What would you say ya do here?
Tom Smykowski: Well look, I already told you! I deal with the goddamn customers so the engineers don't have to! I have people skills! I am good at dealing with people! Can't you understand that? What the hell is wrong with you people?
Amazing (Score:5, Insightful)
Here it is, 2011, when CEOs live and die by 10-Ks and stock prices, and we have a company that laid off marketing and PR and kept their engineers. How much AMD stock can I buy? Sign me up!
Re: (Score:2)
Absolutely. This is a "bold" (i.e., different) move. I've always liked AMD as a company, as their business decisions have always at least been long-game driven. "It hurts now, but two, three quarters from now, we'll like it".
I was starting to think AMD was going to fall way, way behind. Now, I think they're going to pull ahead of this one. (Hell, it took Nvidia one major revision in their cards to get from "poor performance and high power use" to "the head of the pack by a bit". From what I understand, Bulldo
Re: (Score:1)
In most cases, losses to Marketing are not a big deal; hell, Berkshire Hathaway could probably cut half their advertising budget and you'd see only slightly fewer Geico and Dairy Queen commercials, but you'd still know they sell insurance and ice cream.
AMD sells CPUs and GPUs. I know they sell other stuff, but nobody ever talks about the other stuff. I think that tells you all you need to know.
From an engineering POV, the only reason I buy the Radeons but not the AMD CPUs is because the CPUs tend t
Re:Amazing (Score:5, Informative)
From what I understand, Bulldozer isn't designed poorly - the implementation is just lacking. Sounds to me like they pushed a beta product out for quarterly product presence, but the real product isn't far behind...
Actually, a huge part of Bulldozer's problem is marketing lies. The architecture is very interesting - it's based on a "module" made of an instruction fetcher/decoder, two integer cores, a floating-point core, and two levels of cache. The effect is comparable to Intel's Hyper-Threading, even if the implementation is different. A four-module Bulldozer chip is comparable to a hyper-threaded quad-core Intel chip - it can ALWAYS run four threads at once, and can theoretically reach eight.
The problem is, AMD didn't market it that way. They market their four-module chips as 8-core, and their two-module chips as quad-core. Which isn't, technically, lying - they do have that many integer cores - but that marketing caused problems when benchmarks came out. People saw "AMD 8-core chip beaten by Intel 4-core chip" and thought "man, those cores must suck BALLS. And since even I know that a lot of programs are still single-threaded, it really makes no sense for me to buy an AMD chip right now".
It's almost justice, seeing the marketers fired for this. They stretched the truth beyond what the public would believe, and it bit them in the ass.
The other problem with Bulldozer is pricing - Bulldozer chips, at least right now, are ~$30 more expensive than the comparable Sandy Bridge processor. Sure, you'll quite likely save twice that if you're upgrading, since Bulldozer is mostly compatible with older motherboards while Intel is still thrashing sockets, but that's not going to be the case for everyone.
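The module arithmetic in the parent can be written out as a small counting aid. The numbers (two integer cores and one shared FP unit per module, four threads always, eight at best on a four-module chip) come from the comment above; the function name and return shape are made up for illustration.

```python
# Counting sketch of a Bulldozer-style "module" (per the parent comment):
# two integer cores and one shared floating-point unit per module.

def bulldozer_threads(modules):
    integer_cores = 2 * modules  # what AMD's marketing counted as "cores"
    fp_units = modules           # one shared FPU per module
    guaranteed = modules         # threads that never contend for anything
    peak = integer_cores         # best case: all threads integer-heavy
    return integer_cores, fp_units, guaranteed, peak

# A four-module chip, marketed as "8-core":
print(bulldozer_threads(4))  # (8, 4, 4, 8)
```

So the "8-core" label is defensible on the integer-core count, but the guaranteed-vs-peak gap is exactly why the Hyper-Threading comparison in the parent is the more honest framing.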
Re: (Score:2)
Bulldozer is a bad design, and AMD's current roadmap (10-15% improvement per year) doesn't look very reassuring. Still, I'd like to see them get their shit back together, b
Re: (Score:2)
AMD has another problem: its own Phenom line. For all intents and purposes, an older and dirt cheap Phenom II X4 will more than please 99% of the population. For the remaining 1%, performance is an important criterion, at which point they'll probably go for a top-of-the-line Sandy Bridge or Ivy Bridge processor since they're rebuilding every three years anyways, negating the upgrade discounts on AMD's platform.
That AMD is refocusing its strategy towards APUs and possibly teaming up with ARM should be a tell
Re: (Score:1)
Bulldozer is mostly compatible with older motherboards
I think you've got that backwards. The *older* chips (Phenom/Athlon II) are mostly compatible with *newer* motherboards.
Re: (Score:2)
Goes both ways:
"Some manufacturers have announced that some of their AM3 motherboards will support AM3+ CPUs, after a simple BIOS upgrade. Mechanical compatibility has been confirmed and it's possible AM3+ CPUs will work in AM3 boards, provided they can supply enough peak current. Another issue might be the use of the sideband temperature sensor interface for reading the temperature from the CPU. Also, certain power-saving features may not work, due to lack of support for rapid VCore switching. Note that us
Re: (Score:2)
This has turned out to be rarely true in practice. Only about 3 models each of MSI and Asus 800-series motherboards can run a Bulldozer CPU with a BIOS update, and NO Gigabyte 800-series boards will, except for the very last hardware revisions of about 8 of their models.
The motherboard OEMs did a much better job of supporting new CPUs on their existing boards for the previous AM2 - AM2+ - AM3 transitions.
Re: (Score:2)
Except if they called it a quad core, then it'd be the world's biggest and most power hungry quad core. It's got the transistor count and power consumption of an octo core, without actually delivering that performance. I'm not so sure marketing it the other way would have looked any better.
Re: (Score:1)
Small correction (Score:2)
Re: (Score:2)
So, for gaming, why would I want one of these? That's still the question on most people's minds.
Most of the remainder of geeks think: but Bulldozer uses a gobton more power than the performance-equivalent Sandy Bridge chip (and it costs more). My existing AMD system out-performs it for single and double core workloads. I don't want one.
The common consumer buys what's almost cheapest, usually, thinking it's the value proposition. Sometimes that's a good deal, and sometimes it'll be AMD. Or, at least, that's t
Re: (Score:2)
From what I understand, Bulldozer isn't designed poorly - the implementation is just lacking. Sounds to me like they pushed a beta product out for quarterly product presence, but the real product isn't far behind...
I don't know jack about this, so I'll just quote what an acquaintance of mine wrote on my gaming community forum:
The main issue is that the Windows 7 scheduler doesn't understand the effective use of the modules, which drastically cuts down on the ability of the processor to turn off cores and run in turbo mode.
The bottom line in the end unfortunately will be that most desktop workloads still will not take advantage of the Bulldozer architecture. It'll fare much better in the server world, but it's going to be a while before desktop software truly shifts to the style of programming Bulldozer requires. A long while. Probably a lot longer than 5+ years...
[I ask a stupid question]
...Intel also disables cores. The idle cores are turned off to reduce the thermal footprint while running the active cores at higher clock speeds. Bulldozer's method is a little more complicated than Intel's though, and Windows 7 doesn't understand how to deal with it. For example, for two integer-heavy threads with shared data, it should schedule them to a single module and throw the turbo on. Two integer-heavy threads with non-shared data? Two modules. Two floating-point threads? Two modules. There are a lot of conditionals about the workload, based on the new longer pipeline and the dual integer units, but only a single FP unit.
Piledriver will improve the FP a lot since it'll have the GPU on die, which is what AMD is really aiming to do. They don't really want FP units at all in the module, they want to push that work to GPU-style modules. Which makes sense, as they're a hundred times better at it. But programs aren't written to take advantage of that yet.
I have no idea if he's correct, but that's the extent of my understanding.
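For what it's worth, the heuristic described above boils down to a small decision table. Here's a rough sketch in Python — the function name, the thread "kind" labels, and the rules themselves are just my paraphrase of that comment, not actual Windows scheduler logic or any AMD API:

```python
# Hypothetical sketch of the Bulldozer module-placement heuristic
# described above. Each module has two integer cores but only one
# FP unit, so FP-heavy threads shouldn't share a module.

def place_threads(kind_a, kind_b, shared_data):
    """Decide placement for two threads.

    kind_a, kind_b: "int" or "fp" (dominant workload of each thread)
    shared_data: whether the two threads work on shared data

    Returns (placement, turbo) where placement is "same_module" or
    "separate_modules" and turbo is whether to boost the active module.
    """
    if kind_a == "int" and kind_b == "int":
        if shared_data:
            # Shared data: co-locate on one module so they share cache,
            # leave the other module idle, and turbo the active one.
            return ("same_module", True)
        # Independent integer work: spread across modules so each
        # thread gets a full module's resources.
        return ("separate_modules", False)
    # At least one FP-heavy thread: only one FP unit per module,
    # so never pack two such threads together.
    return ("separate_modules", False)
```

Whether Windows 7 could even express this (it just saw eight identical logical cores) is exactly the problem the comment is pointing at.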
Re: (Score:2)
Re: (Score:2)
I'm not sure what's so funny about this post; it's clear they understand that a company which depends on creating new products can't cut back on engineering indefinitely.
Re: (Score:2)
Except for the repetitive redundant part where you keep saying "engineers, engineers, engineers", lots of companies seem to not understand that at all.
Re: (Score:2)
In other words, marketing is necessary because marketing exists. In the same manner, we all hate lying salesmen, but for the sa
Oh yeah? (Score:2)
None of the staff had any idea that the cuts were coming, or that they'd focus so particularly in certain areas.
And this is precisely why they were fired. I mean duh, this is not news that marketing is among the first areas to be axed in a dying company. There's quite a bit of precedent in the business world. If those employees didn't even know this, and had no situational awareness as to how their brands were doing, I can just imagine how they were handling their day to day work.
Re: (Score:3)
Actually, a lot of companies are hitting R&D the heaviest, since they require lots of space, supplies, and equipment (overhead) and are often the highest-paid non-management/attorney positions. Usually a terrible, terrible plan for any company as a whole, since THAT'S HOW A COMPANY STAYS COMPETITIVE, but it keeps the stock healthy for a long-enough period that management can cash out and then get the fuck out of Dodge before the house of cards comes tumbling down. For a thrill, keep an eye on the big
good product + marketing = sales (Score:1)
take a look at Apple... marketing wizards. you may love their products or hate them, but their growth and sales speak for themselves. this may not end up being such a good move for AMD in the long run.
Re: (Score:2)
The problem is vastly different, as many posters here are pointing out: AMD has (had?) a lousy marketing department.
The illustration you're making is apples (heh) to oranges. Apple has an extremely strong and talented marketing wing--so much so that the actual real world quality of their products almost doe
As a result (Score:2)
forthcoming chips from AMD will just have numbers instead of names
This is a good thing for AMD. (Score:1)
Re: (Score:1)
Advertisers and marketing people might be the lowest form of life too.
I'm pretty sure that title goes to IP droids as far as corporate lifeforms are concerned. At least marketers and advertisers create something of value (usually).
Re: (Score:1)
In all fairness, that's a pretty low bar. Nothing to brag about if you say at least you're better than patent trolls.
Re: (Score:2)
Advertisers and marketing people might be the lowest form of life too.
Eh, I am sitting here uncomfortably as my wife is head of market research (not marketing) at a Fortune 500, and my sister in law is regional VP of a large media/advertising corporation...
Re: (Score:2)
I don't know how long it's been since they've done this, but when I built my very first system more than ten years ago, I put a 1.4 GHz Athlon Thunderbird in the system, and it came with a 1"x1" (or whatever) AMD sticker to put on the case where the "Intel Inside" stickers always went.
Unfortunately for AMD, my case also came with an "Antec Outside" case, which suited my sense of humor much better.
Re: (Score:1)
AMD seriously needs to jump on some Taiwanese necks and get the fab stuff fixed pronto.
Most of AMD's chips are fabbed by GlobalFoundries. GF is Chartered Semiconductor merged with AMD's former fab operations (spun off to raise cash when AMD was circling the drain a couple years ago). GlobalFoundries doesn't have anything in Taiwan. Try Germany, Singapore, and upstate New York.