ATI and AMD Seek Approval for Merger?
Krugerlive writes "The rumor of an ATI/AMD merger/buyout has been out for some time now. However, this morning an Inquirer article said that a merger deal has been struck and that the two companies will seek shareholder approval on Monday of next week. In the market, AMD is down as a result of lackluster earnings announced last evening, and ATI is up about 4% on unusually high volume." This is nothing but a rumour at the moment, a point that C|Net makes in examining the issue. From the article: "AMD has always boasted that it only wants to make processors, leaving networking and chipsets to others. AMD does produce some chipsets, but mostly just to get the market started. Neutrality has helped the company garner strong allies."
Why ATI... Go NVidia (Score:4, Insightful)
AMD + ATi vs. Intel + nVidia (Score:5, Insightful)
And if AMD and ATi merge, it sort of seems like a punch in the face to nVidia, leaving them wanting to talk to Intel. Leading to... what?
For a long time there have been two beasts in the CPU market and two beauties in the GPU market: AMD and Intel in CPUs, and ATi and nVidia in GPUs. If they marry respectively, the offspring might have the good qualities of neither and the bad qualities of both. I think the consumer would more than likely lose out overall.
So, I really kind of hope this is just a rumor.
TLF
Re:Does that mean.... (Score:2, Insightful)
Conflict - nForce? (Score:3, Insightful)
I also worry that chipsets for AMD-based motherboards will not work so well with my nVidia video card. Not an ATI fan at all.
I'm going to be watching these guys very closely. This would sway me away from AMD.
Poor Choice For AMD (Score:3, Insightful)
The Xbox 360 is the first console ever to have PCs outperform it before the console has hit store shelves. In the past, consoles have had at least a year or so before PCs could touch them.
What the hell is AMD thinking?
AMD needs to come up with its own bogus SPEC score generating compiler to grow in the market, not a fucked up GPU maker.
Re:Why ATI... Go NVidia (Score:5, Insightful)
If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware.
If the rumour is true, I hope AMD cares about open drivers.
Huge Opportunity for Free Software Drivers (Score:1, Insightful)
completely agree (Score:4, Insightful)
Depends. (Score:5, Insightful)
You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume. There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up. That would be expensive and earn them little. In fact, given how much they've made from their component-neutrality, sacrificing that might mean they'd actually lose money overall.
On the other hand, CPUs are high volume, high profit, and AMD is gaining market-share. It is an ideal target for a buy-out, particularly as ATi can't be doing that well in the GPU market. Buying AMD would be like buying a money-printing-machine, as far as ATi were concerned. Better still, AMD is a key player in bus specifications such as HyperTransport, which means that if ATi owned AMD, ATi could heavily influence the busses to suit graphics in general and their chips in particular.
(Mergers are never equal, as you don't have two CEOs, two CFOs, etc. One of them will be ultimately in charge of the other.)
If the rumour is correct, then don't assume AMD is the one instigating things - they have the most to lose and the least to gain - and don't assume either of them will be around when the mergers and buyouts finish.
So let me get this straight (Score:2, Insightful)
I think that's the consensus on here; certainly the Linux drivers are apparently awful.
My AMD64 desktop machine has an NVidia graphics card which works much better than the ATI rubbish built into the motherboard. But I'm not using that machine to write this. In fact, other than for occasional gaming, that machine rarely gets switched on.
I tend to use my laptop. Which has a Centrino chipset.
You know - the one that Intel brought out for laptops? The one that's hugely, massively successful in one of the main growth areas of hardware sales? Everyone wants a laptop... or a home media centre that's based on a PC but doesn't run like one... Everyone is buying Intel. Why? Because to all intents and purposes all the laptops come with Intel Centrino sets. It's dead easy - they're dead easy to support, all the bits work together, no conflicts. AMD? Sure, nice chips, but who makes Turion laptops? Acer... Asus... and... um... some other companies... Perhaps Alienware? HP make a couple, Fujitsu Siemens make a couple, but these aren't their high-end desirable laptops. It's like "well, if I spend money I get a Centrino, otherwise it's a toss-up between Celeron - the cacheless wonder - and a chip that sounds like a sticky nut treat..."
Who makes Centrino laptops? Dell, Sony, Toshiba, Fujitsu Siemens, Samsung, Panasonic, whatever IBM are calling themselves now - oh and Acer and Asus and Alienware too but - oh yes, and one really important company who basically stuck 2 fingers up to AMD - Apple. I'll bet Apple choosing Intel hurt. But everyone's buying laptops with Centrino chipsets in... No-one's really buying AMD... because AMD don't provide a chipset and an easy way for manufacturers to just kind of put their machines together in a lego-style fashion.
Does it make business sense for AMD to tie up with the chipset and motherboard manufacturer that also happens to make graphics cards? Hell yes. Does it make sense for AMD to try to get into the laptop market in a meaningful way? Probably. Will their driver support get any better? We can hope...
Re:completely agree (Score:5, Insightful)
Sigh. This detrimentally short-sighted acceptance of binary-only drivers that users like you have is precisely why there are no good drivers for recent ATi hardware, or for most recent graphics hardware besides Intel's. And until users like yourself start demanding that vendors provide documentation, not binary blobs, graphics support will continue to suck.
Binary drivers kill kittens (thanks airlied for that one). They don't help if you run other free Unixen, they don't help if you use a non-mainstream platform (e.g. PPC, or AMD64 until recently), and they don't help the Radeon in the Alpha I have here.
Demand DOCUMENTATION - even if it's gibberish to you personally, it will benefit you far more than binary blobs eventually...
Re:Why ATI... Go NVidia (Score:3, Insightful)
Re:Depends. (Score:5, Insightful)
Once, CPUs didn't do vector computations. They were either converted to scalar operations, or performed on a dedicated (expensive) coprocessor. Now, lots of CPUs have vector units.
Once, CPUs didn't do stream processing. Now, a few CPUs (mainly in the embedded space) have on-die stream processors.
A GPU is not much more than an n-way superscalar streaming vector processor. I wouldn't be surprised if AMD wants to create almost-general coprocessors with similar characteristics that connect to the same HT bus as the CPU; plug them directly into a CPU slot and perform all of the graphics operations there. Relegate the graphics hardware to, once more, being little more than a frame buffer. This would be popular in HPC circles, since it would be a general purpose streaming vector processor with an OpenGL / DirectX implementation running on it, rather than a graphics processor that you could shoehorn general purpose tasks onto. The next step would be to put the core on the same die as the CPU cores.
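The "streaming vector processor" framing above can be made concrete with a minimal sketch (purely illustrative, not AMD's actual design): a vertex transform is just the same small vector operation applied independently to every element of a stream, which is exactly the shape of work a wide coprocessor would eat up.

```python
# Illustrative sketch (hypothetical, not AMD's design): the workload the
# comment describes - a stream of vertices all pushed through one vector
# operation. A real stream/vector unit would run these lanes in parallel;
# plain Python just makes the data flow explicit.

def transform_vertex(matrix, vertex):
    """Multiply one homogeneous (x, y, z, w) vertex by a 4x4 matrix."""
    return tuple(sum(matrix[row][col] * vertex[col] for col in range(4))
                 for row in range(4))

def transform_stream(matrix, vertices):
    """Apply the same transform to every vertex in the stream.

    The per-element independence here is what makes the job a good fit
    for a vector coprocessor hanging off a fast bus like HyperTransport.
    """
    return [transform_vertex(matrix, v) for v in vertices]

# A uniform translation by (1, 2, 3), applied to the whole stream.
translate = [
    [1.0, 0.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 2.0],
    [0.0, 0.0, 1.0, 3.0],
    [0.0, 0.0, 0.0, 1.0],
]
stream = [(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)]
out = transform_stream(translate, stream)
```

Nothing in the loop depends on a neighbouring vertex, which is why the same code maps equally well onto SSE lanes, a GPU, or the hypothetical HT coprocessor.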
The CPU industry knows that they can keep doubling the number of transistors on a die every 18 months for 10-15 years. They think they can do it for even longer than this. They also know that in a much smaller amount of time, they are going to run out of sensible things to do with those transistors. Is a 128-core x86 CPU really useful? Not to many people. There are still problems that could use that much processing power, but most of them benefit more from specialised silicon.
Within the next decade, I think we will start to see a shift towards heterogeneous cores. The Cell is the first step along this path.
Talking out the ass? (Score:2, Insightful)
No shit, Sherlock? Lemme see:
- Dreamcast: had a PowerVR graphics chip that had been available for the PC too for a year or two, and not even the most powerful at that. It was the predecessor of the Kyro and generally a flop in the PC market. In the Dreamcast it had a whole 8 MB of video RAM too, at a time when PC graphics cards were moving to 32 MB.
- XBox: basically had a predecessor of the nForce chipset, with integrated graphics. Look at some PC benchmarks for how much those suck. Hint: having half the bus width, half the memory speed, _and_ having to share that choked bandwidth with the CPU, doesn't exactly help with rendering speed.
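The bandwidth point is simple arithmetic. As a back-of-the-envelope sketch with made-up round numbers (these are not actual XBox or PC specs), peak memory bandwidth scales with bus width times clock, and integrated graphics then loses a further share of it to the CPU:

```python
# Hypothetical round numbers, purely to show how the three handicaps
# (half the width, half the clock, shared with the CPU) compound.

def peak_bandwidth_gbps(bus_width_bits, clock_mhz):
    """Peak bandwidth in GB/s: (bits / 8) bytes per transfer * transfers/s."""
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

pc_card = peak_bandwidth_gbps(256, 500)   # discrete card: wide, fast bus
console = peak_bandwidth_gbps(128, 250)   # half the width, half the clock
console_gpu_share = console * 0.5         # and the CPU takes its cut
```

Halving width and clock already costs a factor of four, and splitting the remainder with the CPU makes it eight, which is the compounding effect the comment is pointing at.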
- PS2: read some developer complaints from back then. It didn't have even half the fill rate or triangle processing rate that Sony had claimed. Trying to even replicate Sony's rigged demos was a failure as soon as you had more than one character on the screen or an even moderately complex background. It took a lot of low-level work to get it to run fast enough, while on a PC even a mid-range card never needed such tricks to do its job. And even then, there's a reason the vast majority of PS2 games never had more than a handful of characters on the screen at the same time.
Get this, Sherlock: what saved all 3 was that they just didn't have to render in higher res than 640x480. _That_ was their only saving grace.
And it was a saving grace in more ways than the number of pixels rendered, too. Rendering in low res makes it OK to use lower resolution textures too (hence it needs less memory bandwidth and uses the cache better), _and_ lets you get away with lower polygon counts. If you use the exact same models, the same triangle may be something like 8x8 pixels in a console game, but 16x16 on a PC at 1280x1024. The same model may look horribly polygonal in a PC game at 1280x1024, but decently rounded in a console game at 640x480. So PC games had to compensate by using higher polycounts, and PC graphics cards had to be able to process those extra polygons.
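The triangle-size argument above is just area scaling: the same model fills the same fraction of the screen, so a triangle's pixel footprint grows with the resolution in each axis. A quick sketch of the arithmetic (the 8x8 figure is the comment's own example):

```python
# Pixel footprint of the same on-screen triangle at different resolutions.
# A triangle covering an 8x8-pixel square at 640x480 grows linearly with
# resolution in each axis, so its area grows with their product.

def triangle_pixels(edge_at_640x480, width, height):
    scale_x = width / 640    # linear scale relative to the console res
    scale_y = height / 480
    return (edge_at_640x480 * scale_x) * (edge_at_640x480 * scale_y)

console = triangle_pixels(8, 640, 480)    # the 8x8 triangle: 64 pixels
pc = triangle_pixels(8, 1280, 1024)       # same triangle at PC resolution
```

At 1280x1024 the same triangle covers over four times as many pixels (roughly the 16x16 the comment mentions), which is why PC cards needed both more fill rate and higher polycounts to keep models looking smooth.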
So in a nutshell, oh please... pretending that any console from the last decade was actually faster than a high-end PC is just plain old false.