ATI and AMD Seek Approval for Merger?

Krugerlive writes "The rumor of an ATI/AMD merger/buyout has been out now for some time. However, this morning an Inquirer article said that a merger deal has been struck and that the two companies will seek shareholder approval on Monday of next week. In the market, AMD is down as a result of lackluster earnings announced last evening, and ATI is up about 4% on unusually high volume." This is nothing but a rumour at the moment, a point that C|Net makes in examining the issue. From the article: "AMD has always boasted that it only wants to make processors, leaving networking and chipsets to others. AMD does produce some chipsets, but mostly just to get the market started. Neutrality has helped the company garner strong allies."
  • by jhfry ( 829244 ) on Friday July 21, 2006 @06:41PM (#15760593)
    I always thought that AMD and Nvidia were the better combo. Besides, the ATI drivers suck for Linux, which is where a large chunk of the enthusiast market's interest lies. Isn't AMD still more of an enthusiast's processor until it can get into one of the top vendors' machines?

  • by The Living Fractal ( 162153 ) <banantarr@hot m a i l.com> on Friday July 21, 2006 @06:42PM (#15760602) Homepage
    As much as I like AMD, I have to say that if Intel and nVidia teamed up they would probably beat the crap out of AMD + ATi.

    And if AMD and ATi merge... it sort of seems like a punch in the face to nVidia, leaving them wanting to talk to Intel. Leading to... what?

    For a long time there have been two beasts in the CPU market and two beauties in the GPU market: AMD and Intel in CPUs, and ATi and nVidia in GPUs. If they marry respectively, the offspring might have the good qualities of neither and the bad qualities of both. I think the consumer would more than likely lose out overall.

    So, I really kind of hope this is just a rumor.

    TLF
  • by jhfry ( 829244 ) on Friday July 21, 2006 @06:44PM (#15760616)
    I think NVidia needs to get into the processor market themselves. Maybe not for general computing, but I bet their designers have some great ideas for a processor that would be at home in a console! With GPUs being so powerful these days, I can't imagine that they lack the expertise to do it.
  • Conflict - nForce? (Score:3, Insightful)

    by Coplan ( 13643 ) on Friday July 21, 2006 @06:46PM (#15760631) Homepage Journal
    I'm a big AMD fan. But I'd be really upset to lose the nForce line of chipsets. In my opinion, it's a must for any AMD user. And I think it would be very difficult to come up with a good replacement.

    I also worry that chipsets for AMD-based motherboards will not work so well with my nVidia video card. Not an ATI fan at all.

    I'm going to be watching these guys very closely. This would sway me away from AMD.
  • by Anonymous Coward on Friday July 21, 2006 @06:48PM (#15760642)
    As anyone familiar with the botched ATI graphics system in the Xbox 360 knows, whatever competence ATI may have had in the past is long gone.

    The Xbox 360 is the first console ever to have PCs outperform it before the console has hit store shelves. In the past, consoles have had at least a year or so before PCs could touch them.

    What the hell is AMD thinking?

    AMD needs to come up with its own bogus SPEC score generating compiler to grow in the market, not a fucked up GPU maker.

  • by Paul Jakma ( 2677 ) on Friday July 21, 2006 @06:51PM (#15760653) Homepage Journal
    Actually, the X.org drivers for ATi cards are probably the best out there. The problem is they lack support for recent ATi hardware (good 3D support is missing for anything vaguely recent, e.g. R300 and up, though it's apparently getting there, and there's no support at all, 2D or 3D, for the most recent R500 hardware), as ATi hasn't made documentation available in a *long* time.

    If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware...

    If the rumour is true, I hope AMD cares about open drivers...
  • by Anonymous Coward on Friday July 21, 2006 @07:04PM (#15760726)
    AMD, like Intel, could be convinced to open up the specifications for their graphics hardware in order to sell more of their complementary product: processors. The difference is that ATI Graphics Processing Units (GPUs) don't suck like Intel GPUs. AMD could have almost 100% of the Linux notebook market within a year, and my guess is HP would be the big winner because they already have a business line of AMD notebooks with ATI GPUs: the HP Compaq nx6125 Notebook for Business (New Zealand link since this Anonymous Coward is from NZ) [hp.com]
  • completely agree (Score:4, Insightful)

    by RelliK ( 4466 ) on Friday July 21, 2006 @07:07PM (#15760743)
    Nvidia makes the best chipsets for AMD. Why would AMD want to merge with a second-rate vendor? I hope AMD doesn't become as unstable as ATI drivers.
  • Depends. (Score:5, Insightful)

    by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Friday July 21, 2006 @08:13PM (#15760996) Homepage Journal
    If it's ATi trying to buy out AMD (which is perfectly possible), then they might not have enough money left to stop nVidia doing a hostile takeover of them both. That would eliminate one of nVidia's competitors -and- give them control over the CPU that looks set to take over.


    You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume. There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up. That would be expensive and earn them little. In fact, given how much they've made from their component-neutrality, sacrificing that might mean they'd actually lose money overall.


    On the other hand, CPUs are high volume, high profit, and AMD is gaining market share. It is an ideal target for a buy-out, particularly as ATi can't be doing that well in the GPU market. Buying AMD would be like buying a money-printing machine, as far as ATi were concerned. Better still, AMD is a key player in bus specifications such as HyperTransport, which means that if ATi owned AMD, ATi could heavily influence the busses to suit graphics in general and their chips in particular.


    (Mergers are never equal, as you don't have two CEOs, two CFOs, etc. One of them will be ultimately in charge of the other.)


    If the rumour is correct, then don't assume AMD is the one instigating things - they have the most to lose and the least to gain - and don't assume either of them will be around when the mergers and buyouts finish.

  • by uptoeleven ( 845032 ) on Friday July 21, 2006 @08:15PM (#15761002) Journal
    ATI and AMD shouldn't merge because ATI's drivers suck.

    I think that's the consensus on here; certainly the Linux drivers are apparently awful.

    My AMD64 desktop machine has an NVidia graphics card which works much better than the ATI rubbish built into the motherboard. But I'm not using that machine to write this. In fact, other than for occasional gaming, that machine rarely gets switched on.

    I tend to use my laptop. Which has a Centrino chipset.

    You know - the one Intel brought out for laptops? The one that's hugely, massively successful in one of the main growth areas of hardware sales? Everyone wants a laptop... or a home media centre that's based on a PC but doesn't run like one... Everyone is buying Intel. Why? Because to all intents and purposes all the laptops come with Intel Centrino chipsets. It's dead easy - they're dead easy to support, all the bits work together, no conflicts. AMD? Sure, nice chips, but who makes Turion laptops? Acer... Asus... and... um... some other companies... Perhaps Alienware? HP make a couple, Fujitsu Siemens make a couple, but these aren't their high-end desirable laptops. It's like "well, if I spend money I get a Centrino; otherwise it's a toss-up between the Celeron - the cacheless wonder - and a chip that sounds like a sticky nut treat..."

    Who makes Centrino laptops? Dell, Sony, Toshiba, Fujitsu Siemens, Samsung, Panasonic, whatever IBM are calling themselves now - oh, and Acer and Asus and Alienware too - oh yes, and one really important company that basically stuck two fingers up at AMD: Apple. I'll bet Apple choosing Intel hurt. But everyone's buying laptops with Centrino chipsets in them... No-one's really buying AMD... because AMD don't provide a chipset and an easy way for manufacturers to just put their machines together in Lego-style fashion.

    Does it make business sense for AMD to tie up with the chipset and motherboard manufacturer that also happens to make graphics cards? Hell yes. Does it make sense for AMD to try to get into the laptop market in a meaningful way? Probably. Will their driver support get any better? We can hope...
  • by Paul Jakma ( 2677 ) on Friday July 21, 2006 @08:32PM (#15761071) Homepage Journal
    I bet if ATI was putting out first rate drivers it might influence quite a few purchases in that direction

    Sigh. This detrimentally short-sighted acceptance of binary-only drivers by users like you is precisely why there are no good drivers for recent ATi hardware, or for most recent graphics hardware besides Intel's. And until users like yourself start demanding that vendors provide documentation, not binary blobs, graphics support will continue to suck.

    Binary drivers kill kittens (thanks, airlied, for that one). They don't help if you run other free Unixen, and they don't help if you use a non-mainstream platform (e.g. PPC, or AMD64 up until recently; they don't help the Radeon in the Alpha I have here).

    Demand DOCUMENTATION - even if it's gibberish to you personally, it will benefit you far more than binary blobs eventually...
  • by Pulzar ( 81031 ) on Friday July 21, 2006 @11:16PM (#15761571)
    Why ATI? I think there are two major reasons. First, ATI dominates the mobile market, and AMD is very weak in it. Creating a solution to compete with Intel's mobile offerings requires you to offer all the parts at a good price, and it's much harder to do that as two companies instead of one. ("Buy our CPU and we'll toss in a cheaper ATI chipset/card" doesn't work if you don't own ATI :) ). Second, nVidia, even with its recent dismal stock performance, is worth over $6B, making it a lot more expensive than ATI. And, really, once you look past the Linux driver issue that irks so many here, the two have very similar offerings.
  • Re:Depends. (Score:5, Insightful)

    by TheRaven64 ( 641858 ) on Saturday July 22, 2006 @08:45AM (#15762614) Journal
    Once, CPUs did integer computation. Floating point computation was performed by an external chip or emulated with (lots of) integer operations. Now, most CPUs have a floating point unit on-die.

    Once, CPUs didn't do vector computations. They were either converted to scalar operations, or performed on a dedicated (expensive) coprocessor. Now, lots of CPUs have vector units.

    Once, CPUs didn't do stream processing. Now, a few CPUs (mainly in the embedded space) have on-die stream processors.

    A GPU is not much more than an n-way superscalar streaming vector processor. I wouldn't be surprised if AMD wants to create almost-general coprocessors with similar characteristics that connect to the same HT bus as the CPU; plug them directly into a CPU slot and perform all of the graphics operations there. Relegate the graphics hardware to, once more, being little more than a frame buffer. This would be popular in HPC circles, since it would be a general-purpose streaming vector processor with an OpenGL / DirectX implementation running on it, rather than a graphics processor that you could shoehorn general-purpose tasks onto. The next step would be to put the core on the same die as the CPU cores.
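
    As a rough illustration of what "streaming vector processor" means here, a minimal C sketch of the sort of kernel such a unit would chew through (the SAXPY example and the names are just an assumption for illustration; a real streaming unit would run these independent per-element operations across wide parallel lanes rather than a scalar loop):

    /* SAXPY: y[i] = a*x[i] + y[i] -- a canonical streaming vector kernel.
       Every iteration is independent of the others, which is exactly the
       property a GPU-like unit (or an HT-attached coprocessor) exploits. */
    #include <stddef.h>

    void saxpy(float a, const float *x, float *y, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];  /* one independent operation per element */
    }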

    The CPU industry knows that they can keep doubling the number of transistors on a die every 18 months for 10-15 years. They think they can do it for even longer than this. They also know that in a much smaller amount of time, they are going to run out of sensible things to do with those transistors. Is a 128-core x86 CPU really useful? Not to many people. There are still problems that could use that much processing power, but most of them benefit more from specialised silicon.
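
    To put rough numbers on that doubling claim (illustrative arithmetic only; the starting transistor count is an assumption):

    /* 18-month doublings over 15 years: 15*12/18 = 10 doublings,
       i.e. roughly a 1000x increase over today's counts. */
    #include <stdio.h>

    int main(void)
    {
        double transistors = 300e6;              /* assume ~300M transistors today */
        for (int m = 18; m <= 15 * 12; m += 18)
            transistors *= 2.0;                  /* one doubling per 18-month period */
        printf("~%.0f billion after 15 years\n", transistors / 1e9);  /* ~307 */
        return 0;
    }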

    Within the next decade, I think we will start to see a shift towards heterogeneous cores. The Cell is the first step along this path.

  • by Moraelin ( 679338 ) on Saturday July 22, 2006 @03:17PM (#15763690) Journal
    The Xbox 360 is the first console ever to have PCs outperform it before the console has hit store shelves. In the past, consoles have had at least a year or so before PCs could touch them.


    No shit, Sherlock? Lemme see:

    - Dreamcast: had a PowerVR graphics chip that had been available for the PC too for a year or two, and not even the most powerful one at that. It was the predecessor of the Kyro and generally a flop in the PC market. In the Dreamcast it had a whole 8 MB of video RAM too, at a time when PC graphics cards were moving to 32 MB.

    - XBox: basically had a predecessor of the nForce chipset, with integrated graphics. Look at some PC benchmarks for how much those suck. Hint: having half the bus width, half the memory speed, _and_ having to share that choked bandwidth with the CPU doesn't exactly help with rendering speed.

    - PS2: read some developer complaints from back then. It didn't have even half the fill rate or triangle processing rate that Sony had claimed. Trying to even replicate Sony's rigged demos was a failure as soon as you had more than one character on the screen or an even moderately complex background. It took a lot of low-level work to get it to run fast enough, while on a PC even a mid-range card never needed such tricks to do its job. And even then, there's a reason the vast majority of PS2 games never had more than a handful of characters on the screen at the same time.

    Get this, Sherlock: what saved all three was that they just didn't have to render at a higher res than 640x480. _That_ was their only saving grace.

    And it was a saving grace in more ways than the number of pixels rendered, too. Rendering at low res makes it OK to use lower-resolution textures (which need less memory bandwidth and use the cache better), _and_ lets you get away with lower polygon counts. If you use the exact same models, the same triangle may be something like 8x8 pixels in a console game, but 16x16 on a PC at 1280x1024. The same model may look horribly polygonal in a PC game at 1280x1024, but decently rounded in a console game at 640x480. So PC games had to compensate by using higher polycounts, and PC graphics cards had to be able to process those extra polygons.
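
    A quick sanity check on that resolution argument, as a sketch (illustrative only; the figures are the ones already quoted above):

    /* 1280x1024 has ~4.3x the pixels of 640x480, so the same triangle
       covers roughly twice the linear size on a PC screen -- which is why
       PC cards needed the extra fill rate and polygon throughput. */
    #include <stdio.h>

    int main(void)
    {
        const double console_px = 640.0 * 480.0;    /*   307,200 pixels */
        const double pc_px      = 1280.0 * 1024.0;  /* 1,310,720 pixels */
        printf("pixel ratio: %.1fx\n", pc_px / console_px);  /* ~4.3x */
        return 0;
    }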

    So in a nutshell, oh please... pretending that any console from the last decade was actually faster than a high-end PC is just plain old false.
