ATI and AMD Seek Approval for Merger?

Posted by Zonk
from the curiouser dept.
Krugerlive writes "The rumor of an ATI/AMD merger/buyout has been out now for some time. However, this morning an Inquirer article has said that a merger deal has been struck and that the two companies will seek shareholder approval on Monday of next week. In the market, AMD is down as a result of lackluster earnings announced last evening, and ATI is up about 4% on unusually high volume." This is nothing but a rumour at the moment, a point that C|Net makes in examining the issue. From the article: "AMD has always boasted that it only wants to make processors, leaving networking and chipsets to others. AMD does produce some chipsets, but mostly just to get the market started. Neutrality has helped the company garner strong allies."
  • Does that mean.... (Score:4, Interesting)

    by mikael (484) on Friday July 21, 2006 @05:40PM (#15760587)
    NVidia would seek a partnership with Intel (although some news articles reported that they felt Intel was holding back progress in 3D graphics performance).
    • Depends. (Score:5, Insightful)

      by jd (1658) <imipakNO@SPAMyahoo.com> on Friday July 21, 2006 @07:13PM (#15760996) Homepage Journal
      If it's ATi trying to buy out AMD (which is perfectly possible), then they might not have enough money left to stop nVidia doing a hostile takeover of them both. That would eliminate one of nVidia's competitors -and- give them control over the CPU that looks set to take over.


      You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume. There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up. That would be expensive and earn them little. In fact, given how much they've made from their component-neutrality, sacrificing that might mean they'd actually lose money overall.


      On the other hand, CPUs are high volume, high profit, and AMD is gaining market-share. It is an ideal target for a buy-out, particularly as ATi can't be doing that well in the GPU market. Buying AMD would be like buying a money-printing-machine, as far as ATi were concerned. Better still, AMD is a key player in bus specifications such as HyperTransport, which means that if ATi owned AMD, ATi could heavily influence the busses to suit graphics in general and their chips in particular.


      (Mergers are never equal, as you don't have two CEOs, two CFOs, etc. One of them will be ultimately in charge of the other.)


      If the rumour is correct, then don't assume AMD is the one instigating things - they have the most to lose and the least to gain - and don't assume either of them will be around when the mergers and buyouts finish.

      • Re:Depends. (Score:3, Interesting)

        by dubbreak (623656)
        There is precisely no reason whatsoever for AMD to want to merge with ATi or to buy them up.

        What about (I hate that I am going to type this word) synergies? Maybe AMD thinks that they have enough in common with ATI that they could reduce redundancies after the merger (i.e. fire people and possibly sell off a fab plant) and make both companies more profitable. Just a thought.
        • Re:Depends. (Score:3, Informative)

          by jd (1658)
          Hmmm. The only possible overlap is in the fabrication. Designing a good graphics processor is going to be very different from designing a good CPU, so they can't overlap the development teams (which will likely be small anyway). It's very doubtful the chips would be of similar enough size and have similar enough characteristics to do much about packaging or testing. Unless they're planning on unifying the scale at which they're making the chips, it's not clear they could do much about the etching. They coul
          • Half the directors could be fired, but it's doubtful either CEO is going to consider their choice of senior management to be the inferior choice. Which means that one board would win and the other board will lose. On the whole, that is. The CEO of the winning board might cherry-pick a few who are really exceptional or who have given him lots of money.

            What makes you think the CEO gets to choose the board? You clearly don't know shit about corporations.

            The shareholders elect the board. The board chooses the C
          • Re:Depends. (Score:3, Interesting)

            by scum-e-bag (211846)

            It's very doubtful the chips would be of similar enough size and have similar enough characteristics to do much about packaging or testing.

            Not at the moment. But with a little more miniaturisation and time, both CPU and GPU will be merged onto the one chip package. This is a situation where the combined company will have more than a small edge over their rivals. By avoiding the (relatively) long transmission wires used to communicate across the motherboard bus, speeds will increase beyond anything current

            • Re:Depends. (Score:5, Insightful)

              by TheRaven64 (641858) on Saturday July 22, 2006 @07:45AM (#15762614) Journal
              Once, CPUs did integer computation. Floating point computation was performed by an external chip or emulated with (lots of) integer operations. Now, most CPUs have a floating point unit on-die.

              Once, CPUs didn't do vector computations. They were either converted to scalar operations, or performed on a dedicated (expensive) coprocessor. Now, lots of CPUs have vector units.

              Once, CPUs didn't do stream processing. Now, a few CPUs (mainly in the embedded space) have on-die stream processors.

              A GPU is not much more than an n-way superscalar streaming vector processor. I wouldn't be surprised if AMD wants to create almost-general coprocessors with similar characteristics that connect to the same HT bus as the CPU; plug them directly into a CPU slot and perform all of the graphics operations there. Relegate the graphics hardware to, once more, being little more than a frame buffer. This would be popular in HPC circles, since it would be a general purpose streaming vector processor with an OpenGL / DirectX implementation running on it, rather than a graphics processor that you could shoehorn general purpose tasks onto. The next step would be to put the core on the same die as the CPU cores.
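              As a rough sketch of what a vector unit buys you here (the function name below is invented for illustration, not from any real SIMD API): a 4-wide vector unit applies the same operation across four data lanes in a single instruction, where a plain scalar CPU would loop.

              ```c
              #include <assert.h>
              #include <stdio.h>

              /* Hypothetical sketch: the scalar loop that a single 4-wide
               * vector (SIMD) instruction replaces. Real vector units
               * (e.g. SSE, AltiVec) perform all four lane additions at once. */
              static void vec4_add(const float a[4], const float b[4], float out[4])
              {
                  for (int i = 0; i < 4; i++)   /* one instruction on SIMD hardware */
                      out[i] = a[i] + b[i];
              }

              int main(void)
              {
                  float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
                  float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
                  float r[4];

                  vec4_add(a, b, r);
                  for (int i = 0; i < 4; i++)
                      printf("%g\n", r[i]);     /* 11 22 33 44 */
                  assert(r[0] == 11.0f && r[3] == 44.0f);
                  return 0;
              }
              ```

              A GPU generalises this idea to many such lanes working in parallel over streams of vertices and pixels.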

              The CPU industry knows that they can keep doubling the number of transistors on a die every 18 months for 10-15 years. They think they can do it for even longer than this. They also know that in a much smaller amount of time, they are going to run out of sensible things to do with those transistors. Is a 128-core x86 CPU really useful? Not to many people. There are still problems that could use that much processing power, but most of them benefit more from specialised silicon.

              Within the next decade, I think we will start to see a shift towards heterogeneous cores. The Cell is the first step along this path.

      • You need to bear in mind that the GPU is the critical component in most systems, but makes almost no money for the vendor and has a relatively low volume ... On the other hand, CPUs are high volume, high profit ...

        I beg to differ. GPUs have higher volumes than CPUs, assuming you count GPUs embedded in chipsets, along with the discrete GPUs. Just think about how often people upgrade their CPUs as opposed to their GPUs.

        As for profit margins, you have a point there, although for the wrong reasons, I
        • Not at all.
          Every computer sold has at least 1 CPU, but may not have a GPU at all (what use does a server have for a GPU if it's sitting in a rack and never has a screen attached?), or it might have one integrated into the chipset.
          The most a single system will have is 2 GPUs, whereas high-end machines could have many CPUs, and are unlikely to need a GPU at all.
      • Wow. That scenario would make for an interesting marketplace...ATi merges with AMD, gets swallowed by nVidia. Now we have one super-company, with two world-class GPU design teams, a world class CPU maker, and a world-class chipset maker. No need to hide sources from what are now themselves, so we'd have a much better chance at proper docs or even (gasp) shared-source, the design teams can be re-organized, so one dept focuses on gaming, and another on multimedia devices (HTPC, hardware video decoding/scal
        • No, we'd have one entity that assumes control of most of the market (let's face it, Matrox and VIA are as important to the GPU market as ASRock is to high-end gaming) and puts out shitty but utterly competition-free products until the public is aggravated enough to risk incompatibility by defecting to a fringe product. It'd be Internet Explorer all over again, only this time even more people would have to suffer - for example, how high would accelerated Linux drivers be on the new company's priority list? As for n
          • I think you have a slightly skewed view of the GPU marketplace. Here are the figures from Q1 of this year:
            1. Intel: 39.1%
            2. ATi: 28.7%
            3. nVidia: 18.7%
            4. VIA: 9%
            5. SIS: 3%

            Where it really matters these days is in the laptop space. Laptop sales are set to pass desktops in the next year or two. They did for Apple last year, and they're at about 50% of desktop sales for the rest of the industry. While desktop GPU sales grew by about 25%, laptop GPU sales grew by over 30%; particularly noteworthy since most laptops d

  • by jhfry (829244) on Friday July 21, 2006 @05:41PM (#15760593)
    I always thought that AMD and Nvidia were the better combo. Besides, the ATI drivers suck for Linux, where a large percent of the enthusiast market's interests lie. Isn't AMD still more of an enthusiast's processor until it can get into one of the top vendors' machines?

    • by Paul Jakma (2677) <paul+slashdot@jakma.org> on Friday July 21, 2006 @05:51PM (#15760653) Homepage Journal
      Actually, the X.org drivers for ATis are probably the best out there. The problem is they lack support for recent ATi hardware (lacking good 3D support for vaguely recent, e.g. R300 and up, though it's getting there apparently, and completely lacking any support, 2D or 3D, for the most recent R500 hardware), as ATi haven't made documentation available in a *long* time.

      If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware...

      If the rumour is true, I hope AMD cares about open drivers...
      • If the rumour is true, I hope AMD cares about open drivers...

        well, corporations are schizophrenic, but: http://www.amd-jobs.de/de/einstieg/freiestellen_osrc.php [amd-jobs.de]
      • I would consider lack of 3D support "sucking." So if ATI's own drivers are WORSE than not having 3D support, wow!

        -matthew
        • by Paul Jakma (2677) <paul+slashdot@jakma.org> on Friday July 21, 2006 @07:22PM (#15761023) Homepage Journal
          The X.org drivers do support 3D, and quite well, on the older R100 and R200 cards. R300/400 are also supported for 3D, but those have needed extensive reverse engineering, and hence are not quite as mature (though getting there, apparently); also, they have only really reverse engineered the equivalent of the R200 feature set, so they're not getting the most out of the cards - all thanks to ATi's silly attitude about supplying documentation.
      • by waveclaw (43274) on Saturday July 22, 2006 @12:38AM (#15761959) Homepage Journal
        The problem is they lack support for recent ATi hardware (lacking good 3D support for vaguely recent, e.g. R300 and up

        Funny way to define recent. You don't happen to be a Debian developer, do you?

        I just threw away an R300 series card (ATi 9800 XT) for an nVidia SLI setup. I bought the ATi back in mid '05, and it had sat on store shelves for half a year before I picked it up for the "Free" Half-life 2 and the then-"stable" accelerated proprietary drivers.

        I game under Linux. But with an ATi card, nothing worked well or for very long. Wine, the commercial Cedega, even native games would kill the driver. I had to install nVidia dependencies for my team's 3D software. Software which in the end wouldn't work without the nVidia drivers.

        If you meant ATi's own drivers, yeah, they suck. But really, if ATi just made docs available, the much better X.org drivers would be able to support far more of their hardware...

        I don't see that improving quickly unless somebody with a big itch to scratch builds a community like the one around nVidia. A lot of people doing games in Linux only develop and test with nVidia hardware. Not everyone can afford two $600-800 rigs with recent cards.

        Once I switched to nVidia 3D, a ton of games that only worked on Windows now install and play as well as, if not better than, native on Windows. Older 3D games like Diablo 2, Warcraft 3 and Startopia fly at high frames-per-second (>60-100). Current generation games like Tron 2.0, Guild Wars and Half-life 2 get respectable fps (~30) where the ATi drivers would struggle to get 2-3 fps and often crash if anything changed the drawing state.

        I hope AMD cares about open drivers...

        This assumes that AMD comes out on top. Or that the ATi proprietary mindset doesn't infect AMD. On one side you have two companies that are basic chip fabbers, spewing out GPUs, CPUs and chipset engineering specs as fast and cheaply as possible. On the other you have ATi, buried deep in a race with nVidia, and AMD, who won the last round of CPU wars with x86_64. As has been mentioned by others, mergers are little more than one company eating another. I for one would not be surprised if after any such ATi/AMD merger the next (last?) AMD nVidia motherboard chipsets are at least 6 months to a year behind the next ATi releases.

        At best, it would be interesting to see a dual-core CPU with one core a GPU and a metric ton of cache. It'd be almost like the old 486SX vs. 486DX days.
    • completely agree (Score:4, Insightful)

      by RelliK (4466) on Friday July 21, 2006 @06:07PM (#15760743)
      Nvidia makes the best chipsets for AMD. Why would they want to merge with a second-rate vendor? I hope AMD doesn't become as unstable as ATI drivers.
      • Re:completely agree (Score:3, Interesting)

        by powerlord (28156)
        Look at the other possibility:

        AMD, after buying out ATI, opens up the architecture or supports Linux as a 1st-tier platform.

        I bet if ATI was putting out first rate drivers it might influence quite a few purchases in that direction ... of course it might also push nVidia to do the same ... arms races can be fun for the spectators (and consumers :) )
        • by Paul Jakma (2677) <paul+slashdot@jakma.org> on Friday July 21, 2006 @07:32PM (#15761071) Homepage Journal
          I bet if ATI was putting out first rate drivers it might influence quite a few purchases in that direction

          Sigh. This detrimentally short-sighted acceptance of binary-only drivers that users like you have is precisely why there are no good drivers for recent ATi hardware, or most recent graphics besides Intel. And until users like yourself start demanding that vendors provide documentation, not binary blobs, graphics support will continue to suck.

          Binary drivers kill kittens (thanks airlied for that one). They don't help if you run other free Unixen, they don't help if you use a non-mainstream platform (e.g. PPC, or AMD64 until recently), and they don't help the Radeon in the Alpha I have here.

          Demand DOCUMENTATION - even if it's gibberish to you personally, it will benefit you far more than binary blobs eventually...
          • Here's a better idea...

            YOU demand documentation for your other free *nixen and your mainstream platforms.

            *I* will use the binary drivers Nvidia provides because they fulfill the most important requirement AFAIC: They make my stuff WORK.
            • *I* will use the binary drivers Nvidia provides because they fulfill the most important requirement AFAIC: They make my stuff WORK.

              Yeah, AIGLX and Xegl work real well with my Geforce FX.

              NVidia does provide somewhat decent drivers, but they just fulfill the necessary requirements for being useful. Proper documentation would go all the way to "sufficient".
            • YOU demand documentation for your other free *nixen and your mainstream platforms.

              Right - because they don't make any difference to your chosen platform, do they? Except that 99% of graphics work on Unix platforms is done in userspace, in Mesa and Xorg code, so work done by FreeBSD, Sun, etc. engineers also tends to apply to your Linux machines (and vice versa).

              You're simply an ignoramus: you're using a system (only parts of which are either Linux or Linux specific) which tens of thousands of people have don
      • become as unstable as ATI drivers.

        You are so out of the loop, it's not even funny. This tiresome argument is so fucking late 1990s!!! Seriously, ever since ATI moved to a unified driver architecture known as Catalyst (equivalent to nVidia's ForceWare), problems of instability have long since vanished.

        So for the love of all that's holy, please stop spreading OUTDATED INFORMATION!!!
    • ATI and AMD shouldn't merge because ATI's drivers suck.

      I think that's the consensus on here; certainly the Linux drivers are apparently awful.

      My AMD64 desktop machine has an NVidia graphics card which works much better than the ATI rubbish built into the motherboard. But I'm not using that machine to write this. In fact, other than for occasional gaming, that machine rarely gets switched on.

      I tend to use my laptop. Which has a Centrino chipset.

      You know - that one that Intel brought out for laptops? The one
    • It's already so hard to find an AMD laptop with nvidia graphics.
    • by Pulzar (81031)
      Why ATI? I think there are two major reasons... First, ATI dominates the mobile market, and AMD is very weak in it. Creating a solution to compete with Intel's mobile offerings requires you to offer all the parts at a good price, and it's much harder to do that as 2 companies instead of one. ("Buy our CPU, we'll toss in a cheaper ATI chipset/card" doesn't work if you don't own ATI :) ). Second, nVidia, even with its recent dismal stock performance, is worth over $6B, making it a lot more expensive than A
    • I always thought that AMD and Nvidia were the better combo. Besides the ATI Drivers suck for Linux, where a large percent of the enthusiast market's interests lie. Isn't AMD still more of an enthusists processor until it can get into one of the top vendor's machines?

      I think, then, what you're looking for could come from this merger. AMD being the less expensive of the major CPU producers is a first choice for the free Unix group, and they know it. Maybe joining with ATI will cause the joined company to beco
      OK, so let me first declare that I am a stockholder in both NVDA and AMD, and I've been mulling this over all day. Clearly by that, I agree with the NVDA/AMD combo. I've seen plenty of articles that tout how NVDA has benefited from "neutrality" between the two, but where's the neutrality? Green Grid? Certified desktops? AMD and NVDA have recognized each other's strengths and been building on that for a while.

      First of all, this arrangement benefits NVDA as much as AMD. How? It eliminates their main compet

  • by The Living Fractal (162153) <banantarr&hotmail,com> on Friday July 21, 2006 @05:42PM (#15760602) Homepage
    As much as I like AMD, I have to say that if Intel and nVidia teamed up they would probably beat the crap out of AMD + ATi.

    And if AMD and ATi merge.. It sort of seems like a punch in the face to nVidia. Leaving them wanting to talk to Intel. Leading to... what?

    For a long time there have been two beasts in the CPU market and two beauties in the GPU market. AMD and Intel in CPUs, and ATi and nVidia in GPUs. If they marry respectively, the offspring might have the good qualities of neither and the bad qualities of both. I think overall the consumer would probably (more than likely) lose out.

    So, I really kind of hope this is just a rumor.

    TLF
    • nVidia does just as well with both Intel and AMD processors. Even if AMD and ATI merged, it would be in nVidia's best interest to stay on their own unaligned. It's not like the ATI + AMD combo would actually make something better than nVidia could for chipsets. And assuming they could, so what? nVidia would just turn it up and prove that they can compete. nVidia can always crank up the heat when they need to. They're good at that.

      The only concern nVidia should have is if the AMD Process line started c
      • Nobody will ever get to align with Intel. Intel already makes their own graphics, and, sucky as they are, they sell more of them than either ATI or nVidia. Their chipsets are already rock-solid and well-enough performing. They have all the pieces of the puzzle to sell a good "all-in-one" platform -- they've already taken over the mobile market with the Centrino platform, because they had the best mobile CPUs on top of the other things. Now they seem to be ready to attempt the same on the desktop, with the n
    • Even if there is some sort of a merger, it's not like that means AMD will make their processors only work with ATi cards, or make ATi cards only work with AMD processors. Well, I guess they could do that, but I'm not sure what the point would be.
    • As much as I like AMD, I have to say that if Intel and nVidia teamed up they would probably beat the crap out of AMD + ATi.

      People say that, but I have to wonder what Intel has to gain. I mean, they're already the biggest player [reghardware.co.uk] in the graphics industry when it comes to market-share, so they clearly have the know-how to build graphics chips. Sure, they don't currently go after the enthusiast market, but there might be reasons for that:
      1. Lower margins - Nvidia's gross margin [yahoo.com] is under 40%, and Intel's [yahoo.com] is cl
  • New Logo (Score:5, Funny)

    by managementboy (223451) on Friday July 21, 2006 @05:43PM (#15760608) Homepage
    AMD
    T
    I
  • by overshoot (39700) on Friday July 21, 2006 @05:45PM (#15760621)
    Well, as an AMD stockholder I'll certainly vote against it (not that I have enough shares to matter.)

    The market's view of this is visible from the fact that ATI is up and AMD is waaaay down.

    • That's probably because AMD missed earnings estimates - most of the drop happened between closing yesterday and opening today, not after The Inq's story.
    • The market's view of this is visible from the fact that ATI is up and AMD is waaaay down.

      Wrong. The company doing the takeover (AMD) almost always declines -- rather noticeably, too -- and the company being taken over almost always increases -- usually because the takeover bid is at a higher stock price.

      AMD is just reporting bad earnings news in a volatile, short-heavy, news-sensitive market. With companies reporting good earnings still trading downward, it's no surprise that reporting bad earnings will

  • Conflict - nForce? (Score:3, Insightful)

    by Coplan (13643) on Friday July 21, 2006 @05:46PM (#15760631) Homepage Journal
    I'm a big AMD fan. But I'd be really upset to lose the nForce line of chipsets. In my opinion, it's a must for any AMD user. And I think it would be very difficult to come up with a good replacement.

    I also worry that chipsets for AMD based motherboards will not work so well with my nVidia video card. Not an ATI fan at all.

    I'm going to be watching these guys very closely. This would sway me away from AMD.
    • I'm a big AMD fan. But I'd be really upset to lose the nForce line of chipsets.

      I'm a big AMD fan, but after dealing with nForce platform drivers, I'll be really upset not to lose the nForce line of chipsets.

      • Amen!

        Forget even Linux; nForce is seriously crippled under Windows XP also.

        It was very painful trying to get a new system built with nForce4, a RAID 1 SATA array, and a SATA optical drive (the system is AMD also, but that wasn't a problem :) )

        Finally I traced the problem to an incompatibility in the nForce chipset. It can EITHER support a SATA RAID array, or it can support a SATA optical drive. Doing both unfortunately causes the system to bluescreen.

        (and yes, I know SATA on the optical doesn't buy you much ... ex
    • I'm wondering, why are people jumping to these kinds of assumptions? Intel makes its own motherboards, chipsets, graphic chipsets, etc., but that doesn't prevent them from functioning with other manufacturers' parts. What business sense would there be in AMD making their processors incompatible with nVidia chipsets? If either were Microsoft, then maybe they could get away with it, but generally hardware/software benefits from compatibility.
  • by Anonymous Coward on Friday July 21, 2006 @05:48PM (#15760642)
    As anyone familiar with the botched ATI graphics system in the Xbox 360 knows, whatever competence ATI may have had in the past is long gone.

    The Xbox 360 is the first console ever to have PCs outperform it before the console has hit store shelves. In the past, consoles have had at least a year or so before PCs could touch them.

    What the hell is AMD thinking?

    AMD needs to come up with its own bogus SPEC score generating compiler to grow in the market, not a fucked up GPU maker.



  • It's definitely going to be one of those positive situations where software is doctored to perform particularly well [when combinations are involved].

    ;)
  • GPU in socket? (Score:3, Interesting)

    by tomstdenis (446163) <tomstdenis@@@gmail...com> on Friday July 21, 2006 @06:00PM (#15760694) Homepage
    There is a company out there that has an FPGA in a 940 pin socket. What about putting a GPU in it? Dual channel memory, HT link to the main processor, HT link to a DAC from the GPU [make mobos with fixed DACs on the board].

    That'd be hella cool.

    Tom
  • I think this is bad for AMD because ATI has crappy support, crappy customer experience, and crappy drivers.

    Either this would vastly improve ATI or it could drag AMD down into mediocrity. If the merger does happen, I truly hope that it is the former (ATI cleaning up its act across the board), but all too often with these sorts of mergers it's the latter that happens. ATI has a lot of great technology with fast GPUs, but when the drivers suck, customer service and support are nonexistent, and they absolutely re
  • I am a hard-core AMD and nVidia fan. I don't have any Intel PCs in my house except those that I got as freebies, and I've never had good luck with *any* ATI card. I cringe in fear at what would (or at least could) happen to my gaming systems of the future if ATI and AMD merge. Yes, I can see some type of exclusivity where ATI cards are going to somehow be more advantageous than nVidia when it comes to gaming hardware for reasons other than plain, old competition.

    Damn. This worries me
  • Dilbert, anyone? (Score:4, Interesting)

    by ivoras (455934) <ivoras@@@fer...hr> on Friday July 21, 2006 @06:36PM (#15760870) Homepage
    Doesn't this story look like a Dilbert-ish situation - the companies themselves don't even consider merging, but because "the word is out" and "everybody knows they'll do it" it somehow becomes a reality?
  • Apple recently announced that they are going to be partnering with Nvidia for future iPod use. This could be the first step in getting ready to switch over to Nvidia for their graphics processors, since they currently use Intel chipsets and ATI graphics cards. I'm sure AMD is bitter over Intel being picked instead of AMD for the new "Mactels" too, so I could easily see them withdrawing ATI support if the merger takes place.
  • by WoTG (610710) on Friday July 21, 2006 @08:14PM (#15761218) Homepage Journal
    At first glance, this is a stupid idea for AMD, but upon reflection, it isn't that bad. We've got to look at the 5 year picture for a deal of this size. What will AMD need to do to be more successful in 5 years than they are today? Well, despite what the teenage gamers will say, it actually doesn't mean having the highest FPS in Quake 5. The stable, highest volume, and generally profitable sales are in corporate servers and workstations. That's Dell, HP, and to a lesser extent Gateway, Lenovo, et al. So, what do they need from AMD or Intel? They want cheap, fast, reliable supply, few defects, and ease of integrating into the individual computers. After several years of the Athlon and Opteron, AMD is only now starting to get a toe hold in workstations and a reasonable share of server CPUs.

    IMHO, AMD would be well advised to start shipping its own chipsets, just like Intel. It just makes things easier for their most important customers, the big OEMs. They have one less vendor to worry about. There's less testing required, since presumably AMD would test the CPU and chipset together. And it's less risky for both customers and AMD, since AMD has a very strong incentive to make sure that chipsets will be available for their platform on time, whereas third parties have different priorities.

    Then there's the whole GPU angle. Why shouldn't GPUs be produced in company-owned (i.e. tweaked-for-performance) fabs? They're every bit as complex and big and expensive as CPUs. Bringing that in house should give a nice bump to performance. And what is a GPU going to be in five years anyway? On the AMD platform, all the tools are in place to allow the GPU to work much more like a cheap DSP/co-processor than we've ever seen before. If the Opteron wasn't an Itanium killer, maybe a couple of Opterons and a couple of "GPU-DSPs" will do the trick. Even for regular workstations, imagine just plugging a GPU into a free socket on the MB. That would fit very nicely in the middle of the graphics market... way better than integrated, but way cheaper than an add-on card.

    Lastly, AMD needs a way to use last-generation fab equipment a little longer. Making chipsets would let them use the fab equipment for an extra few years. They lost that cost efficiency when they spun off the flash business. Fab gear is expensive, so it's kind of a waste for them to be yanking it out every time the minimum for a marketable CPU moves higher.

    Five years ago AMD needed partners and an ecosystem to support their own platform and survive as a company. The next five years are about turning the CPU market into a duopoly.

    I have a few shares of AMD. And I'd like to see this deal happen, but only at a decent price (from AMD's point of view). Hmm... this post turned rather long...
    • True dat. I've always loved AMD's own line of chipsets - IME they're never the fastest out of the bunch, but they're always rock-solid stable and (naturally) are open spec and so work perfectly in Linux, often before release. Much like Intel and their chipsets in fact.

      Contrast with ATI and nVidia chipsets (now that VIA, SiS and ULi are pretty much out of the market) - drivers are always binary blobs. True, you can generally run Linux on an nVidia chipset with open source drivers, even up to the reverse engin
  • Just an interesting side-note is that Intel has been filling its low-end motherboard lineup with ATI chipset-based systems.
    Check out the D101GGC: http://www.intel.com/products/motherboard/d101ggc/index.htm [intel.com]
    I find it odd for Intel to use a third-party's chipset in their mobos, but it would be double-weird if that third-party was AMD.
  • AMD has Centrino envy. More specifically, they need a platform strategy.

    Let's face it. CPUs are commodities. You buy price/performance.
    Recently, Intel has been using the platform to differentiate itself.
    Centrino is one example in the notebook world.
    You can see other examples with "advanced I/O" in the newer server platforms.
    Intel dictates the platform and can define it to suit their needs.

    AMD has no platform strategy. It's at the mercy of various 3rd party chipset makers.

    This is why this makes strategic
  • 100% Going To Happen (Score:4, Interesting)

    by mpapet (761907) on Friday July 21, 2006 @11:30PM (#15761795) Homepage
    Stock trading volume on ATI spiked today and price went up. Volume tells you traders are looking to make some quick cash on the spread between today and the announced merger price. Increase in ATI price says people buying stock think it's a good deal for ATI.
  • This is precisely what a marketing guy would come up with to get people to attend an otherwise boring announcement. Something is in the works but not a merger.
  • Good news (Score:3, Interesting)

    by mnmn (145599) on Friday July 21, 2006 @11:38PM (#15761818) Homepage
    I don't care what all the other comments say. This is good news.

    The AGP slot has been getting faster and faster. The GPU has been getting bigger and has been doing more. There is an obvious need for a physics core and multicore CPUs. Clearly this is leading to adding the GPU to the CPU on the same chip, or at least very close to it, like the L2 cache on the Slot 1 Intel CPUs. After a certain AGP/PCIe bus speed, the AGP or PCIe slot will become less feasible, and it will be important to put the GPU as close to the CPU as possible.

    Now think of the PS3. It's a revolution. It's not here yet, and its release is not being managed very well, but the ball on multicore CPUs (not just dual core) has gotten rolling. The UltraSPARC T1 has shown the world that multicores can be real and actually work. Not to mention the fact that most computers bought today have at least a mediocre GPU somewhere in them. This means AMD needs a GPU to add to its multicore CPUs as another core. They've already added the northbridge to it, haven't they? And that has saved us money, hasn't it?

    Intel has one-upped AMD recently with its Core chips, and AMD sounds like it's really gonna one-up Intel with chips that should take the market away.
  • I think the merger only makes sense for AMD. They could sell a very competitive platform with processor, chipset and much better integrated graphics than Intel for the upcoming Windows Vista.

    But why should ATI be interested in a merger? They would probably lose all their Intel chipset business and a lot of the enthusiasts graphics card business on Intel platforms.
