It's Official - AMD Buys ATI

FrankNFurter writes "It's been a rumour for several weeks, but now it's confirmed: AMD buys ATI. What implications is this merger going to have for the hardware market?" In addition to AMD's release, there's plenty of coverage out there.
  • Re:Tomorrow (Score:3, Insightful)

    by d3bruts1d ( 639027 ) on Monday July 24, 2006 @07:56AM (#15768522)
    Good lord... I hope not.

    *shudder*
  • AMD designs (Score:5, Insightful)

    by bjb ( 3050 ) * on Monday July 24, 2006 @08:08AM (#15768566) Homepage Journal
    Interesting possibility:
    • Today: AMD has integrated memory controllers to get good memory performance.
    • Tomorrow: AMD has integrated video controllers to get good 3D performance.

    OK, so not very close to reality, considering what would be involved. AMD bought ATI because it wants to keep focusing on CPUs rather than building chipsets itself.

    However, it does raise an interesting point: the three primary components of PC architecture today are the CPU, the GPU, and the chipset that binds the two together. AMD had two parts of the equation, and ATI has two as well, though one of those parts overlaps. Now AMD is a single company with an end-to-end solution. There's got to be something interesting coming out of that marriage.

  • by Roy van Rijn ( 919696 ) on Monday July 24, 2006 @08:11AM (#15768572) Homepage
    Well, that could be a very good thing. Those specifications would also help driver writers a lot, and might improve the Linux drivers, which are pretty poor for ATI at the moment.

    AMD's fans/nerds are more Linux-minded than Intel's (IMHO), and AMD probably knows this. Releasing those specifications could be a real business win for its standing with the open-source crowd.
  • It WILL Be Good! (Score:5, Insightful)

    by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Monday July 24, 2006 @08:13AM (#15768576) Journal
    This could be really good if AMD's acquisition of ATI allows them to produce full chipsets in the same fashion Intel has with its Centrino line. Let the competition begin!
    Yeah, the part that really sweetens the deal for us end consumers is that ATI will now benefit from the process research AMD gets through its partnership with IBM [nytimes.com]. Hopefully the same chipmaking technologies AMD has been using can now be used to improve ATI's GPUs and chipsets, and ATI can build better video cards with them.

    Since (in my opinion) NVidia has taken the lead in GPUs, I hope that ATI will be boosted back into a competitive state and that price wars ensue.

    Again, to me this is nothing but great news for the end-consumer.
  • Re:AMD designs (Score:4, Insightful)

    by PFI_Optix ( 936301 ) on Monday July 24, 2006 @08:22AM (#15768606) Journal
    One thing that Intel has always done better than AMD is provide the "whole package".

    What I can buy from Intel:

    Server chassis + power supply
    Motherboard
    CPU(s)
    NIC
    RAID

    What I can buy from AMD:

    CPU(s)

    Small-to-medium OEMs are going to like Intel because it gives them one point of support for most of their major components. It also gives them a single "partner" to negotiate with; the larger product volume means better pricing overall.

    Taking on ATI might be AMD's move to start fixing that shortfall in its business model. If they put a solid OEM-friendly motherboard on the market, it will be a huge step in the right direction. With Conroe presently beating the pants off AMD's offerings, this is well-timed.
  • Ugggh (Score:5, Insightful)

    by LaughingCoder ( 914424 ) on Monday July 24, 2006 @08:24AM (#15768615)
    I think the marketplace has been very well served by the two dualities that existed before this move: ATI and NVidia beat each other's brains out, as did Intel and AMD. This new dynamic with three players does not seem, to me, to promise anywhere near as many benefits for us, the customers. Will ATI become more AMD-centric? Undoubtedly. Will NVidia (which has been a great AMD booster) become less supportive of AMD processors? Probably. As this plays out, it seems to me that NVidia will basically become an Intel graphics house (including Macs), and ATI will melt into AMD, becoming mostly an internal chipset house. In the end we lose a very healthy competition between NVidia and ATI. We gain, perhaps, a stronger AMD to keep Intel honest.
  • by Xest ( 935314 ) on Monday July 24, 2006 @08:28AM (#15768632)
    ...for people like me who were in the AMD/nVidia fan club? I've always had countless problems with ATI cards, both at home and at work, generally down to driver issues, so I really don't want to switch to ATI. I'd personally rather go the Intel/nVidia route if this has some adverse effect on using nVidia kit with AMD kit. I'm not sure this is good for the market either, if there is some kind of lock-in to ATI when you use AMD. It was kind of nice knowing you could choose between two processor manufacturers and two graphics chipset manufacturers; now it feels like that choice has been dented somewhat, in that you can't mix and match so well.
  • by NXprime ( 573188 ) on Monday July 24, 2006 @08:36AM (#15768665)
    Here's what I don't get: if they do that, how do you upgrade the memory bus bandwidth so that it's future-proof to any degree? Memory on graphics cards changes all the time. It's not just a GPU and memory; it's everything in between as well. Power, voltages... etc.
  • by neersign ( 956437 ) on Monday July 24, 2006 @08:53AM (#15768773)

    I read through most of the comments on this page, and several people came close to what I think the real reason for this deal is, but no one nailed it. To me, this is a simple example of business 101. AMD has always been a niche vendor. Recently they have begun to spread out, but it is obvious from all the comments on this page that they are still a "gamers'" chip. Where Intel and Dell made it big was low-end, mass-market business computers. Intel has its crappy-but-good-enough integrated video chipset, which is part of the vast majority of motherboards. In order for AMD to really be a big player, it needed to a) build its own integrated chipset from scratch or b) buy a company that already makes integrated video chipsets. Option b won, and while it might cost more initially, it should pay off in the long term.

    I believe this will not stop nVidia from making nForce boards, and it would be stupid of AMD to stop production of ATI 3D cards. I think this may increase the quality of ATI's support for Linux, but I don't think it will be anything drastic.

  • Re:Goodbye ATI? (Score:3, Insightful)

    by WombatControl ( 74685 ) on Monday July 24, 2006 @08:59AM (#15768814)

    Why would AMD do that?

    No company would kill off a profitable product line just to spite their opposition. Undoubtedly ATI's deal with Apple is profitable, and just because Apple uses Intel processors doesn't mean that such a transaction is any less profitable than it was before.

    Companies don't act that way; they look out for their bottom line. Unless something would cause that business to become less profitable, ATI is unlikely to give up the block of sales it gets from Apple. Is it better to cede that entire block of sales to the competition just because they don't use AMD processors? You don't win in business by reducing your sales, and having ATI graphics cards in Apples gives AMD/ATI a foothold in a very profitable market. It makes no business sense to give that up.

  • Just one question (Score:3, Insightful)

    by martinultima ( 832468 ) <martinultima@gmail.com> on Monday July 24, 2006 @09:04AM (#15768840) Homepage Journal
    From what I've heard, AMD tends to be pretty Linux-friendly, and very helpful to open-source developers who want to, say, implement AMD64 support and that kind of thing – so will this mean that ATI might start giving a damn about us too? I dunno, probably way too far-fetched, although I can't stand how my brand-new Athlon 64 box can't run 3D because ATI's stupid drivers pretty much don't work on my distribution... either way, though, so long as at least one of them keeps churning out good chips, more power to 'em!
  • by EmagGeek ( 574360 ) on Monday July 24, 2006 @09:06AM (#15768859) Journal
    "AMD has always been a niche vendor."

    Are you smoking crack? AMD has most certainly NEVER been a niche vendor...

    CPUs
    FLASH
    SRAM
    PLDs
    Embedded Processors
    Microcontrollers
    Ethernet Controllers and PHYs

    What niche exactly are you talking about here?
  • Re:Linux (Score:5, Insightful)

    by dfjghsk ( 850954 ) on Monday July 24, 2006 @09:16AM (#15768917)
    Right... because, as we all know, Linux support == market leader.

    They may become a market leader for Linux desktops (GPUs aren't needed in the servers where Linux is popular), but Linux desktops are only 1-2 percent of the desktop market...

    So even if they gain all of it, they still won't be a market leader in GPUs.
  • Re:Maybe (Score:5, Insightful)

    by Giant Ape Skeleton ( 638834 ) on Monday July 24, 2006 @09:28AM (#15769016) Homepage
    Why are posters so fond of the anti-open source hardware vendor NVidia?

    Probably because most Slashdotters are neither driver hackers nor OSS purists; they are developers, gamers, and power users -- and Nvidia's hardware (and driver support for that hardware) is phenomenal.

    Your gripe is not baseless, though: would it kill Nvidia to open up a bit? Perhaps the renewed competition will encourage them to do so, although it's equally likely that they will take the opposite tack and circle their wagons ever more tightly. As long as they provide excellent binary drivers for Linux, I doubt that they will feel much incentive to go Open Source...

  • Re:Tomorrow (Score:4, Insightful)

    by vhogemann ( 797994 ) <`victor' `at' `hogemann.com'> on Monday July 24, 2006 @09:30AM (#15769037) Homepage
    Consider for a moment that Intel does provide usable open-source drivers for its video chipsets.
  • by MrNemesis ( 587188 ) on Monday July 24, 2006 @09:43AM (#15769141) Homepage Journal
    It's not just the nerds being more Linux-minded - AMD has, to some extent, bet the farm on the K8 being the king of the server room, since the entire core was designed from the off to be highly scalable across multiple CPUs. And now we're seeing that most of the big advances (new "enterprise" sockets, K8L stuff) are going to benefit servers before they benefit Joe Public.

    AMD knows that, whatever market share it has in the desktop arena, Linux is a major player in the HPC and 2P+ spaces, and Linux sysadmins won't tolerate buggy chipsets or flaky binary drivers that may end up unsupported under kernel 2.8 or whatever. Hopefully AMD has the nous to do an Intel and make its chipset specs open across the board, enabling excellent support under Linux and any other OS that happens to come along. I know for a fact that shoddy chipset support under Linux has been a turn-off for me in the past, and I've lusted after some of Intel's chipsets on my own A64 systems.
  • Re:Linux Support ? (Score:3, Insightful)

    by Dave2 Wickham ( 600202 ) on Monday July 24, 2006 @09:47AM (#15769169) Journal
    Whilst I use the nVidia blob (actually I'm using a Matrox Millennium II at the moment, but I digress), the reason people want open-source drivers is so that if something doesn't work, they can make it work, rather than relying on nVidia to fix it.

    Another benefit would be that if nVidia's drivers were GPLd, they could be included with the Linux kernel and X(org|Free86) if they were of a high enough standard, completely eliminating the current hassle of having to kill X to install the drivers and reinstall them with every kernel update; an open-source driver would be far simpler to work with for all users.

    Whilst the blob is, IMO, better than nothing, I'd still much prefer good OSS drivers.
  • Re:Goodbye ATI? (Score:3, Insightful)

    by Johnny Mozzarella ( 655181 ) on Monday July 24, 2006 @10:10AM (#15769333)
    Apple first offered ATI graphics, then Nvidia, and most recently Intel graphics.

    iBooks always used only ATI graphics.
    iMacs have used both ATI and Nvidia graphics.
    PowerBooks have used both ATI and Nvidia graphics.
    PowerMacs have used both ATI and Nvidia graphics.

    The Mac mini and MacBook currently use Intel integrated graphics (high-volume products).
    The MacBook Pro and iMac both currently use ATI graphics (high-volume products).
    The PowerMac currently uses Nvidia graphics (a low-volume product).

    Apple has enjoyed being able to pit ATI, Nvidia, and Intel against each other to get the best prices for their chips.
    I wouldn't be surprised to see AMD create a low-cost chipset solution that shows up in a future Mac mini or iBook.
    Nothing is going to change in the next year, but this will give AMD an opportunity to work with Apple and pitch its wares.
  • Hmmmm, Consoles (Score:4, Insightful)

    by MrCopilot ( 871878 ) on Monday July 24, 2006 @10:27AM (#15769449) Homepage Journal
    Q.) How many next-gen consoles have AMD (via ATI) in them now?

    A.) Two: the Xbox 360 and Nintendo's upcoming Wii both use ATI graphics.

    Analysis.....Good move.

  • Re:Tomorrow (Score:3, Insightful)

    by ivan256 ( 17499 ) on Monday July 24, 2006 @10:31AM (#15769472)
    Ok, considered.... And dismissed.

    I hate how people write off ATI and Nvidia as open-source scrooges because their drivers are closed. The reality is that their code isn't all home-grown, and they couldn't open-source it even if they wanted to: the copyright and patent holders on their licensed technologies wouldn't let them.
  • by Anonymous Coward on Monday July 24, 2006 @10:42AM (#15769556)
    NVIDIA has stated many times that even if they wanted to open up documentation for their cards, they can't.
    There are cross-licensing agreements that prohibit them from releasing the specs.
    So they release closed binary drivers for Linux instead.

    Stop whining, people; they are doing the best they can.

    (P.S. I imagine ATI's situation is similar.)
  • Re:Tomorrow (Score:5, Insightful)

    by kimvette ( 919543 ) on Monday July 24, 2006 @10:50AM (#15769615) Homepage Journal
    Actually, their competitors are unaffected because:

    1. They have large enough staffs to decompile and cleanly reverse-engineer NVidia's drivers, e.g., one team analyzes the decompiled code and takes notes (without copying code, of course), while another team designs and implements improvements based on that analysis.

    2. Their competitors own electron microscopes, making analysis of the chip internals relatively simple.

    Now tell me: why are the likes of NVidia and ATI keeping their products undocumented and their drivers closed?

    And to counter your argument: what happens in two years when ATI and NVidia decide your card is too old to support, and yet it still performs very well but you NEED the features in the latest kernel and latest x.org? Go ahead, buy a new video card -- oops, nope, sorry, they changed slot specs again, and PCI Express cards are no longer available because PCI-X finally gained market share in the consumer market and PCI-E ended up as short-lived as VLB did in the VLB vs. PCI war.

    (do I expect PCI-E to die? No, it was a hypothetical example showing the potential problem with proprietary drivers)
  • by Wudbaer ( 48473 ) on Monday July 24, 2006 @10:51AM (#15769627) Homepage
    Really great idea... NOT. This would create more or less a monopoly on high-end graphics chips for NVidia. Who else is there? Intel, with their cheap chipset graphics? Matrox? A handful of Far Eastern companies that produce cheap and sucky low-end graphics products no one uses? NVidia buying ATI would be the worst thing that could happen for the consumer, even worse than the Intel quasi-monopoly of the dark years before the Athlon. As the history of Intel vs. AMD clearly shows, competition is good not only for the consumer but, in the end, also for the companies, which are forced to innovate and improve their products to keep themselves strong and vital.
  • by default luser ( 529332 ) on Monday July 24, 2006 @11:18AM (#15769821) Journal
    Now tell me: why are the likes of NVidia and ATI keeping their products undocumented and their drivers closed?

    Because, if they DO PROTECT THEIR IP, the OTHER GUY has to waste TONS OF MONEY on reverse-engineering teams and highly qualified people to reverse-engineer the processor via electron microscopes.

    It's not the EQUIPMENT that is expensive, it is the PEOPLE. And, as you Linux zealots know FULL WELL, reverse-engineering is EXPENSIVE in terms of PEOPLE and TIME.

    If you publish the specifications of your latest graphics chip for all to see, suddenly your competitors don't have to divert staff from working on next-generation architectures just to reverse-engineer your system. Instead, they can analyze your documentation in a fraction of the time.

    It's a two-way street, so stop deluding yourself that there's only one side to the story. Publishing full specs for your graphics chips is like writing your competition a blank check. Intel is the only one that has no issue doing this, because their graphics technology is always following, never leading.

    And to counter your argument: what happens in two years when ATI and NVidia decide your card is too old to support, and yet it still performs very well but you NEED the features in the latest kernel and latest x.org? Go ahead, buy a new video card.

    Yes. There are still many well-supported video cards sold in AGP. In fact, you can still get well-supported video cards in PCI, a fifteen-year-old technology. They're not top-performers, but beggars can't be choosers.

    The video card market is transitioning to PCIe with surprising speed precisely because they do not want another VLB fiasco. The PCI -> AGP transition was slow because PCI still had a future for other types of cards, but the AGP -> PCIe transition was rushed to avoid market confusion. You can still buy plenty of AGP cards, but the big players have made it clear: there won't be any more improvements for AGP.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Monday July 24, 2006 @12:09PM (#15770245)
    Comment removed based on user account deletion
  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Monday July 24, 2006 @12:23PM (#15770348)
    Comment removed based on user account deletion
  • by i7dude ( 473077 ) on Monday July 24, 2006 @01:16PM (#15770762)
    Ignoring the obvious chipset argument that seems to arise: could another possibility be that AMD wishes to improve its floating-point design? I mean, GPUs are essentially just huge application-specific floating-point units; see the sketch below. I don't really have enough information or understanding of the extent of AMD's current designs to know, but it would seem they could gain a lot of IP in that area.
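
    (To illustrate the point: a toy sketch in C, not anything from AMD's or ATI's actual designs. The bread-and-butter GPU workload is huge numbers of independent floating-point operations, like this SAXPY kernel; a GPU spreads the iterations across its arrays of FP units.)

        /* y[i] = a * x[i] + y[i] -- the classic SAXPY kernel.
         * Illustrative only: each iteration is independent, which is
         * exactly the kind of math a GPU's wide floating-point arrays
         * are built to chew through in parallel. */
        #include <stddef.h>

        void saxpy(size_t n, float a, const float *x, float *y)
        {
            for (size_t i = 0; i < n; i++)
                y[i] = a * x[i] + y[i]; /* one independent multiply-add */
        }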

    dude.
  • by Lisandro ( 799651 ) on Monday July 24, 2006 @01:49PM (#15771021)
    If you publish the specifications of your latest graphics chip for all to see, suddenly your competitors don't have to divert staff from working on next-generation architectures just to reverse-engineer your system. Instead, they can analyze your documentation in a fraction of the time.

    Bullshit, sorry. We don't want their beloved silicon blueprints for their latest GPUs, just information on how to make them work. Want to draw a polygon? Send this command to the card. Do hardware T&L? This other one. Something like the sketch below. You can only learn so much about the silicon from driver source code or technical documentation on how to program a GPU. Don't believe me? Check the information both nVidia and ATI released for their older GPUs, and see how much you can infer from it.
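
    (A minimal sketch of what "send this command to the card" means -- the register offsets, opcode, and names below are made up for illustration, not any real ATI or nVidia interface. Programming docs at this level describe writes like these; they say nothing about how the chip is built.)

        /* Hypothetical memory-mapped GPU command interface, in C. */
        #include <stdint.h>

        #define GPU_REG_CMD   0x0000u  /* hypothetical command register   */
        #define GPU_REG_DATA  0x0004u  /* hypothetical vertex FIFO offset */
        #define CMD_DRAW_TRI  0x0001u  /* hypothetical "draw triangle" op */

        /* Register window, assumed already mapped by the OS (e.g. from
         * the card's PCI BAR); left unset here since this is a sketch. */
        static volatile uint32_t *mmio;

        static void gpu_write(uint32_t reg, uint32_t val)
        {
            mmio[reg / 4u] = val; /* 32-bit memory-mapped I/O write */
        }

        /* Submit one triangle as three packed vertex words. */
        static void draw_triangle(const uint32_t v[3])
        {
            gpu_write(GPU_REG_DATA, v[0]);
            gpu_write(GPU_REG_DATA, v[1]);
            gpu_write(GPU_REG_DATA, v[2]);
            gpu_write(GPU_REG_CMD, CMD_DRAW_TRI); /* kick off the draw */
        }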
