ATI and AMD Seek Approval for Merger?

Krugerlive writes "The rumor of an ATI/AMD merger/buyout has been out for some time now. However, this morning an Inquirer article said that a merger deal has been struck and that the two companies will seek shareholder approval on Monday of next week. In the market, AMD is down as a result of lackluster earnings announced last evening, and ATI is up about 4% on unusually high volume." This is nothing but a rumour at the moment, a point that C|Net makes in examining the issue. From the article: "AMD has always boasted that it only wants to make processors, leaving networking and chipsets to others. AMD does produce some chipsets, but mostly just to get the market started. Neutrality has helped the company garner strong allies."
  • by Paul Jakma ( 2677 ) on Friday July 21, 2006 @08:22PM (#15761023) Homepage Journal
The X.org drivers do support 3D, and quite well, on the older R100 and R200 cards. The R300/R400 series are also supported for 3D, but those have needed extensive reverse engineering and so aren't quite as mature (though they're getting there, apparently). The reverse-engineering work has also only covered roughly the R200 feature set, so the drivers aren't getting the most out of those cards - all thanks to ATI's silly attitude about supplying documentation.
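
    If you want to see what the free driver is actually giving you on a particular card, something like the following rough sketch should do it - it just shells out to the glxinfo utility (which you may need to install separately) and is only meant as an illustration:

        # Quick check of which Mesa/X.org renderer is active and whether
        # direct rendering (hardware-accelerated 3D) is enabled.
        # Assumes the glxinfo utility is installed and an X session is running.
        import subprocess

        def report_gl_driver():
            out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
            for line in out.splitlines():
                if line.startswith("direct rendering:") or "OpenGL renderer string" in line:
                    print(line.strip())

        report_gl_driver()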
  • by PhoenixPath ( 895891 ) on Friday July 21, 2006 @08:22PM (#15761025)
    I think NVidia needs to get into the processor market themselves.

    GPU = Graphics Processing Unit.

    AFAIK, they've been in the processor business since they launched their first graphics card. :p
  • by Anonymous Coward on Friday July 21, 2006 @08:24PM (#15761034)
    I think NVidia needs to get into the processor market themselves. Maybe not for general computing, but I bet their designers have some great ideas for a processor that would be at home in a console! With GPU's being so powerful these days, I can't imagine that they lack the expertise to do it.

    Hardly. CPU and GPU design are very different tasks at so many levels.

    At the highest level, the architectures are radically different - a GPU is basically a bunch of instantiations of a minimally programmable, customized, low-speed DSP-style pipeline on one core, whereas a CPU is a highly programmable, general-purpose, extremely aggressive design. Saying nvidia has the know-how is like saying that someone who designed a system of 100 rowboats to troll a lake has the know-how to design racing speedboats.

    At lower levels, GPUs are designed using synthesis and place & route, while CPUs tend to be semi-custom with some full-custom blocks. Circuit design is not something GPU companies do - they're given a library of gates by the fab company they use, and they use those gates. In CPUs, lots of blocks use fancier circuit techniques (for example, the Itanium's adder uses dynamic logic and complex pass-gate logic) and many things are laid out by hand (i.e., an engineer draws the physical shapes that, after some processing, become the masks).
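
    To make the contrast concrete, here's a toy sketch (my own illustration, nothing to do with how either chip is actually built) of the two kinds of workload being described - the same tiny operation applied uniformly to every pixel, versus branchy, pointer-chasing general-purpose code:

        # Toy illustration of the workload split described above.
        # shade_pixels: one small, identical operation per element with no
        # data-dependent branching - easy to replicate across many simple pipelines.
        # sum_matching: pointer chasing and unpredictable branches - the kind of
        # thing an aggressive general-purpose CPU core is built to chew through.

        def shade_pixels(pixels, gain):
            return [min(255, int(p * gain)) for p in pixels]

        def sum_matching(node, predicate):
            total = 0
            while node is not None:
                if predicate(node["value"]):
                    total += node["value"]
                node = node["next"]
            return total

        print(shade_pixels([10, 100, 200], 1.5))            # [15, 150, 255]
        chain = {"value": 3, "next": {"value": 8, "next": None}}
        print(sum_matching(chain, lambda v: v > 4))         # 8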
  • by Anonymous Coward on Friday July 21, 2006 @08:33PM (#15761074)
    This is fairly obviously a troll, but I'll respond anyway. Have you considered that it's also the first console that isn't rendering at 640x480, 60 fields a second, versus 1024x768, 1280x1024 or even higher for a PC? At the second of those resolutions that's more than four times the pixel count of what most Xbox/PS2 games pushed, and more than eight times if the game wasn't progressive scan (which the Xbox was the first console to offer for most games). The Xbox 360 renders at 1280x720 at the lowest, which is much closer to a normal PC resolution.
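
    For anyone who wants to check the arithmetic (assuming an interlaced 640x480 game effectively draws about 640x240 per field):

        # Pixel counts behind the "more than 4x / 8x" figures above.
        console_frame = 640 * 480    # 307,200 pixels per progressive frame
        console_field = 640 * 240    # 153,600 pixels per interlaced field
        pc_frame      = 1280 * 1024  # 1,310,720 pixels

        print(pc_frame / console_frame)  # ~4.27
        print(pc_frame / console_field)  # ~8.53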
  • by Anonymous Coward on Friday July 21, 2006 @09:50PM (#15761312)
    "Circuit design is not something GPU companies do..."

    Interesting... Incidentally, I happen to work for a GPU company (one mentioned in this article, even), and we have a large number of engineers doing full-custom circuit-design work. They may not be working on custom adders (we don't need them), which is perhaps the point you were trying to make, but they are often doing some very complicated circuits nonetheless...
  • Re:Depends. (Score:3, Informative)

    by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Saturday July 22, 2006 @12:55AM (#15761872) Homepage Journal
    Hmmm. The only possible overlap is in the fabrication. Designing a good graphics processor is going to be very different from designing a good CPU, so they can't overlap the development teams (which will likely be small anyway). It's very doubtful the chips would be of similar enough size and have similar enough characteristics to do much about packaging or testing. Unless they're planning on unifying the scale at which they're making the chips, it's not clear they could do much about the etching. They could buy the materials jointly, thus increasing bulk orders and reducing costs, but they could do that with a simple agreement.


    Management is a fairly big expense, but as the total number of projects wouldn't change significantly, neither would the number of managers required. That just leaves the board of directors. Half the directors could be fired, but it's doubtful either CEO is going to consider his own choice of senior management the inferior one, which means that one board would win and the other would lose - on the whole, that is. The CEO of the winning board might cherry-pick a few who are really exceptional or who have given him lots of money.


    You also need to bear in mind that CPU sales for AMD are (on average) rising but their profit margins are slumping, so if they gain access to another fab plant, it won't be to close it. It'll be for continuing in a price-war against Intel that both companies are losing. (Neither has the resources to continue until the other is completely vanquished AND remain competitive with other CPU manufacturers. Both Intel and AMD are latecomers in both the multi-core and 64-bit arenas, and neither can match the more experienced players on scalability at this time.) However, neither AMD nor Intel can afford to back off - their designs are divergent enough that the market cannot sustain both of them indefinitely. Intel can't even afford to maintain the StrongARM architecture anymore, things are so tight.

  • by Anonymous Coward on Saturday July 22, 2006 @01:56AM (#15762001)
    Circuit design is not something GPU companies do - they're given a library of gates from the fab company they use, and use those gates. In CPUs, lots of circuits are designed using fancier circuits and many things are laid out by hand

    (disclaimer: I work for one of the two big GPU companies)

    Man, your information is very outdated. I would estimate that at least 25% of a current GPU is laid out by hand. CPUs definitely have more custom parts, but not more than 50-60% of the chip. The rest is synthesized using standard logic gates, just as 75-80% of GPUs are. This is the only way to be able to reach the insanely fast clock speeds on some of the interfaces.
  • Re:GPU in socket? (Score:2, Informative)

    by KDR_11k ( 778916 ) on Saturday July 22, 2006 @12:35PM (#15763217)
    I think you're referring to that 1T "S"RAM tech that was used in the Gamecube and is going into the Wii as well. That'll probably work. Not sure if SRAM isn't a bit overkill for that purpose, though.
