Here's the article.
Here are my thoughts:
AMD acquired ATI to put a GPU or two on their silicon, to add value to a processor that is already faster than most people need.
AMD sees the writing on the wall and has started creating a new architecture that integrates more functionality into the CPU silicon. It's no longer just a CPU; it's a system on a chip (SOC).
If AMD takes their four-core processor, deletes two cores, and then adds an ATI GPU plus sound, SATA, Gigabit Ethernet, etc., they will have an entire computer on a chip. No doubt this is where they are heading.
VIA has been doing this for years, but their CPU/video performance is not high enough to earn them much market share.
Intel has now seen the writing on the wall (AMD showed it to them again, just like they did with the Athlon 64's on-chip memory controller) and has talked about putting a GPU and other functions on their CPUs. They are behind, but with as much cash and as many resources as they command, they will be catching up.
Convergence is happening, and it's happening quickly.
The point this article missed is that NVidia will be forced into being an ultra-high-end supplier, where people will actually question why anyone would need a separate video card when the SOC GPU is plenty for most tasks (a situation they are well aware of, because it's basically where they have stood up till now). They may be able to ship GPUs to major OEMs for a year or two because of Vista's demands, but the integration of the GPU into the CPU will once again remove them from the mainstream PC.
This has got me thinking about the future of computers; here are some ideas I've come up with.
I predict that 90% or more of the systems produced in late 2008 / early 2009 will have tiny little motherboards with no upgradability and CPUs soldered to the board. The board will have no PCI, no PCI Express, no slots to plug goodies into. They will have USB, FireWire, Ethernet, wireless, video, audio, and some memory slots. They will be little boxes smaller than a Mac Mini that cost $200.00 without a hard drive, or $350.00, about the size of a Mini, with one. There will be enough flash to store an OS plus some file storage, it will have a DVD burner, and many people will hook up external hard drives to store their media, or they will install rack-like servers in their homes. This machine will turn on instantly. "Boot time" will become a joke (as it should already be).
Files will be saved online or on DVD, with DVD storage being a throwback to the old floppy disk days.
It may be a laptop. Desktops may go away; there may be only two classes of computer in 2008/2009: laptop and server.
I predict that if Intel fails to make a GPU with enough oomph to match AMD's SOC, then Intel WILL BUY NVIDIA (maybe only the graphics division). They will have to buy NVidia to compete with AMD. This is not idle speculation; let's remember how successful Intel's last foray into GPUs was. Those chips were slow and barely good enough to sell integrated graphics to low-end desktops, and Intel's add-in cards disappeared after a year or so. Intel video still manages to command some market share because it's cheap and integrated. But cheap won't be good enough if ATI GPUs beat them on performance, because we know AMD is willing to keep prices low enough to compete.
What other predictions can you make about the future of computing?
Think we'll all be computing from our Xbox on our HDTV?
What do you think the future holds for the only giant fabless semiconductor manufacturer left?