AMD, Intel, and NVIDIA Over the Next 10 Years

GhostX9 writes "Alan Dang from Tom's Hardware has just written a speculative op-ed on the future of AMD, Intel, and NVIDIA in the next decade. He discusses the strengths of AMD's combined GPU and CPU teams, Intel's experience with VLIW architectures, and NVIDIA's software lead in the GPU computing world." What do you think it will take to stay on top over the next ten years? Or will we see a newcomer usurp the throne and put everyone else out of business?

  • by Lord Ender ( 156273 ) on Monday March 01, 2010 @03:24PM (#31320370) Homepage

    The GPU will go the way of the coprocessor

    On the contrary, I think the CPU will go the way of the coprocessor. The humble Atom may be enough CPU power for most people these days, but you can never have enough GPU power... at least not until your po-- I mean, games, are photorealistic in real time.

  • by ircmaxell ( 1117387 ) on Monday March 01, 2010 @03:29PM (#31320428) Homepage
    Well... there are two ways of looking at it: either the GPU and the CPU will be merged into one beast, or tasks will be segregated even further. In terms of price, which is more efficient: one chip that can do everything (picture a 128-core CPU with different cores optimized for different tasks, say 32 cores optimized for floating point, 32 for vector processing, and 64 for generic computing), or multiple chips that are each fully optimized for their own task?

    Actually, now that I think about it, I'd probably say both. Economy computers would be based on the generic CPU, whereas performance computers and servers would have add-in modules that let you tailor the hardware to the task at hand. The motherboard could get an additional 8 sockets (similar to DIMM sockets) that would let you plug in different modules. If you need to do graphics-heavy processing (video games, movie rendering, etc.), you'd add 8 GPU modules to the motherboard; if you needed floating-point capacity, you'd add 8 FPU modules, and so on. The advantage of doing it that way over the current PCIe method is that you get to skip the chipset, so these modules would have full-speed access to system memory, hardware, and each other. Of course, there are a lot of hurdles to implementing such a thing... (A rough sketch of how today's GPU-compute frameworks already expose that kind of split follows below.)

    I am not an engineer; these are just thoughts off the top of my head...
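
    For what it's worth, the rough shape of that split already exists in today's GPU-compute frameworks. Here is a minimal sketch, assuming an OpenCL 1.x runtime and headers are installed, that just enumerates the CPU ("generic") and GPU ("vector/FP") devices a program could farm work out to; it illustrates the idea, not any vendor's roadmap:

    /* enumerate CPU and GPU compute devices via OpenCL (link with -lOpenCL) */
    #include <stdio.h>
    #include <CL/cl.h>

    static void list_devices(cl_platform_id platform, cl_device_type type,
                             const char *label) {
        cl_device_id devices[8];
        cl_uint count = 0;
        if (clGetDeviceIDs(platform, type, 8, devices, &count) != CL_SUCCESS)
            return;                      /* no devices of this type here */
        for (cl_uint i = 0; i < count; i++) {
            char name[256];
            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof name, name, NULL);
            printf("%s device: %s\n", label, name);
        }
    }

    int main(void) {
        cl_platform_id platforms[4];
        cl_uint nplat = 0;
        if (clGetPlatformIDs(4, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
            printf("no OpenCL platforms found\n");
            return 1;
        }
        for (cl_uint p = 0; p < nplat; p++) {
            list_devices(platforms[p], CL_DEVICE_TYPE_CPU, "CPU");  /* "generic" cores */
            list_devices(platforms[p], CL_DEVICE_TYPE_GPU, "GPU");  /* vector/FP cores */
        }
        return 0;
    }

    The point is only that software can already ask what kinds of compute devices it has and route work accordingly; the add-in-module idea above would mostly change where those devices plug in and how quickly they reach memory.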
  • Re:ARM (Score:4, Interesting)

    by Angst Badger ( 8636 ) on Monday March 01, 2010 @03:53PM (#31320788)

    All three will be marginalized by the ARM onslaught. Within 10 years, the smartphone will be the personal computing device; AMD and Intel processors will power the cloud.

    ARM may well come to dominate personal computing, but it sure won't be via the smartphone. No one is going to edit long word-processor documents on their phone, much less edit spreadsheets, write code, or do much else that qualifies as actual work. And it's not because they don't already -- in many cases -- have enough processor power; it's because they don't have full-sized keyboards and monitors. I'll grant that phones or PDAs of the future might well attach to full-featured I/O devices, but by themselves, no.

    The cloud, too, has some significant limits that will be difficult, if not actually impossible, to overcome. Security is a major issue: it is arguably resolvable in theory, but trusting your critical data to an outside company, to whom you are at best just a large customer, is not.

  • Re:ARM (Score:5, Interesting)

    by Grishnakh ( 216268 ) on Monday March 01, 2010 @04:04PM (#31320988)

    And why would they bother with that, when they can simply have a separate computer at home instead of having to worry about dropping theirs and losing everything?

    PCs aren't going anywhere, and the idea that they'll be replaced by smartphones is utterly ridiculous. Despite the giant increases in computing power, and the ability to fit so much of it into the palm of your hand, even mainframe computers are still with us; their capabilities have simply increased just like everything else's. Why limit yourself to the processing ability that can fit into your hand if you can have a desktop-sized computer instead, with far more CPU power and storage? Today's smartphones can do far more than the PCs of the 80s, but we still have PCs; they just do a lot more than they used to.

    Of course, someone will probably reply saying we won't need all the capability that a PC-sized system in 20 years will have. That sounds just like the guy in the 60s who said no one would want to have a computer in their home. A PC in 2030 will have the power of a supercomputer today, but by then we'll be doing things with them that we can't imagine right now, and we'll actually have a need for all that power.

  • by buddyglass ( 925859 ) on Monday March 01, 2010 @05:07PM (#31322022)

    What I find interesting is the overall lack of game-changing progress when it comes to tasks unrelated to 3D or HD video. In March 2000, i.e. ten years ago, the top-of-the-line CPU would have been a Pentium III Coppermine, topping out around 1 GHz. I could put Windows XP on one of those (with enough RAM) and do most office and browsing tasks about as fast as I could with today's top-of-the-line CPU. Heck, it would probably handle Win7 okay. Contrast the period 2000-2010 with the period 1990-2000. In 1990 you would be looking at a 25 MHz 486DX.

  • In 1990 you would be looking at a 25 MHz 486DX.

    Which is the minimum for most x86 OSes nowadays. In fact, some newer x86 OSes and software have even higher requirements. Windows XP and SQL Server 7.0 and later for example require the CMPXCHG8B instruction, and Flash 8 and later require MMX.
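
    A minimal sketch of checking those two feature bits at runtime, assuming GCC's <cpuid.h> on an x86 machine (the bit positions are the standard CPUID leaf 1 EDX layout: bit 8 is CX8/CMPXCHG8B, bit 23 is MMX):

    #include <stdio.h>
    #include <cpuid.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        /* __get_cpuid returns 0 if the requested leaf is not supported */
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            printf("CPUID leaf 1 not available\n");
            return 1;
        }
        printf("CMPXCHG8B (CX8): %s\n", (edx & (1u << 8))  ? "yes" : "no");
        printf("MMX:             %s\n", (edx & (1u << 23)) ? "yes" : "no");
        return 0;
    }

    On anything built in the last decade both will print "yes"; it only matters on the 486 and early-Pentium class hardware this thread is comparing against.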

  • PaaT (Score:3, Interesting)

    by Belial6 ( 794905 ) on Monday March 01, 2010 @07:13PM (#31323864)
    Phone as a Terminal

    The best solution would be not to run apps on the phone at all, but to get always-on bandwidth from a PC at home to your phone that is fast enough to do remote desktop at a speed where you can't tell you are working remotely. Once we have that kind of bandwidth, phones are basically done: the phone as a terminal (a rough latency sketch follows the list below). With this configuration, you get:

    * Massive upgradeability on the phone: to make your phone faster, you just upgrade the PC in your home.
    * Far greater battery life: once the phone is a good terminal, the extra processing power lives in the PC, and since that part is plugged into the wall, it won't drain your battery at all.
    * Losing your phone does not affect any of your data.
    * Replacing your phone is simpler.
    * You can access the same application from a desktop, TV, or the phone, and there is no reason the interface cannot change for each.
    * Better utilization of processing power, since people will end up with a home server anyway, for running their home media servers, security systems, home automation, etc...
    * Cheaper. It will always be more expensive to build these things smaller, so putting it in a PC makes it cheaper.
    * Faster to market. It takes time to shrink electronics.
    * Possible functionality that is impossible on the phone. We are getting to the point where physics may limit how small transistors can become, which means the amount of processing power available to the phone as a terminal may be impossible to fit in a handheld device.
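
    How well "can't tell you are working remotely" pans out comes down mostly to round-trip latency rather than raw bandwidth. A rough sketch of measuring exactly that from a POSIX machine, assuming a hypothetical home server (home.example.org below) runs a plain TCP echo service on port 7:

    /* measure round-trip time to a home server -- the number that decides
       whether "phone as a terminal" feels local or sluggish */
    #include <stdio.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>
    #include <sys/time.h>

    int main(void) {
        const char *host = "home.example.org";   /* hypothetical home server */
        const char *port = "7";                  /* classic TCP echo service */

        struct addrinfo hints = {0}, *res;
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo(host, port, &hints, &res) != 0) return 1;

        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0) return 1;

        char buf[16] = "ping";
        struct timeval t0, t1;
        gettimeofday(&t0, NULL);
        send(fd, buf, 4, 0);                     /* send 4 bytes...          */
        recv(fd, buf, sizeof buf, 0);            /* ...and wait for the echo */
        gettimeofday(&t1, NULL);

        double ms = (t1.tv_sec - t0.tv_sec) * 1000.0 +
                    (t1.tv_usec - t0.tv_usec) / 1000.0;
        printf("round trip: %.1f ms\n", ms);

        close(fd);
        freeaddrinfo(res);
        return 0;
    }

    As a rule of thumb, an interactive remote desktop wants round trips in the tens of milliseconds; getting there over a cellular link, more than raw bandwidth, is the hard part of the terminal model.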
  • by TikiTDO ( 759782 ) <TikiTDO@gmail.com> on Monday March 01, 2010 @07:31PM (#31324088)

    Your post is based on several assumptions that make no sense to me, as a student of human nature and an engineer.

    1. 1080p is current technology. Even if we assume we will not have hologram visual output within the near future, there will still be some new technology that the powers that be will sell to the masses. It may be an incremental improvement, but it will still be enough to drive the markets.
    1a. As long as it's new and shiny, there will always be someone to buy it.
    2. Consoles use GPUs and CPUs just as PCs do. There is a longer update cycle in place, but each time that cycle ticks over they adopt all the new technology developed during the previous console's lifetime. As such, it makes sense for the console makers to encourage such development.
    3. Intel would have to shut down all of its operations to let nVidia claim the workstation market. Like it or not, Intel still makes pretty hefty CPUs, owns the workstation market, and has more disposable cash and a bigger engineering staff than any other chip maker. The embedded market has even more competition for its crown, so I will not go there. The supercomputer market, while good for satisfying the nerd bragging-rights quota, is not known for being an amazing source of profit.
    4. The AMD vs Intel battle for the mid-range market is actually something I can see coming to pass. I would not be too surprised if this market gets a third player as the line between computation devices becomes blurred.
    5. ARM is not the only company in the world that can make low-power chips. Worst case, ARM has a few years of dominance before the other guys catch up. Also, as the article pointed out, an integrated CPU/GPU has several obvious advantages over a discrete CPU plus a discrete GPU.

    In all, while I am not ready to make my own predictions, yours could use a bit more analysis and tweaking.

  • by Kjella ( 173770 ) on Monday March 01, 2010 @07:48PM (#31324292) Homepage

    I could put Windows XP on one of those (with enough RAM) and do most office and browsing tasks about as fast as I could with today's top-of-the-line CPU.

    It's wetware-limited; it doesn't matter how much hardware or software you throw at it. We spend two minutes reading a page and then expect the computer to render a new one in 0.2 seconds; in practice it will never go much faster than that. I don't know why it's become such a myth that we'll always find new uses for computing power. A few specialized tasks now and then, perhaps, but in general? No. People will chat and email and listen to music and do utterly non-intensive things that go from taking 10% to 1% to 0.1% to 0.01% of your CPU.

    Contrast the period 2000-2010 with the period 1990-2000. In 1990 you would be looking at a 25 MHz 486DX.

    Yes, computers are starting to return to the normal world from Moore's bizarro-universe where unbounded exponential growth seemed possible. After decades of conditioning you become oblivious to how crazy it is to expect something twice as fast for half the price every 18 months (or whichever bastardization of the law you choose). Eventually a ten-year-old computer will be like a ten-year-old car: sure, they've polished the design a little, but it's basically the same. And that is normal; it's we who live in abnormal times, where computers have improved by several orders of magnitude.

  • by Anonymous Coward on Monday March 01, 2010 @07:48PM (#31324304)

    The answer is simple. Diminishing returns.

    What's a 200 MHz bump to a quad-core 2.8 GHz system? Not much, overall.

    What's a 200 MHz bump to a 25 MHz 486? Nearly an order of magnitude of improvement (worked out below).

    Consider that each major generation of CPU brought improvements that allowed faster performance even while bringing clock speeds down, and then consider the effective limits we've hit with silicon and air cooling: all we can do now is add more cores.

    Nowadays, bumping up the clock speed alone, when we're already past 2 GHz, is probably the least effective thing we can do for performance.
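
    Worked out with rough clock-only numbers (ignoring IPC and core-count gains), the same 200 MHz means very different things in the two eras:

    #include <stdio.h>

    int main(void) {
        const double bump = 200.0;                       /* MHz added in both cases */
        const double mhz_486 = 25.0, mhz_quad = 2800.0;  /* 1990 486DX vs 2010 quad */

        printf("486DX:     %.2fx the original clock\n", (mhz_486 + bump) / mhz_486);   /* 9.00x */
        printf("Quad-core: %.2fx the original clock\n", (mhz_quad + bump) / mhz_quad); /* 1.07x */
        return 0;
    }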
