
AMD, Intel, and NVIDIA Over the Next 10 Years

GhostX9 writes "Alan Dang from Tom's Hardware has just written a speculative op-ed on the future of AMD, Intel, and NVIDIA in the next decade. They talk about the strengths of AMD's combined GPU and CPU teams, Intel's experience with VLIW architectures, and NVIDIA's software lead in the GPU computing world." What do you think it will take to stay on top over the next ten years? Or, will we have a newcomer that usurps the throne and puts everyone else out of business?
  • by newdsfornerds ( 899401 ) on Monday March 01, 2010 @03:08PM (#31320134) Journal
    I predict wrong predictions.
    • Re: (Score:3, Insightful)

      by Pojut ( 1027544 )

      I don't even understand why people do this in an official capacity. I mean, I know they have to for shareholders or business planning purposes or whatever, but these sorts of things are almost always wrong.

      Are they just doing it for the lulz?

    • by kiwirob ( 588600 ) on Monday March 01, 2010 @04:08PM (#31321026) Homepage
      I predict they are seriously mistaken in forgetting about ARM processors in their analysis. ARM processors have taken over pretty much all the mobile and a lot of the netbook space. From Wikipedia: "As of 2007, about 98 percent of the more than one billion mobile phones sold each year use at least one ARM processor" ARM Wikipedia [wikipedia.org]. The world is getting more and more mobile, and desktop processing capacity is becoming irrelevant.

      I believe Moore's Law (the number of transistors on an integrated circuit doubling every two years) and the continual increase in CPU/GPU processing power is a solution looking for a problem. What we need are power-efficient processors that have enough processing capacity to do what we need and nothing more. Unless you are a gamer or doing some serious GPGPU calculations in CUDA or OpenCL, what on earth is the need for a graphics card like the Nvidia GeForce GT 340 with around 380 GFLOPS of floating-point throughput? It's ridiculous.
      • Re: (Score:2, Informative)

        by Entropy98 ( 1340659 )

        You're right; besides video games, servers, scientific research [stanford.edu], movies, increased productivity, and probably a dozen things I haven't thought of, what do we need more processing power for?

        Of course efficiency is good. Computers have been becoming more efficient since day one.

        There is a place for both tiny, low-power ARM chips and 150-watt, 8-core server chips.

      • ARM processors have taken over pretty much all the mobile and a lot of the netbook space.

        Where can I buy those mythical ARM netbooks I keep hearing about? The only thing I've seen is that detachable tablet/netbook thing whose name I forget, and I don't even know whether it was just a prototype or whether they are actually selling it. So far, ARM has no presence in the netbook market.

      • Re: (Score:3, Informative)

        by Blakey Rat ( 99501 )

        I predict they are seriously mistaken in forgetting about ARM processors in their analysis. ARM processors have taken over pretty much all the mobile and a lot of the netbook space.

        Mobile I'll give you, but netbook?

        When's the last time you saw an ARM netbook? If you've ever seen one? Sure, you read articles on Slashdot saying that they'll be here Any Day Now! But they aren't. I think there was one model in Fry's for a short while -- that's about it. Unless you count the numerous vaporware-spouting websites.

        An

    • by PopeRatzo ( 965947 ) * on Monday March 01, 2010 @08:43PM (#31324810) Journal

      I predict wrong predictions.

      Not only wrong predictions, but predictions based on a completely faulty notion.

      From the summary:

      What do you think it will take to stay on top over the next ten years? Or, will we have a newcomer that usurps the throne and puts everyone else out of business?

      Do you get that? It's no longer enough for a company to innovate, to produce a quality product, and to make a profit. They have to be "on top". They have to kill the competition, to put everyone else out of business. Welcome to Capitalism, 2010.

      This might be why some people see this era as being the end-game of "free-market" capitalism. Because now the only way to produce is to destroy. Because it's not enough to succeed, but others have to fail. What good is being rich unless there are poor people to which you can compare your success? After all, if everyone's standard of living goes up, who's going to clean my fucking house?

      There was a time, in my lifetime (and I'm not that old) when a company, let's say an electronics manufacturing company, could sell some stock and use the proceeds to fund the building of a new plant, the purchase of new equipment, the hiring of new employees. The family that owns the company sees their success in terms of this growing and profitable concern. A "healthy" profit on investment for such a company could be as little as 8 percent (and this was a time when you could get 5 percent for a savings account). The people who work for this company like it so much, have done so well as employees, that entire extended families go to work for the company, generation after generation. I watched this entire cycle occur right here in my home town to a company that made industrial lighting (like the kind you'd see at a major league ballpark during a night game). Now, the company is gone. Swallowed by a company that was swallowed by a company that was swallowed by a foreign company that lost contracts to a company in Europe. There's a trail of human loss all along the way.

      The theory of markets and business that sees the killing off of companies as a preferred outcome will always end up badly.

  • ARM (Score:3, Insightful)

    by buruonbrails ( 1247370 ) on Monday March 01, 2010 @03:17PM (#31320224) Homepage
    All three will be marginalized by the ARM onslaught. Within 10 years, the smartphone will be the personal computing device, and AMD and Intel processors will power the cloud.
    • Re: (Score:3, Insightful)

      I really, really hope you're wrong. Forced to choose between a smartphone and nothing at all, I'd likely go with nothing. Which would be professionally problematic, since I code for a living.
      • Re:ARM (Score:4, Insightful)

        by h4rr4r ( 612664 ) on Monday March 01, 2010 @03:37PM (#31320560)

        So you could get an ARM laptop or x86 workstation. For work use, thin clients will be popular again soon, and many people will use a smartphone, hooked to their TV for display when at home, instead of a home computer.

        Then the cycle will restart. Welcome to the wheel of computing.

        • Re:ARM (Score:5, Interesting)

          by Grishnakh ( 216268 ) on Monday March 01, 2010 @04:04PM (#31320988)

          And why would they bother with that, when they can simply have a separate computer at home instead of having to worry about dropping theirs and losing everything?

          PCs aren't going anywhere, and the idea that they'll be replaced by smartphones is utterly ridiculous. Despite the giant increases in computing abilities, and the ability to fit so much processing power into the palm of your hand, even mainframe computers are still with us; their capabilities have simply increased just like everything else. Why limit yourself to the processing ability that can fit into your hand, if you can have a desktop-size computer instead, in which you can fit far more CPU power and storage? Today's smartphones can do far more than the PCs of the 80s, but we still have PCs; they just do a lot more than they used to.

          Of course, someone will probably reply saying we won't need all the capability that a PC-sized system in 20 years will have. That sounds just like the guy in the 60s who said no one would want to have a computer in their home. A PC in 2030 will have the power of a supercomputer today, but by then we'll be doing things with them that we can't imagine right now, and we'll actually have a need for all that power.

          • by Sir_Sri ( 199544 )

            I dunno, I can imagine a 3D gaming tapestry taking up half my wall, and I can imagine redundant computing boxes in the house for my kids (whom I haven't had yet, and must presume will live at home in 2030) and their half-wall gaming/TV tapestries. And my girlfriend will probably want something to browse her recipes and listen to music on; not to stereotype, mind you, her computer can do a lot more than that now, she just seems to only care about recipes and listening to music.

            A computer that costs a bil

          • I agree, but I think an even more important factor is the interface. Sure, your cellphone could have enough computing power to run most of the applications you use on a regular basis, but... How fast can you type on it? How comfortable is it to do that for extended periods of time? What about all the students doing assignments, all the people writing reports and making spreadsheets? How comfortable do you feel working with such a tiny screen? The fact that cellphones have to be small to be portable limit
            • Exactly. And if you have a full-fledged desktop computer, instead of a smartphone with a docking station, you won't be completely screwed when you accidentally drop your phone in the toilet.

            • PaaT (Score:3, Interesting)

              by Belial6 ( 794905 )
              Phone as a Terminal

              The best solution would not be to run apps on the phone at all. It would be to get always on bandwidth from a PC at home to your phone that was fast enough to do remote desktop at a speed where you couldn't tell that you were working remotely. Once we have that kind of bandwidth, the phones are basically done. The phone as a terminal. With this configuration, you get:

              * Massive upgradeability on the phone since to make your phone faster, you just upgrade the PC in your home.
              * Far
              • Far greater battery life, as once the phone is a good terminal, adding more processing power to the PC will add power, but since that part is plugged into the wall, it won't drain your battery at all.

                I think you got this the wrong way around - powering the radio in your phone uses a _lot_ of battery power - far more than the CPU. If you're going to use the phone as a thin client all the time, then you're going to be keeping that radio powered up all the time and your battery won't last long at all.

                You can access the same application from a desktop, TV, or the phone, and there is no reason the interface cannot change for each.

                If you're going to have a totally different UI for each platform (and you will need that - a desktop UI is completely unsuited to a small-screen device), you may as well be using a different application on eac

                • by Belial6 ( 794905 )

                  I think you got this the wrong way around - powering the radio in your phone uses a _lot_ of battery power - far more than the CPU. If you're going to use the phone as a thin client all the time, then you're going to be keeping that radio powered up all the time and your battery won't last long at all.

                  The radio is always on anyway.

                  If you're going to have a totally different UI for each platform (and you will need that - a desktop UI is completely unsuited to a small-screen device)

                  There is no reason that a single PC cannot serve a dimension-appropriate UI from a single back end, so complaining about the PC's UI on a phone is a red herring derived from a lack of imagination. In fact, I have a PC here that happily serves up an Android UI, and a Mac that happily serves up an iPhone UI.

                  you may as well be using a different application on each platform anyway.

                  It isn't 1983 any more. Writing the same code for a half dozen different platforms is a waste of resources and money.

                  I think you underestimate the cost of mobile bandwidth.

                  If you read my post, you would have seen that getting the

                • I appreciate your thinking this through, but I think you're way off base.

                  My prediction: your phone will BE your computer. You'll stick it in your pocket, then when you get to work, you'll drop it in the monitor-connected dock and fire up your bluetooth (or whatever) input devices. At the end of the day, you'll pull it out of there, stick it back in your pocket. When you get home, same thing, except dock could be networked to your TV as well.

                  Simple. Gives you everything in one place, and eliminates need for

          • A PC in 2030 will have the power of a supercomputer today, but by then we'll be doing things with them that we can't imagine right now, and we'll actually have a need for all that power.

            Of course we will; Microsoft will still be producing operating systems around Windows architecture, so we will need the usual CPU-hogging Windows+antispyware+antivirus stack, and IE6 will still be in heavy use in the enterprise. ;)

      • I suspect we might be connecting our smartphones to a monitor (or some other display technology) and using a wireless keyboard when programming or doing other really serious stuff.

    • Re:ARM (Score:4, Interesting)

      by Angst Badger ( 8636 ) on Monday March 01, 2010 @03:53PM (#31320788)

      All three will be marginalized by the ARM onslaught. Within 10 years, the smartphone will be the personal computing device, and AMD and Intel processors will power the cloud.

      ARM may well come to dominate personal computing, but it sure won't be via the smartphone. No one is going to edit long word processor documents on their phone, much less edit spreadsheets, write code, or do much else that qualifies as actual work. And it's not because they don't already -- in many cases -- have enough processor power; it's because they don't have full-sized keyboards and monitors. I'll grant that it's possible that phones or PDAs of the future might well attach to full-featured I/O devices, but by themselves, no.

      The cloud, too, has some significant limits that will be difficult, if not actually impossible, to overcome. Security is a major issue, arguably resolvable in theory, but trusting your critical data to an outside company to whom you are, at best, a large customer is not.

      • by h4rr4r ( 612664 )

        Wireless connectivity to an HDTV for display and Bluetooth for keyboard and mouse.

        • by Belial6 ( 794905 )
          The opposite is the correct answer. Wireless connectivity to a PC for all phone functions as the phone is just a terminal. It is far easier and more cost effective to have your processing in a large device and transmit to the small one than it is to have your processing in a small device and transmit to a large one.
        • Comment removed based on user account deletion
    • Exactly. And programmers will all be writing their code on a smartphone with a touchscreen, and no one will have a monitor larger than 4" diagonal. 10 years from now, desktop computers will be gone.

    • You mean like all those Java network PCs you are seeing everywhere now?

    • Isn't NVIDIA making an ARM chip now (Tegra)? Is anyone using it yet?

      I have to take a walk, get some fresh air and pizza. I almost had a seizure reading TFA. I probably would've if I'd had a better GPU!

    • by Ltap ( 1572175 )
      No, no, and no. ARM just can't scale up, and I really doubt that PCs will just disappear. Plus, would we want them to? Tiny netbooks and smartphones, with consoles for gaming? It's an ugly world, and the worst possible thing that could happen. It's also very unlikely; the trend we've seen over the past few decades is computing power centralizing on the PC, a unified platform for everything.

      I believe (prediction alert!) that, while most of the idiotic, regular populace will just buy smartphones and netb
  • It seems like VLIW is a bit like nuclear fusion -- in ten years, people will still be talking about how its practical realization is ten years away.
    • Re: (Score:3, Insightful)

      by TheRaven64 ( 641858 )
      Sun's VLIW architecture (MAJC) was more interesting than Intel's. The point of VLIW is the same as that of RISC: take more stuff that isn't directly connected to executing instructions off the CPU and make the compiler do it. EPIC missed the point and tried to do VLIW plus a load of extra stuff on the chip. Sun did proper VLIW and took it to its logical conclusion with a JIT doing absolutely everything (branch prediction, instruction scheduling, even dynamic partitioning for threads). Unfortunately, it cam
  • by Foredecker ( 161844 ) * on Monday March 01, 2010 @03:36PM (#31320548) Homepage Journal

    You can always spot a sensationalist post when part of it predicts or asks who will go out of business. Or what thing will disappear.

    For example, in his post, ScuttleMonkey [slashdot.org] asks this:

    ...Or, will we have a newcomer that usurps the throne and puts everyone else out of business?

    Note, the post is a good one - I'm not being critical. But change in the tech industry rarely results in big companies going out of business - and if they do, it takes a long time. I think Sun is the canonical example here. It took a long time for them to die - even after many, many missteps. Sun faded away not because of competition or some game-changing technology, but simply because they made bad (or some would say awful) decisions. Same for Transmeta.

    People have been predicting the death of this or that forever. As you might imagine, my favorite is the prediction of Microsoft's death. That's been going on for a long, long time. Last I checked, we are still quite healthy.

    Personally, I don't see Intel, AMD, or NVIDIA dying any time soon. Note, AMD came close this last year, but they have had several near-death experiences over the years. (I worked there for several years...)

    Intel, AMD, and NVIDIA's fundamental business is turning sand into money. That was a famous quote from Jerry Sanders, the founder of AMD. I'm paraphrasing, but it was long the idea at AMD that it didn't matter what came out of the fabs as long as the fabs were busy. Even though AMD and NVIDIA no longer own fabs, this is still their business model (more or less).

    I think it's interesting how a couple of posters have talked about ARM - remember, AMD and NVIDIA can jump on the ARM bandwagon at any time, and Intel is already an ARM licensee. Like AMD, they are in the business of turning sand into money - they can and will change their manufacturing mix to maintain profitability.

    I don't see the GPU going away either. GPUs are freakishly good at what they do - and by good, I mean better than anything else. Intel flubbed it badly with Larrabee. A general-purpose core simply isn't going to do what very carefully designed silicon can do. This has been proven time and time again.

    Domain-specific silicon will always be cheaper, better-performing, and more power-efficient in most areas than a general-purpose gizmo. Note, this doesn't mean I dislike general-purpose gizmos (like processors) - I believe that the best system designs have a mix of both, suited to the purpose at hand.

    -Foredecker

    • Re: (Score:2, Funny)

      by Anonymous Coward

      we [Microsoft] are still quite healthy

      Oh noes! Who let him post on slashdot?!

    • by Twinbee ( 767046 )

      Although Larrabee may not be as cost/energy/speed efficient as a dedicated GPU, perhaps it will be almost as fast but easier to program for, thus ensuring its popularity...

    • NVIDIA's already got an ARM platform: Tegra.

  • The article is about the GPU business, which, IMO, is the less interesting aspect of these companies. The new hotness is all in the mobile space. You want interesting? Read about the upcoming Cortex-A9-based battle going on for 'superphones', smartbooks, and tablets.

    People have been predicting the death of the desktop computer for a long time, but the problem is that there hasn't been anything realistic to replace it. We're within sight of that replacement. Once everyone has a superphone that can do the bul

    • Mobile phones don't have the shelf life required to replace the desktop. I like having my PC at home simply because it cannot easily break like a phone can. With my home PC, I don't worry about a careless worker hitting me in the holster with a door and shattering the touchscreen, and I don't worry about someone crushing it when they carelessly sit on the table that my PC rests on. Everyone I know with a smart phone replaces it every couple of years - but I have desktops that last 5 years before I move it t
      • Mobile phones don't have the shelf life required to replace the desktop.

        This would be part of the ecosystem point I mentioned.

        If you dock your mobile phone to access the 'big' data (videos, music, etc.), and other stuff is in the cloud, then replacing the phone itself isn't going to be a big deal, and with cell providers going the whole 'go with us for 2 years and get your phone for half price' thing, that will incentivize people to upgrade their shit more often, which, from a web developer's perspective, I f

        • True, but cloud computing has a serious obstacle to overcome in this area - the ISP. Every ISP is working on a way to charge per gig per month right now (I know, I work for one that has actually been doing it in the central US for over 3 years). ISPs are one of the greatest threats to tech like this, simply because they want to find a way to make more money off providing the same service, which is slowly choking customers out of wanting to use the internet.
    • Desktops will always be much more powerful than phones, and there will always be ways for the "bulk of people" to utilize that extra power.

      Why on Earth won't the future be like now in this sense: small mobile computers (smartphones, netbooks, tablet PCs) that you can lose or break reasonably easily, and bulkier boxes sitting at home that never go anywhere (not exactly a huge drain on the space inside people's homes... where's the push to plug a tiny phone into your huge monitor)?

      And in the middle laptops.

      And w

    • by SEE ( 7681 )

      the ecosystem to support such a change is the hard part. Can you get access to all your data, etc., from your docked mobile? That's gonna be the key.

        And it's going to be why it doesn't happen for a LOOOOONG time. In the US, wireless gives you a choice of AT&T, Verizon, Sprint, or T-Mobile. Wired gives you a choice of your phone company or cable company. None beats having your data local, and as long as you're keeping a machine around with your data on it, why not pay ~$50 more so it can drive your keyboard, mouse, and monitor without plugging in your mobile?

  • Batteries limit mobile devices. Heat limits supercomputer servers.
    The first low-power CPUs, like the Atom, were lame. Better devices are on the way.
    • by Nadaka ( 224565 )

      Since when was the Atom the first low-power device? ARM came earlier, uses less power, and has much better performance per watt.

  • Consoles come out with 1080p/DX11-class graphics. Graphics cards for the PC try to offer 2560x1600+ and whatnot, but the returns are extremely diminishing, and many current PC games will come to a console with keyboard and mouse. GPGPU remains a small niche for supercomputers and won't carry the cost without mass-market gaming cards. The volume is increasingly laptops with CPU+GPU in one package, and there'll be Intel CPUs with Intel GPUs using an Intel chipset on Intel motherboards - and they'll be a serious

  • by Animats ( 122034 ) on Monday March 01, 2010 @04:09PM (#31321044) Homepage

    At some point, the GPU goes on the CPU chip, and gets faster as a result.

    Maybe.

    GPUs need enormous bandwidth to memory, and can usefully use several different types of memory with separate data paths. The frame buffer, texture memory, geometry memory, and program memory are all being accessed by different parts of the GPU. Making all that traffic go through the CPU's path to memory, which is already the bottleneck with current CPUs, doesn't help performance.

    A single chip solution improves CPU to GPU bandwidth, but that's not usually the worst bottleneck.

    What actually determines the solution turns out to be issues like how many pins you can effectively have on a chip.

    • At some point, the GPU goes on the CPU chip

      Actually, that point was back in January when Intel started shipping the Clarkdale Core i3 and i5 processors [anandtech.com]

      Clarkdale uses a dual-core Westmere and sticks it next to a 45nm Intel GMA die. That's right, meet the first (er, second) Intel CPU with on-chip graphics. Next year we'll see Sandy Bridge bring the graphics on-die, but until then we have Intel's tried and true multi-chip-package to tide us over.

  • by janwedekind ( 778872 ) on Monday March 01, 2010 @04:09PM (#31321046) Homepage

    I am still waiting for OpenCL to get traction. All this CUDA and StreamSDK stuff is tied to a particular company's hardware. I think there is a need for a free software implementation of OpenCL with different backends (NVidia GPU, AMD GPU, x86 CPU). Software developers will have great difficulty supporting GPUs as long as there is no hardware-independent standard.
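
    (For illustration only, a minimal sketch of the hardware independence being asked for, assuming the pyopencl bindings and numpy are installed: the same kernel source builds and runs on whichever OpenCL backend the installed driver stack exposes, whether that is an NVIDIA GPU, an AMD GPU, or a CPU.)

    import numpy as np
    import pyopencl as cl

    # Vendor-neutral kernel source: nothing here is tied to CUDA or Stream.
    KERNEL_SRC = """
    __kernel void scale(__global const float *in, __global float *out, float k) {
        int i = get_global_id(0);
        out[i] = k * in[i];
    }
    """

    ctx = cl.create_some_context()        # picks any available platform/device
    queue = cl.CommandQueue(ctx)
    prog = cl.Program(ctx, KERNEL_SRC).build()

    host_in = np.arange(16, dtype=np.float32)
    mf = cl.mem_flags
    dev_in = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=host_in)
    dev_out = cl.Buffer(ctx, mf.WRITE_ONLY, host_in.nbytes)

    prog.scale(queue, host_in.shape, None, dev_in, dev_out, np.float32(2.0))

    host_out = np.empty_like(host_in)
    cl.enqueue_copy(queue, host_out, dev_out)
    print(host_out)                       # doubled values, regardless of vendor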

    • by Xrikcus ( 207545 )

      Why do you need a free *implementation*? The whole point is that OpenCL is a free standard, which is at the heart of AMD's Stream SDK (with AMD GPU and CPU backends) and also included in NVIDIA's SDK. Given that OpenCL has an ICD interface so that you can use any of the backends from any OpenCL-supporting software, I'm not sure what you're after that isn't already there, other than, arguably, current market penetration.
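
      (A small illustrative snippet, again assuming pyopencl: the ICD loader presents every installed vendor implementation as a platform, so one program can enumerate NVIDIA, AMD GPU, and CPU backends at run time and pick whichever it wants.)

      import pyopencl as cl

      # Each installed ICD (NVIDIA, AMD, etc.) shows up as one platform.
      for platform in cl.get_platforms():
          for device in platform.get_devices():
              print(platform.name, "->", device.name)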

  • by buddyglass ( 925859 ) on Monday March 01, 2010 @05:07PM (#31322022)

    What I find interesting is the overall lack of game-changing progress when it comes to non-3D- or HD-video-related tasks. In March 2000, i.e. ten years ago, a top-of-the-line CPU would have been a Pentium III Coppermine, topping out around 1 GHz. I could put Windows XP on one of those (with enough RAM) and do most office / browsing tasks about as fast as I could with today's top-of-the-line CPU. Heck, it would probably handle Win7 okay. Contrast the period 2000-2010 with the period 1990-2000. In 1990 you would have been looking at a 25 MHz 486DX.

    • Re: (Score:2, Interesting)

      by yuhong ( 1378501 )

      In 1990 you would have been looking at a 25 MHz 486DX.

      Which is the minimum for most x86 OSes nowadays. In fact, some newer x86 OSes and software have even higher requirements. Windows XP and SQL Server 7.0 and later, for example, require the CMPXCHG8B instruction, and Flash 8 and later require MMX.
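
      (A rough, Linux-only sketch of checking for those two features; it assumes /proc/cpuinfo is available, where the kernel reports CMPXCHG8B as the "cx8" flag and MMX as "mmx".)

      def cpu_flags():
          # Parse the feature-flag list the Linux kernel exposes per CPU.
          with open("/proc/cpuinfo") as f:
              for line in f:
                  if line.startswith("flags"):
                      return set(line.split(":", 1)[1].split())
          return set()

      flags = cpu_flags()
      print("CMPXCHG8B (Windows XP, SQL Server 7.0+):", "cx8" in flags)
      print("MMX (Flash 8+):", "mmx" in flags)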

    • Re: (Score:3, Insightful)

      by rev_sanchez ( 691443 )
      We do seem to be in a period of diminishing returns with top-of-the-line consumer PC hardware. Arguably we're at a point where it's difficult to add more performance to a single core, and the benchmarks I've seen suggest that we're getting to a point where adding more cores isn't helping that much for most consumer PC use.

      The biggest challenges we have today are getting more processing performance from less electricity because we're running more things on batteries and quiet computers for the ho
    • by Kjella ( 173770 ) on Monday March 01, 2010 @07:48PM (#31324292) Homepage

      I could put Windows XP on one of those (with enough RAM) and do most office / browsing tasks about as fast as I could with today's top-of-the-line CPU.

      It's wetware-limited; it doesn't matter how much hardware or software you throw at it. We can spend two minutes reading a page and then expect the computer to render a new one in 0.2 seconds; in practice it will never go faster than that. I don't know why it's become such a myth that we'll always find new uses for computing power. A few specialized tasks now and then, perhaps, but in general? No, people will chat and email and listen to music and do utterly non-intensive things that go from taking 10% to 1% to 0.1% to 0.01% of your CPU.

      Contrast the period 2000-2010 with the period 1990-2000. In 1990 you would have been looking at a 25 MHz 486DX.

      Yes, computers are starting to return to the normal world from Moore's bizarro-universe where unbounded exponential growth is possible. After decades of conditioning, you become oblivious to how crazy it is to expect something twice as fast for half the price every 18 months (or whichever bastardization of the law you choose to use). Eventually a ten-year-old computer will be like a ten-year-old car: sure, they've polished the design a little, but it's basically the same. And that is normal; it's we who live in abnormal times, where computers have improved by several orders of magnitude.
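
      (A toy calculation of the compounding being described, under the idealized assumption of one doubling every 18 months; it illustrates the expectation, it is not a measurement.)

      # Idealized Moore's-law scaling: performance doubles every 18 months.
      months = 10 * 12
      doublings = months / 18.0
      print("Expected speedup after ten years: ~%.0fx" % 2 ** doublings)  # ~102x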

      • I think it's true that my Mom will never have use for a computer faster than what she's using today. But I find that I am still waiting on computers on a regular basis. The real bottleneck for me is multitasking. I get into several projects at once, with perhaps ten applications and 50 browser tabs open, and then the computer starts choking. Some of the problem may have more to do with available memory or with the operating system's code quality, but the point is that computers have not yet reached that point

  • I have only one thing to say about NVIDIA: nvlddmkm. That says it all. I won't ever buy NVIDIA again.
