AMD, Intel, and NVIDIA Over the Next 10 Years 213

Posted by ScuttleMonkey
from the what-next dept.
GhostX9 writes "Alan Dang from Tom's Hardware has just written a speculative op-ed on the future of AMD, Intel, and NVIDIA in the next decade. They talk about the strengths of AMD's combined GPU and CPU teams, Intel's experience with VLIW architectures, and NVIDIA's software lead in the GPU computing world." What do you think it will take to stay on top over the next ten years? Or, will we have a newcomer that usurps the throne and puts everyone else out of business?
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by newdsfornerds (899401) on Monday March 01, 2010 @02:08PM (#31320134) Journal
    I predict wrong predictions.
  • by MobileTatsu-NJG (946591) on Monday March 01, 2010 @02:15PM (#31320210)

    With greater personal power, we won't have Microsoft dictating what 3D features we can have. With individuals become supercomputers, these three companies will be out of business. However, personal survivability and power will be sufficient that former employees will be fine.

    What?

  • ARM (Score:3, Insightful)

    by buruonbrails (1247370) on Monday March 01, 2010 @02:17PM (#31320224) Homepage
    All three will be marginalized by the ARM onslaught. Within 10 years, the smartphone will be the personal computing device, while AMD and Intel processors power the cloud.
  • Re:ARM (Score:3, Insightful)

    by ShadowRangerRIT (1301549) on Monday March 01, 2010 @02:23PM (#31320350)
    I really, really hope you're wrong. Forced to choose between a smartphone and nothing at all, I'd likely go with nothing. Which would be professionally problematic, since I code for a living.
  • by Pojut (1027544) on Monday March 01, 2010 @02:34PM (#31320518) Homepage

    I don't even understand why people do this in an official capacity. I mean, I know they have to for shareholders or business planning purposes or whatever, but these sorts of things are almost always wrong.

    Are they just doing it for the lulz?

  • by Foredecker (161844) * on Monday March 01, 2010 @02:36PM (#31320548) Homepage Journal

    You can always spot a sensationalist post when part of it predicts or asks who will go out of business. Or what thing will disappear.

    For example, in his post, ScuttleMonkey [slashdot.org] asks this:

    ...Or, will we have a newcomer that usurps the throne and puts everyone else out of business?

    Note, the post is a good one - I'm not being critical. But change in the tech industry rarely results in big companies going out of business - and if they do, it takes a long time. I think Sun is the canonical example here. It took a long time for them to die - even after many, many missteps. Sun faded away not because of competition or some game-changing technology, but simply because they made bad (or some would say awful) decisions. Same for Transmeta.

    People have been predicting the death of this or that forever. As you might imagine, my favorite one is predicting Microsoft's death. That's been going on for a long, long time. Last I checked, we are still quite healthy.

    Personally, I don't see Intel, AMD, or NVIDIA dying any time soon. Note, AMD came close this last year, but they have had several near-death experiences over the years. (I worked there for several years...).

    Intel, AMD, and NVIDIA's fundamental business is turning sand into money. That was a famous quote by Jerry Sanders, the founder of AMD. I'm paraphrasing, but it was long the idea at AMD that it didn't matter what came out of the fabs as long as the fabs were busy. Even though AMD and NVIDIA no longer own fabs, this is still their business model (more or less).

    I think it's interesting how a couple of posters have talked about ARM - remember, AMD and NVIDIA can jump on the ARM bandwagon at any time. Intel is already an ARM licensee. Like AMD, they are in the business of turning sand into money - they can and will change their manufacturing mix to maintain profitability.

    I don't see the GPU going away either. GPUs are freakishly good at what they do. By good - I mean better than anything else. Intel flubbed it badly with Larrabee. A general-purpose core simply isn't going to do what very carefully designed silicon can do. This has been proven time and time again.

    Domain-specific silicon will always be cheaper, better performing, and more power efficient in most areas than a general-purpose gizmo. Note, this doesn't mean I dislike general-purpose gizmos (like processors) - I believe that the best system designs have a mix of both, suited to the purpose at hand.

    -Foredecker

  • Re:ARM (Score:4, Insightful)

    by h4rr4r (612664) on Monday March 01, 2010 @02:37PM (#31320560)

    So you could get an ARM laptop or an x86 workstation. For work use, thin clients will be popular again soon, and many people will use a smartphone, hooked to their TV for display when at home, instead of a home computer.

    Then the cycle will restart. Welcome to the wheel of computing.

  • by lorenlal (164133) on Monday March 01, 2010 @02:44PM (#31320644)

    I think it's because they're being paid to.

  • by maxume (22995) on Monday March 01, 2010 @02:49PM (#31320712)

    If he is getting paid well, he doesn't give two shits what kind of crap you are doing.

  • by Grishnakh (216268) on Monday March 01, 2010 @02:54PM (#31320818)

    I disagree. Floating-point coprocessors basically just added some FP instructions to a regular single-threaded CPU. There was no parallelism; they just removed the need to do slow floating-point calculations using integer math.

    However, GPUs, while they mainly do floating-point calculations, are essentially vector processors, and do calculations in parallel. They can easily benefit from increased size and parallelism: the more parallel processing capability a GPU has, the more realistic it can make graphical applications (i.e. games). And with all the GPGPU applications coming about (where you use GPUs to perform general-purpose (i.e., not graphics) calculations), there's no end to the amount of parallel computational power that can be used. The only limits are cost and energy.

    So if someone tried to fold the GPU into the processor, just how much capability would they put there? And what if it's not enough? Intel has already tried to do this, and it hasn't killed the GPU at all. Not everyone plays bleeding-edge 3D games; a lot of people just want a low-powered computer for surfing the web, and maybe looking at Google Earth. An Intel CPU with a built-in low-power GPU works fine for that, but it won't be very useful for playing Crysis unless you think 5 fps is good. People who want to play photo-realistic games, however, are going to want more power than that. And oil exploration companies and protein-folding researchers are going to want even more.

    GPUs aren't going anywhere, any time soon. Lots of systems already have eliminated them in favor of integrated solutions, but these aren't systems you're going to play the latest games on. For those markets, NVIDIA is still doing just fine.
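    The data-parallel point above can be sketched in a few lines of plain Python. The key property is that each output element depends only on its own inputs, so a GPU can compute all of them at once across thousands of lanes. This is an illustrative CPU sketch of the execution model, not actual GPU code; the function names are made up for the example.

    ```python
    # Illustrative sketch of the data-parallel model GPUs exploit: the same
    # SAXPY operation (y = a*x + y) applied independently to every element.

    def saxpy_scalar(a, x, y):
        """Sequential version: one element at a time, like a plain CPU loop."""
        return [a * xi + yi for xi, yi in zip(x, y)]

    def saxpy_parallel(a, x, y):
        """Same math expressed as an independent per-element task. Because no
        element depends on any other, a GPU could run every 'work item' at once."""
        def work_item(i):
            return a * x[i] + y[i]
        return [work_item(i) for i in range(len(x))]

    x = [1.0, 2.0, 3.0, 4.0]
    y = [10.0, 20.0, 30.0, 40.0]
    assert saxpy_scalar(2.0, x, y) == saxpy_parallel(2.0, x, y) == [12.0, 24.0, 36.0, 48.0]
    ```

    The per-element formulation is exactly why "more parallel capability" translates directly into "more throughput" for this class of problem, with no change to the algorithm.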

  • by janwedekind (778872) on Monday March 01, 2010 @03:09PM (#31321046) Homepage

    I am still waiting for OpenCL to get traction. All this CUDA and StreamSDK stuff is tied to a particular company's hardware. I think there is a need for a free software implementation of OpenCL with different backends (NVIDIA GPU, AMD GPU, x86 CPU). Software developers will have great difficulty supporting GPUs as long as there is no hardware-independent standard.
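    The execution model the parent is talking about can be mimicked in pure Python: one kernel body is launched once per "work item", indexed by a global id. Real OpenCL compiles a C-like kernel string through a vendor driver; the point of the standard is that the same kernel source can target an NVIDIA GPU, an AMD GPU, or an x86 CPU. The names below are illustrative, not part of any real API.

    ```python
    # Pure-Python sketch of the OpenCL execution model: a "command queue"
    # launches one kernel invocation per work item.

    def run_kernel(kernel, global_size, *buffers):
        """Toy command queue: invoke the kernel once per global id.
        A real OpenCL runtime would dispatch these in parallel on a device."""
        for gid in range(global_size):
            kernel(gid, *buffers)

    def vec_add(gid, a, b, out):
        """Rough analogue of an OpenCL C kernel:
        __kernel void vec_add(...) { int i = get_global_id(0); out[i] = a[i] + b[i]; }"""
        out[gid] = a[gid] + b[gid]

    a = [1, 2, 3]
    b = [10, 20, 30]
    out = [0, 0, 0]
    run_kernel(vec_add, len(a), a, b, out)
    assert out == [11, 22, 33]
    ```

    Because the kernel never names a device, the same source could be queued to any backend - which is the hardware independence the comment is asking for.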

  • by Foredecker (161844) * on Monday March 01, 2010 @03:41PM (#31321586) Homepage Journal
    Yes - it takes about two years (or more) to go from a whiteboard to first silicon. Before I worked at Microsoft, I worked at hardware and silicon companies. But remember, the competition to Intel, AMD, and NVIDIA will be other silicon companies - not software companies. The new competition will have the same constraints. This is also a small industry - it's very difficult to do something both major and new in secret. When I was at AMD, we knew about Transmeta's plans when they were still in stealth mode. It wasn't because of anything nefarious - the community is small and leaky. -Foredecker
  • by rev_sanchez (691443) on Monday March 01, 2010 @05:36PM (#31323430)
    We do seem to be in a period of diminishing returns with top-of-the-line consumer PC hardware. Arguably we're at a point where it's difficult to add more performance to a single core, and the benchmarks I've seen suggest that adding more cores isn't helping that much for most consumer PC use.

    The biggest challenges today are getting more processing performance from less electricity, because we're running more things on batteries, and building quiet computers for the home theater (which tends to mean fanless, which means less heat, which means less electricity). I don't see that going away. The prime motivator for high-end PC hardware is high-quality gaming, and that is a shrinking market as publishers focus on console development because of piracy.
  • by washu_k (1628007) on Monday March 01, 2010 @07:19PM (#31324628)
    What are you talking about?

    The current version of OSX can run apps as far back as 2001. Apple does not provide any official way of running apps older than that.

    The current 64 bit versions of Windows can run apps as far back as 1993. Microsoft provides an official way of running even older apps (XP Mode). 32 bit Windows can often run 16 bit apps without the emulator.

    Microsoft has lots to fault them for, but their record on backwards compatibility is WAY better than Apple's.
  • by PopeRatzo (965947) * on Monday March 01, 2010 @07:43PM (#31324810) Homepage Journal

    I predict wrong predictions.

    Not only wrong predictions, but predictions based on a completely faulty notion.

    From the summary:

    What do you think it will take to stay on top over the next ten years? Or, will we have a newcomer that usurps the throne and puts everyone else out of business?

    Do you get that? It's no longer enough for a company to innovate, to produce a quality product, and to make a profit. They have to be "on top". They have to kill the competition, to put everyone else out of business. Welcome to Capitalism, 2010.

    This might be why some people see this era as being the end-game of "free-market" capitalism. Because now the only way to produce is to destroy. Because it's not enough to succeed, but others have to fail. What good is being rich unless there are poor people to which you can compare your success? After all, if everyone's standard of living goes up, who's going to clean my fucking house?

    There was a time, in my lifetime (and I'm not that old), when a company, let's say an electronics manufacturing company, could sell some stock and use the proceeds to fund the building of a new plant, the purchase of new equipment, the hiring of new employees. The family that owns the company sees their success in terms of this growing and profitable concern. A "healthy" profit on investment for such a company could be as little as 8 percent (and this was a time when you could get 5 percent for a savings account). The people who work for this company like it so much, have done so well as employees, that entire extended families go to work for the company, generation after generation.

    I watched this entire cycle occur right here in my home town to a company that made industrial lighting (like the kind you'd see at a major league ballpark during a night game). Now, the company is gone. Swallowed by a company that was swallowed by a company that was swallowed by a foreign company that lost contracts to a company in Europe. There's a trail of human loss all along the way.

    The theory of markets and business that sees the killing off of companies as a preferred outcome will always end up badly.

  • by hairyfeet (841228) <bassbeast1968@nOSpam.gmail.com> on Monday March 01, 2010 @09:36PM (#31325640) Journal

    For me and my customers it isn't so much about the raw horsepower, it is the "bang for the buck" that is just getting insane lately. I am sitting here running an AMD 925 quad with 8MB of total cache, 8GB of DDR2 RAM, a 4650 with 1GB of RAM for buffering, and nearly a TB of HDD space, all for less than $700 with Windows 7 HP x64. That is just nuts! Hell, even my 67-year-old dad has a quad now. I told him at those prices he might as well have the room to grow. With the onboard Radeon GPU his widescreen is smooth as butter for watching videos, and no matter how much he does it never slows down. That's just crazy power for that little $$$!

    Now as for the article, I have to call bullshit on a couple of points. One: Nvidia and CUDA. Yes, Nvidia owns the GP-GPU market with CUDA, but is that really enough to sustain the entire company and pay for R&D? From what I've read, Fermi is gonna be a GP-GPU first that is only half-assed for games, is gonna crank out more heat than a P4, and is gonna cost a mint to boot. With the Radeon 4xxx and 5xxx getting so cheap, and the new onboards by both AMD and Intel being more than enough for your average Joe, how long can Nvidia keep this up? Intel has locked them out of the new socket chipsets, and AMD doesn't need their chipsets either, so that is another market lost to them. Sure, Tegra might help a little, but I don't see it making up the numbers they had during the 6xxx and 7xxx series. My prediction is Nvidia is gonna be hurting bad, and may get out of the domestic GPU market altogether, becoming a "Hollywood" company like SGI was back in the day.

    Second: Nvidia as an x86 manufacturer. Sure, we have heard that rumor for years, but unless I'm mistaken they don't have a license for x86 CPUs, do they? AMD and Intel have pretty much all of x86 locked up behind patents and copyrights, which is why the Chinese went MIPS. So unless they buy out Via, I would call this one wishful thinking. TFA tries to claim that they might try to bring Crusoe back from the dead, but there was a reason Transmeta went tits up: their chips sucked. It might work for cell phones; for everything else it would make Atom look like a racehorse. And while battery life is important, most folks aren't gonna put up with really shitty performance just to squeeze a little more juice out.

    So my personal predictions are thus: AMD will stay in second place, rule the low end with "bang for the buck", and kick ass in the GPU market thanks to Eyefinity and their low prices on the nice Radeons. Intel GPUs will continue to suck, but their video hardware acceleration will be good enough that the average Joe won't care. Nvidia will own GP-GPU, but Fermi will be a flop because of heat vs. performance in the consumer gaming market, which will cause them to ultimately sell off Tegra for cash, or more likely get out of consumer hardware to focus on GP-GPU and mobile.

    But even TFA notes that Nvidia has just been rehashing the G92 core since 2007, and with only Fermi coming down the pipe I predict times will be tough for Nvidia for the next couple of years regardless. Everyone else has full top-to-bottom CPU+GPU solutions, and Nvidia has been left the odd man out. It looks like AMD wasn't so stupid for buying ATI after all, especially if Bulldozer turns out to be a good mobile chip.

  • Re:VLIW (Score:3, Insightful)

    by TheRaven64 (641858) on Tuesday March 02, 2010 @08:25AM (#31328818) Journal
    Sun's VLIW architecture (MAJC) was more interesting than Intel's. The point of VLIW is the same as that of RISC; take more stuff that isn't directly connected to executing instructions off the CPU and make the compiler do it. EPIC missed the point and tried to do VLIW + a load of extra stuff on the chip. Sun did proper VLIW and took it to its logical conclusion with a JIT doing absolutely everything (branch prediction, instruction scheduling, even dynamic partitioning for threads). Unfortunately, it came from an era when Sun was still in the Everything Should Use Java mindset. Something like MAJC with something like LLVM could be insanely fast, but LLVM is still a few years away from being ready for that kind of use and no one is developing successors to MAJC so it probably won't happen for a long time, if at all.
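    The core VLIW idea in the comment above - that the compiler (or a JIT, in MAJC's case) rather than the hardware decides which operations execute together - can be sketched as a toy scheduler. The bundler below is a deliberately naive greedy pass, not Sun's or Intel's actual algorithm: an operation joins the current "long instruction word" only if it doesn't read a register written earlier in that same bundle.

    ```python
    # Toy sketch of VLIW instruction bundling: pack independent operations
    # into fixed-width bundles that the hardware can issue in one cycle,
    # pushing all dependence analysis to compile time.

    def bundle(ops, width=3):
        """ops: list of (dest_reg, src_regs) tuples in program order.
        Returns a list of bundles; each bundle's ops are mutually independent."""
        bundles, current, written = [], [], set()
        for dest, srcs in ops:
            depends = any(s in written for s in srcs)
            if depends or len(current) == width:
                # Dependence (or a full word) forces a new bundle.
                bundles.append(current)
                current, written = [], set()
            current.append((dest, srcs))
            written.add(dest)
        if current:
            bundles.append(current)
        return bundles

    # r1 and r2 both read only r0, so they share a bundle;
    # r3 reads r1, so it must wait for the next bundle.
    ops = [("r1", ("r0",)), ("r2", ("r0",)), ("r3", ("r1",))]
    assert len(bundle(ops)) == 2
    ```

    A JIT-based design like MAJC's runs this kind of pass at load time with knowledge of the actual chip's issue width, which is why the comment argues an LLVM-style toolchain could make the approach fast.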
