AMD, Intel, and NVIDIA Over the Next 10 Years
GhostX9 writes "Alan Dang from Tom's Hardware has just written a speculative op-ed on the future of AMD, Intel, and NVIDIA in the next decade. The piece discusses the strengths of AMD's combined GPU and CPU teams, Intel's experience with VLIW architectures, and NVIDIA's software lead in the GPU computing world." What do you think it will take to stay on top over the next ten years? Or, will we have a newcomer that usurps the throne and puts everyone else out of business?
YAY! More Prognostication! (Score:5, Insightful)
Re: (Score:3, Insightful)
I don't even understand why people do this in an official capacity. I mean, I know they have to for shareholders or business planning purposes or whatever, but these sorts of things are almost always wrong.
Are they just doing it for the lulz?
Re:YAY! More Prognostication! (Score:4, Insightful)
I think it's because they're being paid to.
Re:YAY! More Prognostication! (Score:5, Informative)
I believe Moore's Law, the observation that the number of transistors on an integrated circuit doubles roughly every two years, and the continual increase in CPU and GPU processing power have become a solution looking for a problem. What we need is power-efficient processors that have enough processing capacity to do what we need and nothing more. Unless you are a gamer or doing some serious GPGPU calculations in CUDA or OpenCL, what on earth is the need for a graphics card like the NVIDIA GeForce GT 340, with around 380 GFLOPs of floating-point throughput? It's ridiculous.
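For context, that headline number is just shader count times clock times operations per clock. A rough sketch of the arithmetic, assuming the commonly quoted GT 340 figures (roughly 96 shaders near 1.34 GHz, counted at 3 FLOPs per shader per clock the way NVIDIA did at the time; treat all of these as assumptions):

    #include <stdio.h>

    int main(void) {
        /* Assumed specs, not official: ~96 shader cores at ~1.34 GHz,
           counted as 3 FLOPs per shader per clock (MAD + MUL). */
        const double shaders = 96.0;
        const double clock_ghz = 1.34;
        const double flops_per_clock = 3.0;

        double peak_gflops = shaders * clock_ghz * flops_per_clock;
        printf("Theoretical peak: ~%.0f GFLOPs\n", peak_gflops);  /* ~386 */
        return 0;
    }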
Re: (Score:2, Informative)
You're right; besides video games, servers, scientific research [stanford.edu], movies, increased productivity, and probably a dozen things I haven't thought of, what do we need more processing power for?
Of course efficiency is good. Computers have been becoming more efficient since day one.
There is a place for tiny, low-power ARM chips and for 150-watt, 8-core server chips.
Re: (Score:3, Insightful)
Re: (Score:2)
ARM processors have taken over pretty much all the mobile and a lot of the netbook space.
Where can I buy those mythical ARM netbooks I keep hearing about? The only thing I've seen is that detachable tablet/netbook thing I forgot the name of and I don't even know if it was just a prototype or if they are selling them. So far ARM has no presence in the netbook market.
Re: (Score:3, Informative)
I predict they are seriously mistaken in forgetting about ARM processors in their analysis. ARM processors have taken over pretty much all the mobile and a lot of the netbook space.
Mobile I'll give you, but netbook?
When's the last time you saw an ARM netbook? If you've ever seen one? Sure, you read articles on Slashdot saying that they'll be here Any Day Now! But they aren't. I think there was one model in Fry's for a short while-- that's about it. Unless you count the numerous vaporware-spouting websites.
An
Re:YAY! More Prognostication! (Score:4, Insightful)
Not only wrong predictions, but predictions based on a completely faulty notion.
From the summary:
"What do you think it will take to stay on top over the next ten years? Or, will we have a newcomer that usurps the throne and puts everyone else out of business?"
Do you get that? It's no longer enough for a company to innovate, to produce a quality product, and to make a profit. They have to be "on top". They have to kill the competition, to put everyone else out of business. Welcome to Capitalism, 2010.
This might be why some people see this era as being the end-game of "free-market" capitalism. Because now the only way to produce is to destroy. Because it's not enough to succeed, but others have to fail. What good is being rich unless there are poor people to which you can compare your success? After all, if everyone's standard of living goes up, who's going to clean my fucking house?
There was a time, in my lifetime (and I'm not that old) when a company, let's say an electronics manufacturing company, could sell some stock and use the proceeds to fund the building of a new plant, the purchase of new equipment, the hiring of new employees. The family that owns the company sees their success in terms of this growing and profitable concern. A "healthy" profit on investment for such a company could be as little as 8 percent (and this was a time when you could get 5 percent for a savings account). The people who work for this company like it so much, have done so well as employees, that entire extended families go to work for the company, generation after generation. I watched this entire cycle occur right here in my home town to a company that made industrial lighting (like the kind you'd see at a major league ballpark during a night game). Now, the company is gone. Swallowed by a company that was swallowed by a company that was swallowed by a foreign company that lost contracts to a company in Europe. There's a trail of human loss all along the way.
The theory of markets and business that sees the killing off of companies as a preferred outcome will always end up badly.
My post is my +mod (Score:2)
Word.
ARM (Score:3, Insightful)
Re: (Score:3, Insightful)
Re:ARM (Score:4, Insightful)
So you could get an ARM laptop or an x86 workstation. For work use, thin clients will be popular again soon, and many people will use a smartphone, hooked to their TV for display when at home, instead of a home computer.
Then the cycle will restart. Welcome to the wheel of computing.
Re:ARM (Score:5, Interesting)
And why would they bother with that, when they can simply have a separate computer at home instead of having to worry about dropping theirs and losing everything?
PCs aren't going anywhere, and the idea that they'll be replaced by smartphones is utterly ridiculous. Despite the giant increases in computing abilities, and the ability to fit so much processing power into the palm of your hand, even mainframe computers are still with us; their capabilities have simply increased just like everything else. Why limit yourself to the processing ability that can fit into your hand, if you can have a desktop-size computer instead, in which you can fit far more CPU power and storage? Today's smartphones can do far more than the PCs of the 80s, but we still have PCs; they just do a lot more than they used to.
Of course, someone will probably reply saying we won't need all the capability that a PC-sized system in 20 years will have. That sounds just like the guy in the 60s who said no one would want to have a computer in their home. A PC in 2030 will have the power of a supercomputer today, but by then we'll be doing things with them that we can't imagine right now, and we'll actually have a need for all that power.
Re: (Score:2)
I dunno, I can imagine a 3D gaming tapestry taking up half my wall, and I can imagine redundant computing boxes in the house for my kids (whom I haven't had yet, and I must presume will live at home in 2030) and their half-wall gaming/TV tapestries. And my girlfriend will probably want something to browse her recipes and listen to music on - not to stereotype, mind you; her computer can do a lot more than that now, she just seems to only care about recipes and listening to music.
A computer that costs a bil
Re: (Score:2)
Re: (Score:2)
Exactly. And if you have a full-fledged desktop computer, instead of a smartphone with a docking station, you won't be completely screwed when you accidentally drop your phone in the toilet.
PaaT (Score:3, Interesting)
The best solution would not be to run apps on the phone at all. It would be to get always-on bandwidth from a PC at home to your phone, fast enough to do remote desktop at a speed where you couldn't tell that you were working remotely. Once we have that kind of bandwidth, the phones are basically done. The phone as a terminal. With this configuration, you get:
* Massive upgradeability on the phone since to make your phone faster, you just upgrade the PC in your home.
* Far
Re: (Score:2)
Far greater battery life, as once the phone is a good terminal, adding more processing power to the PC will add power, but since that part is plugged into the wall, it won't drain your battery at all.
I think you got this the wrong way around - powering the radio in your phone uses a _lot_ of battery power - far more than the CPU. If you're going to use the phone as a thin client all the time then you're going to be keeping that radio powered up all the time, and your battery won't last long at all.
You can access the same application from a desktop, TV, or the phone, and there is no reason the interface cannot change for each.
If you're going to have a totally different UI for each platform (and you will need that - a desktop UI is completely unsuited to a small-screen device), you may as well be using a different application on eac
Re: (Score:2)
I think you got this the wrong way around - powering the radio in your phone uses a _lot_ of battery power - far more than the CPU. If you're going to use the phone as a thin client all the time then you're going to be keeping that radio powered up all the time, and your battery won't last long at all.
The radio is always on anyway.
If you're going to have a totally different UI for each platform (and you will need that - a desktop UI is completely unsuited to a small-screen device)
There is no reason that a single PC cannot serve a dimension-appropriate UI from a single back end, so complaining about the PC's UI on a phone is a red herring derived from a lack of imagination. In fact, I have a PC here that happily serves up an Android UI, and a Mac that happily serves up an iPhone UI.
you may as well be using a different application on each platform anyway.
It isn't 1983 any more. Writing the same code for a half dozen different platforms is a waste of resources and money.
I think you underestimate the cost of mobile bandwidth.
If you read my post, you would have seen that getting the
Re: (Score:2)
I appreciate your thinking this through, but I think you're way off base.
My prediction: your phone will BE your computer. You'll stick it in your pocket, then when you get to work, you'll drop it in the monitor-connected dock and fire up your bluetooth (or whatever) input devices. At the end of the day, you'll pull it out of there, stick it back in your pocket. When you get home, same thing, except dock could be networked to your TV as well.
Simple. Gives you everything in one place, and eliminates need for
Re: (Score:2)
Of course we will; Microsoft will still be producing operating systems around Windows architecture, so we will need the usual CPU-hogging Windows+antispyware+antivirus stack, and IE6 will still be in heavy use in the enterprise. ;)
Re: (Score:2)
For some a current netbook does everything they need.
For some, but not very many. Most people I imagine have them in addition to their other computing devices, and mainly use the netbooks when they're away from the home or office because it's small and easily portable. The problem with the netbook is that the screen and keyboard are much too small for anything besides web surfing (and even that's not that great because of the tiny screen, but at least it's better than a smartphone).
Sure, you could get a "
Re: (Score:2)
Exactly. An iPhone is "good enough" for just simple web surfing on the go, because it lets me do that when I wasn't able to before smartphones came about. But while an iPhone may be a lot more capable in some ways than a PC 15 years ago (except for a decent keyboard), the standard PC has increased a lot in capabilities during that time too, and there's lots of things I do with a PC that I'd never want to bother with an iPhone for, even if the screen and keyboard weren't issues.
Re: (Score:2)
I suspect we might be connecting our smart-phones to a monitor (or some other display technology), and using a wireless keyboard. When programming, or doing other really serious stuff.
Re:ARM (Score:4, Interesting)
All three will be marginalized by the ARM onslaught. Within 10 years, smartphone will be the personal computing device, AMD and Intel processors will power the cloud.
ARM may well come to dominate personal computing, but it sure won't be via the smartphone. No one is going to edit long word processor documents on their phone, much less edit spreadsheets, write code, or do much else that qualifies as actual work. And it's not because they don't already -- in many cases -- have enough processor power; it's because they don't have full-sized keyboards and monitors. I'll grant that it's possible that phones or PDAs of the future might well attach to full-featured I/O devices, but by themselves, no.
The cloud, too, has some significant limits that will be difficult, if not actually impossible, to overcome. Security is a major issue, arguably resolvable in theory; but trusting your critical data to an outside company, to whom you are, at best, a large customer, is not.
Re: (Score:2)
Wireless connectivity to an HDTV for display and Bluetooth for keyboard and mouse.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Exactly. And programmers will all be writing their code on a smartphone with a touchscreen, and no one will have a monitor larger than 4" diagonal. 10 years from now, desktop computers will be gone.
Re: (Score:2)
You mean like all those Java network PCs you are seeing everywhere now?
Re: (Score:2)
I have to take a walk, get some fresh air and pizza. I almost had a seizure reading TFA. I probably would've if I'd had a better GPU!
Re: (Score:2)
I believe (prediction alert!) that, while most of the idiotic, regular populace will just buy smartphones and netb
VLIW (Score:2)
Re: (Score:3, Insightful)
Re: (Score:2)
ARM's not VLIW.
Too much hyperbole... (Score:5, Insightful)
You can always spot a sensationalist post when part of it predicts or asks who will go out of business. Or what thing will disappear.
For example, in his post, ScuttleMonkey [slashdot.org] asks this:
"Or, will we have a newcomer that usurps the throne and puts everyone else out of business?"
Note, the post is a good one - I'm not being critical. But changes in the tech industry rarely result in big companies going out of business - and if they do, it takes a long time. I think Sun is the canonical example here. It took a long time for them to die - even after many, many missteps. Sun faded away not because of competition or some game-changing technology, but simply because they made bad (or some would say awful) decisions. Same for Transmeta.
People have been predicting the death of this or that forever. As you might imagine, my favorite one is predicting Microsoft's death. That's been going on for a long, long time. The last I checked, we are still quite healthy.
Personally, I don't see Intel, AMD, or NVIDIA dying any time soon. Note, AMD came close this last year, but they have had several near-death experiences over the years. (I worked there for several years...).
Intel, AMD, and NVIDIA's fundamental business is turning sand into money. That's a famous line from Jerry Sanders, the founder of AMD. I'm paraphrasing, but it was long the idea at AMD that it didn't matter what came out of the fabs as long as the fabs were busy. Even though AMD and NVIDIA no longer own fabs, this is still their business model (more or less).
I think it's interesting how a couple of posters have talked about ARM - remember, AMD and NVIDIA can jump on the ARM bandwagon at any time, and Intel is already an ARM licensee. Like AMD, they are all in the business of turning sand into money - they can and will change their manufacturing mix to maintain profitability.
I don't see the GPU going away either. GPUs are freakishly good at what they do. By good, I mean better than anything else. Intel flubbed it badly with Larrabee. A general-purpose core simply isn't going to do what very carefully designed silicon can do. This has been proven time and time again.
Domain-specific silicon will always be cheaper, better performing, and more power efficient in most areas than a general-purpose gizmo. Note, this doesn't mean I dislike general-purpose gizmos (like processors) - I believe the best system designs have a mix of both, suited to the purpose at hand.
-Foredecker
Re: (Score:2, Funny)
we [Microsoft] are still quite healthy
Oh noes! Who let him post on slashdot?!
Re: (Score:2)
Although Larrabee may not be as cost/energy/speed efficient as a dedicated GPU, perhaps it will be almost as fast but easier to program for, thus ensuring its popularity...
Re: (Score:2)
The initial Larrabee product was canceled. [anandtech.com] Intel had to retrench on its graphics plans. Again. They are a smart company, but they have struggled to get off the ground in the graphics area. I'm assuming they will someday be successful, but today isn't that day.
-Foredecker
Re: (Score:2)
NVIDIA's already got an ARM platform: Tegra.
Re: (Score:2)
This is GPU-only, the less interesting question. (Score:2)
The article is about the GPU business, which, IMO, is the less interesting aspect of these companies. The new hotness is all in the mobile space. You want interesting, read about the upcoming Cortex-A9-based battle going on for 'superphones', smartbooks, and tablets.
People have been predicting the death of the desktop computer for a long time, but the problem is that there hasn't been anything realistic to replace it. We're within sight of that replacement. Once everyone has a superphone that can do the bul
Re: (Score:2)
Re: (Score:2)
Mobile phones don't have the shelf life required to replace the desktop.
This would be part of that ecosystem point I made.
If you dock your mobile phone to access the 'big' data (videos, music, etc.), and other stuff is in the cloud, then replacing the phone itself isn't going to be a big deal, and with cell providers going the whole 'go with us for 2 years and get your phone for half price' thing, that will incentivize people to upgrade their shit more often, which, from a web developer's perspective, I f
Re: (Score:2)
Re: (Score:2)
Desktops will always be much more powerful than phones, and there will always be ways for the "bulk of people" to utilize that extra power.
Why on Earth won't the future be like now in this sense: small mobile computers (smartphones, netbooks, tablet PCs) that you can lose or break reasonably easily, plus bulkier boxes sitting at home that never go anywhere (not exactly a huge drain on the space inside people's homes... where's the push to plug a tiny phone into your huge monitor)?
And in the middle laptops.
And w
Re: (Score:2)
the ecosystem to support such a change is the hard part. Can you get access to all your data, etc., from your docked mobile? That's gonna be the key.
And it's going to be why it doesn't happen for a LOOOOONG time. In the US, wireless gives you a choice of AT&T, Verizon, Sprint, or T-Mobile. Wired gives you a choice of your phone company or your cable company. None beat having your data local, and as long as you're keeping a machine around with your data on it, why not pay ~$50 more so it can drive your keyboard, video, and mouse without plugging in your mobile?
whoever can perfect low-power computing (Score:2)
The first low-power CPUs like the Atom were lame. Better devices are on the way.
Re: (Score:2)
Since when was the Atom the first low power device? ARM was earlier, uses less power and has much better performance/watt.
Predictions (Score:2)
Consoles come out with 1080p/DX11-class graphics. Graphics cards for the PC try to offer 2560x1600+ and whatnot, but the returns are extremely diminishing, and many current PC games will come to a console with keyboard and mouse. GPGPU remains a small niche for supercomputers and won't carry the cost without mass-market gaming cards. The volume is increasingly laptops with CPU+GPU in one package, and there'll be Intel CPUs with Intel GPUs using an Intel chipset on Intel motherboards - and they'll be a serious
At some point, the GPU goes on the CPU chip (Score:5, Informative)
At some point, the GPU goes on the CPU chip, and gets faster as a result.
Maybe.
GPUs need enormous bandwidth to memory, and can usefully use several different types of memory with separate data paths. The frame buffer, texture memory, geometry memory, and program memory are all being accessed by different parts of the GPU. Making all that traffic go through the CPU's path to memory, which is already the bottleneck with current CPUs, doesn't help performance.
A single-chip solution improves CPU-to-GPU bandwidth, but that's not usually the worst bottleneck.
What actually determines the solution turns out to be issues like how many pins you can effectively have on a chip.
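For a rough sense of the scale involved, here's a back-of-the-envelope sketch; the resolution, overdraw factor, and bus figures are illustrative assumptions, not measurements:

    #include <stdio.h>

    int main(void) {
        /* Illustrative assumptions: a 2560x1600 display, 4 bytes per pixel,
           60 frames per second, each pixel touched ~10 times per frame
           (overdraw, blending, texture fetches). */
        const double width = 2560, height = 1600, bytes_per_pixel = 4;
        const double fps = 60, touches_per_pixel = 10;

        double gb_per_s = width * height * bytes_per_pixel *
                          fps * touches_per_pixel / 1e9;
        printf("Frame traffic alone: ~%.0f GB/s\n", gb_per_s);  /* ~10 GB/s */

        /* For comparison (ballpark, assumed): dual-channel DDR3-1333 on the
           CPU side peaks around 21 GB/s, while a high-end GDDR5 card has well
           over 100 GB/s on its own dedicated bus. */
        return 0;
    }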
Re: (Score:2)
At some point, the GPU goes on the CPU chip
Actually, that point was back in January when Intel started shipping the Clarkdale Core i3 and i5 processors [anandtech.com]
Still waiting for OpenCL (Score:3, Insightful)
I am still waiting for OpenCL to get traction. All this CUDA and StreamSDK stuff is tied to a particular company's hardware. I think there is a need for a free software implementation of OpenCL with different backends (NVIDIA GPU, AMD GPU, x86 CPU). Software developers will have great difficulty supporting GPUs as long as there is no hardware-independent standard.
Re: (Score:2)
Why do you need a free *implementation*? The whole point is that OpenCL is a free standard, which is at the heart of AMD's Stream SDK (with AMD GPU and CPU backends) and is also included in NVIDIA's SDK. Given that OpenCL has an ICD interface, so that you can use any of the backends from any OpenCL-supporting software, I'm not sure what you're after that isn't already there, other than, arguably, current market penetration.
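To illustrate the ICD point, here's a minimal host-side sketch in C against the standard OpenCL headers (link with the OpenCL library; error handling omitted): it enumerates whatever platforms are installed, whether they come from NVIDIA, AMD, or a CPU-only implementation.

    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;

        /* The ICD loader reports every installed vendor implementation. */
        clGetPlatformIDs(8, platforms, &num_platforms);

        for (cl_uint i = 0; i < num_platforms; ++i) {
            char name[256];
            clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                              sizeof(name), name, NULL);

            cl_uint num_devices = 0;
            clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_ALL,
                           0, NULL, &num_devices);

            printf("Platform %u: %s (%u devices)\n", i, name, num_devices);
        }
        return 0;
    }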
what's interesting to me... (Score:5, Interesting)
What I find interesting is the overall lack of game-changing progress when it comes to non-3D-or-HD-video-related tasks. In March 2000, i.e. ten years ago, a top-of-the-line CPU would be a Pentium III Coppermine, topping out around 1 GHz. I could put Windows XP on one of those (with enough RAM) and do most office / browsing tasks about as fast as I could with today's top of the line CPU. Heck, it would probably handle Win7 okay. Contrast the period 2000-2010 with the period 1990-2000. In 1990 you would be looking at a 25 MHz 486DX.
Re: (Score:2, Interesting)
In 1990 you would be looking at a 25 MHz 486DX.
Which is the minimum for most x86 OSes nowadays. In fact, some newer x86 OSes and software have even higher requirements. Windows XP and SQL Server 7.0 and later for example require the CMPXCHG8B instruction, and Flash 8 and later require MMX.
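Those requirements are advertised through CPUID. A minimal sketch for GCC/Clang on x86, checking leaf 1's EDX bits for CMPXCHG8B (bit 8) and MMX (bit 23):

    #include <stdio.h>
    #include <cpuid.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;

        /* CPUID leaf 1: the classic feature flags live in EDX. */
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            printf("CPUID leaf 1 not supported\n");
            return 1;
        }

        printf("CMPXCHG8B (CX8): %s\n", (edx & (1u << 8))  ? "yes" : "no");
        printf("MMX:             %s\n", (edx & (1u << 23)) ? "yes" : "no");
        return 0;
    }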
Re: (Score:3, Insightful)
The biggest challenges we have today are getting more processing performance from less electricity because we're running more things on batteries and quiet computers for the ho
Re:what's interesting to me... (Score:4, Interesting)
I could put Windows XP on one of those (with enough RAM) and do most office / browsing tasks about as fast as I could with today's top of the line CPU.
It's wetware-limited; it doesn't matter how much hardware or software you throw at it. We can spend two minutes reading a page and then expect the computer to render a new one in 0.2 seconds; in practice it will never go faster than that. I don't know why it's become such a myth that we'll always find new uses for computing power. A few specialized tasks now and then, perhaps, but in general? No, people will chat and email and listen to music and do utterly non-intensive things that go from taking 10% to 1% to 0.1% to 0.01% of your CPU.
Contrast the period 2000-2010 with the period 1990-2000. In 1990 you would be looking at a 25 MHz 486DX.
Yes, computers are starting to return to the normal world from Moore's bizarro-universe where unbounded exponential growth is possible. After decades of conditioning you become oblivious to how crazy it is to expect something twice as fast for half the price every 18 months (or whichever bastardization of the law you choose to use). Eventually a ten-year-old computer will be like a ten-year-old car: sure, they've polished the design a little, but it's basically the same. And that is normal; it's we who live in abnormal times, where computers have improved by several orders of magnitude.
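For scale, here's the quick arithmetic behind "several orders of magnitude", assuming the usual one-doubling-every-18-months rule of thumb (a rule of thumb, not a law of nature):

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        /* Assumption: one doubling every 18 months. */
        const double months_per_doubling = 18.0;
        const double years = 20.0;  /* e.g. 1990 to 2010 */

        double doublings = years * 12.0 / months_per_doubling;  /* ~13.3 */
        double factor = pow(2.0, doublings);                    /* ~10,000x */
        printf("%.1f doublings -> roughly %.0fx\n", doublings, factor);
        return 0;
    }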
Re: (Score:2)
I think it's true that my Mom will never have use for a computer faster than what she's using today. But I find that I am still waiting on computers on a regular basis. The real bottleneck for me is multitasking. I get into several projects at once, with perhaps ten applications and 50 browser tabs open, and then the computer starts choking. Some of the problem may have more to do with the available memory or with the operating system's code quality, but the point is that computers have not yet reached that point
NVIDIA (Score:2)
I have only one thing to say about NVIDIA: nvlddmkm. That says it all. I won't ever buy NVIDIA again.
Re:The Singularity? (Score:5, Insightful)
With greater personal power, we won't have Microsoft dictating what 3D features we can have. With individuals becoming supercomputers, these three companies will be out of business. However, personal survivability and power will be sufficient that former employees will be fine.
What?
Re:The Singularity? (Score:5, Funny)
In short: make your time.
Re: (Score:2)
There's a new GPU company called Zig? Where?!
Re: (Score:2)
Only if it's cubed.
Re: (Score:2)
Now you just sound educated stupid.
Re: (Score:2)
Okay, so abstraction on hardware with practically unlimited processing power will make NVidia obsolete. There's precedent for that, even if it's a bit off topic. The Wii and the DS.
What you're describing, though, isn't even really on the horizon. In the more near-term, like at least for the next decade, we're still reliant on predictable hardware specs to make games and apps. You talk about inertia keeping the x86 alive. Well of course! This industry thrives on standards. That's how Microsoft can 'di
Re: (Score:2)
Sorry to reply to my own post, but I was a bit hasty in my rant. I'm thinking about a couple of apps in particular and not about all the apps. Blender's viewport kicks ass in ways that make me envious and 3D Studio MAX isn't far behind. I just wanted to acknowledge this before I get my well-deserved roast. :D
Re: (Score:2)
Actually, usually it's nVidia or ATI dictating what 3D features we have, with the other immediately implementing the same thing to keep up.
Re: (Score:2)
Re:The GPU will go the way of the coprocessor (Score:5, Interesting)
On the contrary, I think the CPU will go the way of the coprocessor. The humble Atom may be enough CPU power for most people these days, but you can never have enough GPU power... at least not until your po-- I mean, games, are photorealistic in real time.
Re: (Score:2)
That depends on the market.
If we are talking about the PC/laptop market, once integrated graphics are good enough at 1080p it is game over. In the workstation market I would agree with you.
Even photorealism may not be worth the cost. Also, that depends on A-class games being optimized for the PC and not just being console ports.
The way I see the future is this.
nVidia goes for the Supercomputer, workstation, and embedded markets. Their biggest product in numbers and revenue will be the Tegra line if they are lucky.
Re: (Score:2, Funny)
You just said people will never want photorealistic rendered porn. You have got to be the worst predictor ever.
Have you ever even met a person?
Re: (Score:2)
While I am not a fan of porn, I can assure you that almost none of it is currently "rendered", and frankly, from what little I have seen, the last thing anybody really wants is higher-resolution porn.
In that category of video, even HD may be a step too far.
Re: (Score:2)
Really? You know, searching for "HD porn" results in 11.8 million hits. There's obviously a market.
Re:The GPU will go the way of the coprocessor (Score:4, Interesting)
Your post is based on several assumptions that make no sense to me as a student of human nature, and an engineer.
1. 1080p is current technology. Even if we assume we will not have hologram visual output within the near future, there will still be some new technology that the powers that be will sell to the masses. It may be an incremental improvement, but it will still be enough to drive the markets.
1a. As long as it's new and shiny, there will always be someone to buy it.
2. Consoles use GPUs and CPUs the same as PCs do. There is a longer update cycle in place, but whenever each cycle ticks they adopt all the new technology that has been developed during the lifetime of a console. As such, it makes sense for the console makers to encourage such development.
3. Intel would have to shut down all of their operations to let nVidia claim the workstation market. Like it or not, Intel still makes pretty hefty CPUs, owns the workstation market, and has more disposable cash and a bigger engineering staff than any other chip maker. The embedded market has even more competition for its crown, so I will not go there. The supercomputer market, while good for satisfying the nerd bragging-rights quota, is not known for being an amazing source of profit.
4. The AMD vs Intel battle for the mid-range market is actually something I can see coming to pass. I would not be too surprised if this market gets a third player as the line between computation devices becomes blurred.
5. ARM is not the only company in the world that can make low-power chips. Worst case, ARM has a few years of dominance before the other guys catch up. Also, as the article pointed out, an integrated CPU/GPU has several obvious advantages over a discrete CPU + discrete GPU.
In all, while I am not ready to make my own predictions, yours could use a bit more analysis and tweaking.
atom is too lacking in cpu features to take over l (Score:2)
Atom is too lacking in CPU features to take over like that, and low-end AMD chips are much better - and they have a good low-end video plan.
Re:The GPU will go the way of the coprocessor (Score:5, Interesting)
I am not an engineer, these are just thoughts that rolled off my head...
Re: (Score:2)
As bus speeds go higher and higher, you want components closer together, not further apart.
Change your model to putting specialized components on the die, and you are on the right track. Unless something like optical buses goes mainstream, in which case multiple sockets make some sense: multipurpose sockets that allow you to add vector-specific processing to your system for gaming, vs. I/O or general-purpose processing for servers, vs. leaving them empty for budget builds.
But I think on-die specialization is the near future.
Re: (Score:2)
Re: (Score:2)
"If you specialized everything, the number of designs for each part drops drastically (So instead of needing 50 models of CPU, you'd only need 3 or 4. Plus 2 or 3 GPU modules. Plus 2 or 3 FPU modules, etc)... "
Um, right now, you have 50 or so CPU models to accommodate the perceived value of specialization. This impacts choices of CPU speed, caches and their sizes, FSB size, etc.
How this is improved by making 50 or so SKUs of various modules is beyond me. Not to put too fine a point on it, but instead of 50
Re: (Score:2)
This is exactly the sort of progress that will NEVER happen unless Windows is no longer the dominant OS, and, to a larger extent, unless proprietary programs in general are no longer dominant. With OSS, this sort of thing is just a recompile away. Maybe not for full optimization, but at least to get up and running. For example, going from 32 to 64 bit, or from x86 to ARM, is generally possible with very few (if any) changes. But this is impossible if the source is unavailable.
Mac isn't as bad as Windows for this, since Apple writes the OS and ma
Re: (Score:2, Insightful)
The current version of OSX can run apps as far back as 2001. Apple does not provide any official way of running apps older than that.
The current 64 bit versions of Windows can run apps as far back as 1993. Microsoft provides an official way of running even older apps (XP Mode). 32 bit Windows can often run 16 bit apps without the emulator.
Microsoft has lots to fault them for, but their record on backwards compatibility is WAY better than Apple's.
Re: (Score:2)
You stand to lose much more by using multiple chips instead of multiple cores. As we get faster and faster clocks, the distance a signal can travel in a clock cycle gets smaller and smaller. Even with modern technology, trying to access something off-chip is likely to cost you hundreds of cycles. As such, you want to minimize the amount of off-chip communication that needs to happen.
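A quick sanity check on that distance claim, assuming signals on a board trace propagate at about half the speed of light (a common rule of thumb, not a measured figure):

    #include <stdio.h>

    int main(void) {
        const double c = 3.0e8;          /* speed of light, m/s */
        const double fraction = 0.5;     /* assumed propagation speed on a trace */
        const double clock_hz = 3.0e9;   /* a 3 GHz clock */

        double cm_per_cycle = fraction * c / clock_hz * 100.0;
        printf("~%.0f cm of signal travel per clock cycle\n", cm_per_cycle);
        return 0;
    }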
I think you hit the nail with the different module types, though I would implement it a bit differently. First, the motherboard
Re: (Score:3, Insightful)
I disagree. Floating-point coprocessors basically just added some FP instructions to a regular single-threaded CPU. There was no parallelism; they just removed the need to do slow floating-point calculations using integer math.
However, GPUs, while they mainly do floating-point calculations, are essentially vector processors, and do calculations in parallel. They can easily benefit from increased size and parallelism: the more parallel processing capability a GPU has, the more realistic it can make graphi
Re: (Score:2)
I disagree. Floating-point coprocessors basically just added some FP instructions to a regular single-threaded CPU. There was no parallelism; they just removed the need to do slow floating-point calculations using integer math.
However, GPUs, while they mainly do floating-point calculations, are essentially vector processors, and do calculations in parallel. They can easily benefit from increased size and parallelism: the more parallel processing capability a GPU has, the more realistic it can make graphical applications (i.e. games). And with all the GPGPU applications coming about (where you use GPUs to perform general-purpose (i.e., not graphics) calculations), there's no end to the amount of parallel computational power that can be used. The only limits are cost and energy.
So if someone tried to fold the GPU into the processor, just how much capability would they put there? And what if it's not enough? Intel has already tried to do this, and it hasn't killed the GPU at all. Not everyone plays bleeding-edge 3D games; a lot of people just want a low-powered computer for surfing the web, and maybe looking at Google Earth. An Intel CPU with a built-in low-power GPU works fine for that, but it won't be very useful for playing Crysis unless you think 5 fps is good. People who want to play photo-realistic games, however, are going to want more power than that. And oil exploration companies and protein-folding researchers are going to want even more.
GPUs aren't going anywhere, any time soon. Lots of systems already have eliminated them in favor of integrated solutions, but these aren't systems you're going to play the latest games on. For those markets, NVIDIA is still doing just fine.
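To make the scalar-coprocessor vs. vector-processor distinction concrete, here's a minimal sketch (names are illustrative; the first function is plain host C, the second is device-side OpenCL C, shown side by side rather than as one compilable file):

    /* Scalar CPU version: one multiply-add at a time, the way an old FPU
       sped things up. */
    void saxpy_cpu(int n, float a, const float *x, float *y) {
        for (int i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

    /* Data-parallel version (OpenCL C kernel): the loop disappears, and one
       work item runs per element, with thousands in flight at once. */
    __kernel void saxpy_gpu(float a, __global const float *x, __global float *y) {
        int i = get_global_id(0);
        y[i] = a * x[i] + y[i];
    }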
Why can't they dedicate half the die space to x86 cores and the other half to a big SIMD processor?
Re: (Score:2)
Because die space is expensive. Why would someone with a low-powered notebook or netbook want to spend extra money on 3D graphics capabilities they don't need? And why should someone who's going to buy an NVIDIA card waste money on a CPU that's twice as large as it needs to be, because it has a built-in GPU they'll never use, when they could have double the number of CPU cores instead?
Re: (Score:2)
bandwidth
Re: (Score:2)
Netbooks benefit from GPUs in the video decoding department. Especially with Google implementing H.264 for some of their YouTube videos, there's a big performance advantage to having a GPU instead of an additional CPU core.
Re: (Score:2)
That's because the CPU was crippled by Intel (I think it was actually the 486SX that didn't have an FPU; the 487SX "coprocessor" was simply a 486 with the FPU enabled). In fact, I think the FPU on the 486SX die was fine, it was just disabled by a bonding option. Quite the scam - sell 2 chips instead of one.
Re: (Score:2)
The GPU will go the way of the coprocessor
The GPU is a coprocessor
Re:Haven't you heard? Studies have shown.. (Score:4, Funny)
The use of futurism has been thoroughly discredited.
I predicted that years ago.
Re: (Score:2)
I find it funny when Science Channel has one of those "Future" shows on, and you get some asshole talking into the camera with the little caption under his name pegging him as a professional "futurist".
He gets paid to make wild guesses. ::golf clap::
Re: (Score:2, Insightful)
If he is getting paid well, he doesn't give two shits what kind of clap you are doing.
Re: (Score:2)
Lol, good point. Reminds me of a conversation that took place at our dinner table last night:
Wife's Uncle: "I wonder how Michael Phelps can swim for a living. Doesn't that get boring?"
Me: "Sure...if you consider it boring to be a millionaire."
Re: (Score:2)
Re: (Score:2)
I disagree. It seems CPUs and GPUs are designed and planned well ahead of time. Tapeout [wikipedia.org] occurs many months before products hit the market. Intel's Sandy Bridge apparently taped out in June 2009 [fudzilla.com] and won't be released until 2011. Yonah taped out in October 2004 [theinquirer.net] but wasn't released until January 2006. If it appears that these companies are responding quickly with new, competitive designs, it's because they correctly predicted the market direction and
Re: (Score:3, Insightful)
Re: (Score:2)
Unless those big dogs wake up soon from their stupor, an unknown startup will sneak behind them and steal their pot of gold.
You are going to need that pot of gold to fund your project.
Research, engineering, fabrication, marketing. The barriers to entry here are not trivial.
Re: (Score:2)