Intel

Intel's Dual-core Strategy: 75% by End of 2006

DigitumDei writes "Intel is moving ahead rapidly with their dual-core chips, anticipating that 75% of their chip sales will be dual core by the end of 2006. With AMD also starting to push their dual-core solutions, how long until applications make full use of this? Some applications already make good use of multiple CPUs, and of course multiple applications running at the same time benefit immediately. Yet the most CPU-intensive applications on the average home machine, games, still mostly do not take advantage of it. When game manufacturers start to release games designed to take advantage of this, are we going to see a huge increase in game complexity/detail, or is this benefit going to be less than Intel and AMD would have you believe?"
  • dual cores (Score:3, Insightful)

    by lkcl ( 517947 ) <lkcl@lkcl.net> on Wednesday March 02, 2005 @09:44AM (#11822506) Homepage
    Less heat generated. More bang per watt.
  • Well... (Score:3, Insightful)

    by Kn0xy ( 792482 ) * <knoxville&xpd8,net> on Wednesday March 02, 2005 @09:46AM (#11822521) Homepage
    If they're going to be that ambitious with their sales, I hope they are considering pricing the chips in a range that anyone could afford and would be willing to pay.
  • by Reverant ( 581129 ) on Wednesday March 02, 2005 @09:47AM (#11822531) Homepage
    When game manufacturers start to release games designed to take advantage of this, are we going to see a huge increase in game complexity/detail
    No, because most games depend more on the GPU than on the CPU. The CPU is left to handle tasks such as opponent AI, physics, and so on: the stuff that the dedicated hardware on the graphics card can't do.
  • Pretty soon (Score:2, Insightful)

    by PeteDotNu ( 689884 ) on Wednesday March 02, 2005 @09:48AM (#11822546) Homepage
    Once multi-core chips start getting into home computers, the game developers will have a good justification for writing thread-awesome programs.

    So I guess the answer to the question is, "pretty soon."
  • by Total_Wimp ( 564548 ) on Wednesday March 02, 2005 @09:53AM (#11822570)
    ... and more everywhere else. Games continue to get most of their good stuff from the GPU, not the CPU. It ain't that the CPU isn't important, but it's not going to make a huge difference all by itself.

    What I hope to see, but don't expect, is better prioritization of CPU requests. If you have something high-priority going on, like a full-screen video game, recording a movie, or ripping a CD, I'd like to see the antivirus and other maintenance tasks handled by the other core, or even put on hold. My personal experience is that this stuff can sometimes be set up to some extent, but overall it's kind of crappy and labor-intensive.

    But this really isn't Intel's fault. MS and the app vendors need to take the blame. So, the question is: do other OSs handle this better for consumer products?

    TW
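
    One way to get part of what the post above asks for today is simply to run the maintenance work at a lower scheduling priority and let an SMP-aware OS push it onto whichever core is free. A minimal POSIX sketch, where the scan function and the niceness value of 15 are placeholders for illustration, not any real product:

        #include <stdio.h>
        #include <unistd.h>   /* nice() */

        /* Stand-in for an antivirus or indexing pass (hypothetical). */
        static void run_background_scan(void)
        {
            printf("scanning...\n");
        }

        int main(void)
        {
            /* Ask the scheduler to treat this whole process as low priority,
               so a game or capture job at normal priority gets the CPU first. */
            if (nice(15) == -1)
                perror("nice");
            run_background_scan();
            return 0;
        }

    Whether the low-priority work actually lands on the idle core is then up to the scheduler, which is exactly the point above: the OS and the app vendors have to cooperate.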
  • Hmm? (Score:5, Insightful)

    by Erwos ( 553607 ) on Wednesday March 02, 2005 @09:53AM (#11822573)
    "how long until applications make full use of this"

    Full use? Probably never. There are always improvements to be made, and multi-threaded programs are a bitch and a half to debug, at least in Linux. Making "full use" of SMP would _generally_ decrease program reliability due to complexity, I would imagine.

    But, with an SMP-aware OS (Win2k, WinXP Pro, Linux, etc.), you'll definitely see some multi-tasking benefits immediately. I think the real question is, how will Microsoft adjust their licensing with this new paradigm? Will it be per-core, or per socket/slot?

    I'm going to go out on a limb and predict that Longhorn will support 2-way SMP even for the "Home" version.

    -Erwos
  • by g2racer ( 258096 ) on Wednesday March 02, 2005 @09:56AM (#11822591) Homepage
    A little off-topic, but does anybody find it interesting that all the next-generation consoles will use IBM processing power? Considering the number of consoles sold compared to PCs, this has got to piss both Intel and AMD off...
  • by EngineeringMarvel ( 783720 ) on Wednesday March 02, 2005 @09:57AM (#11822602)
    Your statement is true, but I think you missed the point the article poster was trying to get across. Currently, games are written to use computer resources that way. If the code were written differently, games could allocate some of the graphics work to the second CPU instead of sending all of it to the GPU; the second CPU could be used to help the GPU. Putting the newly available second core to work on graphics opens up more graphical potential. That's what the article poster wants to see: resource allocation in game code changed so the second CPU helps enhance the graphics.
  • It's just _Dual_ (Score:2, Insightful)

    by infofarmer ( 835780 ) <infofarmer@FreeBSD.org> on Wednesday March 02, 2005 @09:58AM (#11822613) Homepage
    Oh, come on, it's just dual; it's just a marketing trick. Speed has been increasing exponentially for years on end, and now we're gonna stand still at the word "Dual"? If Intel/AMD devise a way within reason to exponentially increase the number of cores in a CPU (which I strongly doubt), that'll be a breakthrough. But for now it's just a way to keep prices high without inventing anything at all. WOW!
  • Complexity/detail (Score:4, Insightful)

    by Glock27 ( 446276 ) on Wednesday March 02, 2005 @09:59AM (#11822620)
    "are we going to see a huge increase in game complexity/detail?"

    If you consider a factor of about 1.8 (tops) "huge".
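
    For what it's worth, a figure like 1.8 is what Amdahl's law gives you if you assume, purely for illustration, that about 90% of the per-frame work can be split across the two cores: speedup = 1 / ((1 - 0.9) + 0.9/2) = 1 / 0.55 ≈ 1.82. The 10% that stays serial is what keeps the ceiling below a clean 2x.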

  • Yes, but since the core of Intel's marketplace consists of people who see a monitor and think it is the computer, this is a barrier that Intel can easily hurdle.
  • Fairly simple... (Score:4, Insightful)

    by Gadgetfreak ( 97865 ) on Wednesday March 02, 2005 @10:03AM (#11822643)
    I think as long as the hardware becomes established, people will write software for it. From time to time, hardware manufacturers have to push the market in order to get the established standard to jump to the next step.

    It's like what Subaru did when they decided to make all their vehicles all-wheel drive. It was a great technology, but most people at the time just didn't care to pay extra for it. Making it a standard feature keeps the cost increase down, and provided that the technology is actually useful, the market should grow to accept it.

  • by frogmonkey ( 777477 ) on Wednesday March 02, 2005 @10:08AM (#11822677) Homepage
    I am going to wait for at least quad core 64bit processors ;)
  • Boon for Game AI (Score:3, Insightful)

    by fygment ( 444210 ) on Wednesday March 02, 2005 @10:10AM (#11822698)
    A lot of posts have quite rightly pointed out that the GPU is currently how games use a "pseudo" dual core. But it seems that what games could be doing now is harnessing the potential of dual core not for graphics, but for game enhancement i.e. better physics and true AI implementations. Realism in games has to go beyond tarting up the graphics.
  • by mcbevin ( 450303 ) on Wednesday March 02, 2005 @10:16AM (#11822740) Homepage
    The average system is already running a number of different processes at once. Even if most individual applications aren't multithreaded, a dual-core chip will not only make the system technically faster but also hugely improve its responsiveness (often a far more important factor in how fast a system feels to the user) whenever processes are running in the background.

    While one might ask whether it makes much useful difference to the 'average' home user, one might ask the same about, say, 4GHz vs. 2GHz: for most Microsoft Word users this makes little difference in any case. However, for users who really make use of CPU power in whatever form, dual-core will indeed make a difference even without multi-threaded applications. And it won't take long for most applications where it matters to become multi-threaded, as it's really not that hard to make most CPU-intensive tasks multi-threaded and thus improve things further.

    I for one am looking forward to buying my first dual-CPU, dual-core system (i.e. 4x the power) once the chips have arrived and reached reasonable price levels, and I'm sure that power won't be going to waste.
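
    The "really not that hard" case is the embarrassingly parallel one: independent chunks of work with no shared state. A minimal pthreads sketch of that pattern, summing an array in two halves (the array contents are just filler, and a real task would pick the split by core count):

        #include <pthread.h>
        #include <stdio.h>

        #define N 1000000

        static double data[N];

        struct chunk { int start; int end; double sum; };

        /* Each thread sums its own half of the array; no locking needed
           because the halves don't overlap. */
        static void *sum_chunk(void *arg)
        {
            struct chunk *c = arg;
            c->sum = 0.0;
            for (int i = c->start; i < c->end; i++)
                c->sum += data[i];
            return NULL;
        }

        int main(void)
        {
            for (int i = 0; i < N; i++)
                data[i] = 1.0;

            struct chunk halves[2] = { { 0, N / 2, 0.0 }, { N / 2, N, 0.0 } };
            pthread_t threads[2];

            for (int t = 0; t < 2; t++)
                pthread_create(&threads[t], NULL, sum_chunk, &halves[t]);
            for (int t = 0; t < 2; t++)
                pthread_join(threads[t], NULL);

            printf("total = %f\n", halves[0].sum + halves[1].sum);
            return 0;
        }

    On a dual-core machine the two halves genuinely run at the same time; on a single core the same code still works, it just doesn't get faster.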
  • by chrishillman ( 852550 ) on Wednesday March 02, 2005 @10:17AM (#11822745) Homepage Journal
    Apple has offered dual-CPU systems for a long time, but they are more than just a company for teachers to buy computers from. They also sell systems to graphic artists, publishing houses and many other places that benefit from dual-CPU systems. It's just the Apple shotgun approach: they are aiming at their market, which includes many levels of users. It's not their intention that Grandma should have a dual-CPU 64-bit system (unless she is a Lightwave user looking to decrease render times). Multiple-core CPUs have been AMD's dream for a long time now. This is just Intel not wanting to look stupid on the 64-bit front any longer. They are making sloppy decisions to try to beat AMD to market so Dell can use such "innovations" in their ads.
  • Re:dual cores (Score:3, Insightful)

    by PureCreditor ( 300490 ) on Wednesday March 02, 2005 @10:19AM (#11822766)
    If the two cores share L1 and L2, then it's less than "twice the power"... and given how close together the two cores are, it's not hard to create a high-speed interconnect that brings shared-L1 access close to local-L1 speeds.
  • by Ironsides ( 739422 ) on Wednesday March 02, 2005 @10:28AM (#11822848) Homepage Journal
    Not really. Intel and AMD have never had the console gaming market. Also, the consoles really do require either an embedded microprocessor or one that is customized. The Game Boy series uses ARM7 and ARM9 processors. The recent consoles themselves have used customized ones. The Xbox is the only exception, in that it used a general-purpose Pentium III.

    I can see that Intel and AMD might want to break into that market, but they would have to create a custom chip just for that (as a general-purpose chip will either use too much power or won't cut it), something I am not sure they want to do.
  • Buddy, some of us have AMD.

    Come on in, the HyperTransport is fine. Care for a piña-on-chip-memory-controller?
  • by Ulric ( 531205 ) on Wednesday March 02, 2005 @10:35AM (#11822917) Homepage
    I do not for one moment believe that Amdahl's law won't affect these systems. Dual-core won't be a problem, quad-core probably won't, but I can't see that stuffing in more cores will solve the scalability problem in the long run.
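
    Putting rough numbers on that: a small sketch of Amdahl's law, assuming (purely for illustration) that 90% of a workload parallelizes perfectly:

        #include <stdio.h>

        /* Amdahl's law: speedup on n cores when fraction p of the work
           parallelizes and the rest stays serial. */
        static double speedup(double p, int n)
        {
            return 1.0 / ((1.0 - p) + p / n);
        }

        int main(void)
        {
            const double p = 0.90;          /* assumed parallel fraction */
            int cores[] = { 2, 4, 8, 16, 1024 };
            for (int i = 0; i < 5; i++)
                printf("%4d cores: %.2fx\n", cores[i], speedup(p, cores[i]));
            return 0;
        }

    With p = 0.90 the curve flattens quickly and never exceeds 1/(1 - p) = 10x, no matter how many cores you add, which is the parent's point about the long run.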
  • by wilsone8 ( 471353 ) on Wednesday March 02, 2005 @10:43AM (#11822967)
    Multithreading IS hard if you are sharing any state between the threads. And the difficulty of debugging multithreaded issues in a large, complex application (i.e., any commercial game these days) goes up by at least an order of magnitude with the introduction of true multi-threading via two cores. On a single-processor machine you can get away with more because you don't have true concurrency. And in your particular case, you actually have no concurrency, so it is even easier. But if you want to be truly scalable and squeeze as much as possible out of that second processor, you have to deal with all sorts of problems like deadlock, write barriers, and critical sections. These are HARD issues, not least because many of the bugs they introduce are hard to wrap your head around and only expose themselves when you get an unlucky context switch at just the right moment in execution.
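
    The textbook case of what the post above describes is the lost update: two cores incrementing the same counter. A minimal sketch of the shared state plus the critical section that protects it (the counter and loop counts are arbitrary):

        #include <pthread.h>
        #include <stdio.h>

        static long counter = 0;
        static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

        static void *worker(void *arg)
        {
            (void)arg;
            for (int i = 0; i < 1000000; i++) {
                pthread_mutex_lock(&lock);
                counter++;   /* without the lock, increments get lost, and mostly
                                only when the two threads truly run in parallel */
                pthread_mutex_unlock(&lock);
            }
            return NULL;
        }

        int main(void)
        {
            pthread_t a, b;
            pthread_create(&a, NULL, worker, NULL);
            pthread_create(&b, NULL, worker, NULL);
            pthread_join(a, NULL);
            pthread_join(b, NULL);
            printf("counter = %ld\n", counter);   /* 2000000 with the lock */
            return 0;
        }

    On a single processor the unlocked version often appears to work, which is exactly why these bugs tend to surface only when a second core makes the interleaving real.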
  • by magarity ( 164372 ) on Wednesday March 02, 2005 @11:13AM (#11823278)
    networking code on a third core

    The CPU waiting on networking, even at 1 Gbit/s, is like waiting for a raise without asking. It's so little overhead to a modern CPU that using an entire core for it is an exercise in silliness. If you are worried about any overhead associated with network encryption, etc., you can just spend $45 on an upgraded NIC with that capability built into its own logic. The CPU never needs to be bothered.
  • by dasunt ( 249686 ) on Wednesday March 02, 2005 @11:31AM (#11823506)
    One thing that has bugged me for a long time about a lot of games (this has particular relevance to multi-player games, but also to single-player games to some extent) is the 'game loading' screen. Or rather, the fact that during the 'loading' screen I lose all control of, and ability to interact with, the program.

    It has always seemed to me that it should be possible, with a sufficiently clever multi-threaded approach, to create a game engine where I could, for example, keep chatting with other players while the level/zone/map that I'm transitioning to is being loaded.

    You don't technically need multithreading to make the game seem responsive while it's doing something.

    Imagine:

    while ( load_tiny_bit_of_map() )
    {
        if ( check_input() ) { process_input(); }
    }

    Assuming that the function 'load_tiny_bit_of_map()' takes only a few dozen milliseconds, you won't notice it.

    Being multithreaded makes that a bit easier, but other parts of the game may grow in complexity (depending on the game). The reason that's not done is laziness/lack of time/poor feedback. (I always thought there should be a minigame or something while maps load...)
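
    For comparison, the threaded version of the same idea pushes the loading onto a second thread and leaves the main loop free to keep servicing chat or input. A minimal pthreads sketch, where the sleeps stand in for real loading and frame work:

        #include <pthread.h>
        #include <stdio.h>
        #include <unistd.h>

        static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
        static int map_loaded = 0;

        /* Stand-in for the real map loader (hypothetical). */
        static void *load_map(void *arg)
        {
            (void)arg;
            sleep(2);                      /* pretend to stream the level in */
            pthread_mutex_lock(&lock);
            map_loaded = 1;
            pthread_mutex_unlock(&lock);
            return NULL;
        }

        int main(void)
        {
            pthread_t loader;
            pthread_create(&loader, NULL, load_map, NULL);

            for (;;) {
                pthread_mutex_lock(&lock);
                int done = map_loaded;
                pthread_mutex_unlock(&lock);
                if (done)
                    break;
                /* Chat, input handling, or a loading minigame would go here. */
                printf("still responsive...\n");
                usleep(100000);
            }

            pthread_join(loader, NULL);
            printf("map ready\n");
            return 0;
        }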

  • by Anonymous Coward on Wednesday March 02, 2005 @11:38AM (#11823589)
    SPARC has more than one register.

    Please refrain from commenting on things that you do not understand.

    I do not know anything about 17th Century Prussian architecture. Therefore I would not try and argue with somebody that did.

    You know nothing about processor internals. If you had not posted, this would not be so abundantly clear.

    PS. To the people that modded him up - please refrain from breathing from now on. Thanks.
  • by Foktip ( 736679 ) on Wednesday March 02, 2005 @03:56PM (#11826449)
    Yeah right... the consoles will mostly stop piracy (or at least have far less of it than PCs), they'll be many times faster, they connect to the Internet, etc. In short, they're the new "top end" of gaming.

    If they can make the design toolkit (whatever it's called) good enough that programming for the Cell isn't horribly difficult, consoles will win the high-performance gaming market.

    Half-Life 3 for the PC, if it's made, will run into massive bottlenecks: hard-drive read speeds, processor speeds, architecture limitations, etc. But the consoles don't have to worry about compatibility with Window$ (they make their own OS skeleton), so they can optimize everything like mad.

    The only reason the PC has won the top-end gaming market so far is that PCs kept getting faster all the time, while the consoles were stuck on a multi-year release cycle and became outdated. Now that the PC has hit several HEAVY bottlenecks, it doesn't stand a chance. Even with the four-year release cycle of consoles, the PC will not catch up. Not in 4 years, not in 8 years, maybe not AT ALL. At least not until some new-generation, hardware-compatible operating system shows up.

    The PC gaming market could be in trouble. It could even sink to "second rate" as all the FPSes migrate to consoles. On the bright side, Apple computers and Linux will look increasingly feasible.

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein

Working...