Valve's New Direction On Multicore Processors 80

Posted by Zonk
from the hard-not-to-enjoy-this dept.
illeism writes "Ars Technica has a good piece on Valve's about-face on multithreading and multicore programming. From the article: '...we were treated to an unveiling of the company's new programming strategy, which has been completely realigned around supporting multiple CPU cores. Valve is planning on more than just supporting them. It wants to make the absolute maximum use of the extra power to deliver more than just extra frames per second, but also a more immersive gaming experience.'"
  • by kalirion (728907) on Monday November 06, 2006 @03:07PM (#16738901)
    I remember reading of all kinds of bugs in games running on dual-core processors in Windows. Something to do with the OS providing different amount of power to the two cores. Has that been sorted out, or will Valve be compensating in the game engine code?
    • Re: (Score:2, Interesting)

      by oggiejnr (999258)
      A lot of problems can be caused by the lack of a coherent timing signal across even logical, let alone physical, processors. This is the main reason behind cut-off lines, out-of-sync video, and other bugs which affect a lot of older games designed for only one processor.
    • by Enry (630)
      Huh. Flatout 2 (great game!) had a problem on my dual core Athlon 64. When I downloaded and installed the AMD drivers for the CPU, the problems went away. I haven't tried HL2/Episode 1 yet on that CPU, but everything else I have works fine.
      • by ThosLives (686517)

        Anyone else disturbed by the fact that even processors now require drivers?

        • by mlheur (212082)
          Not particularly. Processors already had drivers: an OS. So, who do you trust more to write for that processor, your OS provider or your processor provider?
          • Re: (Score:3, Interesting)

            by ThosLives (686517)

            I've never considered an OS as a 'driver' for a processor; I've always looked at a processor as a fixed-ability piece of hardware, and for good reason: reliability. The more knobs you have on something, the more likely it is to be broken.

        • Your system cannot be aware of the advanced functionality of any device, including CPUs, unless:
          a) It has drivers
          b) It's programmed into the software
          c) It conforms to an existing standard in a more efficient way (aka faster at operation X internally, without any different I/O communication to the machine)

          I'd rather have drivers at the OS level than have code bloat in every app for hundreds of hardware combinations. Besides, I've been compiling Linux kernels specific to my hardware for years, so windows
          • by ThosLives (686517)

            My take is that this 'advanced' functionality doesn't belong in the CPU, as such. A CPU should be a wholly transparent entity that causes information to flow around, not be something that I can poke and prod at (yes, I know how much fun that is). I think perhaps the problem here is that by 'CPU' I really do mean a 'CPU' - a driver for the interlink between CPUs, even if on the same core, is really something different. However, now that I think of it that way, I don't know that I have such a problem with th

            • by dknj (441802)
              What, do you not remember the Pentium F00F bug? This is why microcode exists: to fix problems in the chip design after the chip has been released. This is why you don't see recalls on processors. I'm willing to bet this "processor driver" was nothing more than a microcode update, but I don't care to research it, so take this line with a grain of salt.
      • by Creepy (93888)
        That's not necessarily a multicore problem. With most games, it's probably more likely a problem with the 64 bit architecture running 32 bit code.

        The exception would be if the code is, in fact, multithreaded or if it runs a server and clients as separate processes. Most games don't, or at best use a thread to dynamically load files in the background (and most games that do that are RPGs like Diablo 2 and Gothic 1-3). Even then, assuming that threads are stable, it's highly likely that the problem would c
        • by Enry (630)
          Err... It's a 32-bit OS, 32-bit application. The fact that the CPU supports 64-bit has nothing to do with it (or else the single-core chip I had in previously would have had problems as well).
          • by Creepy (93888)
            Yeah, but it's still an underlying 64 bit architecture that's running in a 32 bit mode. Fixes for bugs to that would be provided by AMD.

                Since you tried a single core, though, it makes me suspect the real problem is with the shared cache between the cores. I can't think of anything else that AMD would patch, at least, that would fix application problems, especially if the cores themselves are the same architecture as the single core chip you had in first.
        • by SnowZero (92219)
          The major problems with threads are locking/starvation (see the dining philosophers problem) and race conditions (you have A and B in separate concurrent threads but A needs to finish before B). Both of these problems are usually caused by coding errors.

          When it first came up that game programmers were mystified with how they were going to use multi-core processors, I didn't understand why it would be as hard as they claimed it would be. I've been writing graphics and multi-threaded software for years in suppo
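The two failure modes named above can be made concrete with a toy sketch (Python used purely for illustration; none of this code is from the thread):

```python
import threading

ITERATIONS = 100_000

def unsafe_run():
    # Race condition: "counter[0] += 1" is a read-modify-write, not
    # atomic, so two threads can interleave and lose increments.
    counter = [0]
    def work():
        for _ in range(ITERATIONS):
            counter[0] += 1
    threads = [threading.Thread(target=work) for _ in range(2)]
    for t in threads: t.start()
    for t in threads: t.join()
    return counter[0]  # may come out less than 2 * ITERATIONS

def safe_run():
    # A lock serializes the read-modify-write, fixing the race -- at
    # the cost of contention, the locking/starvation side of the tradeoff.
    counter = [0]
    lock = threading.Lock()
    def work():
        for _ in range(ITERATIONS):
            with lock:
                counter[0] += 1
    threads = [threading.Thread(target=work) for _ in range(2)]
    for t in threads: t.start()
    for t in threads: t.join()
    return counter[0]  # always exactly 2 * ITERATIONS

print(safe_run())
```

Whether the unsafe version actually loses updates depends on scheduling, which is exactly what makes these bugs hard to reproduce and debug.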
          • by Creepy (93888)
            Most game programmers have none of the above. I've known 3 and the best educated of them has between 2 and 3 years of college, which got him through C++ and then he got hired by Volition (and has moved around since). The other two started working straight out of high school for MECC (the Oregon Trail people) and Wizard Works (or wizworks? - low budget games), both of which were absorbed into larger companies. I lost touch with all three of them in college, but I do still chat on IRC with some other game

      • I haven't tried HL2/Episode 1 yet on that CPU, but everything else I have works fine.

        When you do, let me know if you get the famous audio stuttering bug that my rig can't beat.

        gpu: nVidia GeForce 7100
        cpu: AMD Athlon 64 3500+
        snd: Creative Audigy 2 ZS
        ram: 1.5GB DDR


        • Couldn't have ANYTHING to do with the fact that you have a POS video card, could it?

          Nvidia pulled a fast one on you. The 7100 GS series is a rebranding of the old 6200 TC series [theinquirer.net], and thus has pathetic performance. The old 6200 does not have advanced compression technology (unlike every other chip available today), which means with a pathetic 64-bit bus it runs like a dog.

          I've noticed with Source that the lower your framerate, and the less on-card memory, the more likely you are to encounter sound stutteri
    • Re: (Score:3, Informative)

      by Ford Prefect (8777)
      I've had no problems at all with the original Half-Life, and its sequel, on a dual-core machine [hylobatidae.org].

      From a modding point of view, the Source map compilation tools are fully SMP-aware - so I guess someone at Valve knows about multithreaded programming. Seeing both processors pegged at 100% is great, as is hearing the whooshing noise from my laptop's fans. No belching of flames quite yet, fortunately.

      (Actually, the compilation tools will scale up to running in a distributed manner - apparently at Valve, even the
      • by cnettel (836611)
        One significant problem when doing game code is that you have to keep that >50 FPS frame rate. In such an environment, context switches can be a killer on their own. If you just go ahead and write a schoolbook threaded design, it might work on a quad-core chip, but it will perform far worse on a single-core chip compared to the single-threaded design. You can naturally avoid the context switches by serializing the calls again, but then you realize that you don't really need to copy around all that data
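One common way around the single-core regression described above is to size the worker pool to the machine and fall back to plain serialized calls when there's only one core. A generic sketch, not Valve's engine code (all names here are made up for illustration):

```python
import os
from concurrent.futures import ThreadPoolExecutor

class TaskScheduler:
    """Run tasks on worker threads when spare cores exist, but run
    them inline on a single-core machine, avoiding the context-switch
    overhead of a naive always-threaded design."""

    def __init__(self, cores=None):
        self.cores = cores or (os.cpu_count() or 1)
        # Only spin up a pool when there is more than one core.
        self.pool = ThreadPoolExecutor(self.cores - 1) if self.cores > 1 else None

    def run(self, tasks):
        if self.pool is None:
            # Single core: serialize, same as the single-threaded design.
            return [t() for t in tasks]
        futures = [self.pool.submit(t) for t in tasks]
        return [f.result() for f in futures]

sched = TaskScheduler(cores=1)
print(sched.run([lambda: 1 + 1, lambda: 2 * 3]))  # [2, 6]
```

The results are identical either way; only the scheduling strategy changes with the core count.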
    • by Apocalypse111 (597674) on Monday November 06, 2006 @03:45PM (#16739579) Journal
      For a while, if you played Planetside with a dual-core machine it essentially gave you a speedhack. It didn't affect your rate of fire, but it did affect your rate of movement, and how quickly your COF bloom came back down. While in the lightest armor available, and with the speed implant installed and enabled, it was possible to run almost as fast as a Mosquito (the fastest aircraft available) on afterburners. In a tank you were almost untouchable, and a Galaxy (large air transport craft capable of carrying 12 people including the pilot) could get you and your squad to your target faster than the enemy could respond. It was nuts, but fortunately not much abused, as those caught doing it were frequently reported.
      • That's how it was in Half-Life (1) under certain conditions, likely the same ones.

        Except it did affect your rate of fire in addition to movement. It was full-on speedhacks, no different than the downloadable program. Luckily Valve's anticheat doesn't detect any form of speedhacking, so nobody got globally banned for it, but I'm sure a lot of people got banned from servers for it without knowing how to fix it or even why it was happening.
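Speed bugs like these usually trace back to frame deltas read from per-core timestamp counters that aren't synchronized: when the OS migrates the game thread to the other core, the measured delta jumps, and everything scaled by it (movement, fire rate, bloom recovery) speeds up. A hedged sketch of the usual defense, one monotonic clock plus delta clamping (illustrative only; not any shipping engine's actual code):

```python
import time

MAX_DT = 0.1   # cap a frame delta at 100 ms
MIN_DT = 0.0   # negative deltas (desynced counters) are discarded

def clamp_dt(dt, min_dt=MIN_DT, max_dt=MAX_DT):
    """Clamp a measured frame delta so a timer glitch can't become
    a speedhack: out-of-range values are limited to sane bounds."""
    return max(min_dt, min(dt, max_dt))

class FrameClock:
    """Frame timer based on one monotonic clock rather than a
    per-core timestamp counter."""
    def __init__(self):
        self._last = time.monotonic()

    def tick(self):
        now = time.monotonic()
        dt = clamp_dt(now - self._last)
        self._last = now
        return dt

def integrate(position, velocity, dt):
    # Movement update scaled by the clamped delta: even if the
    # timer reports a huge jump, the entity moves a bounded amount.
    return position + velocity * dt
```

With the clamp in place, a counter that suddenly reports a 100-second frame still only advances the simulation by 100 ms.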
    • I'm currently working on icculus' port of the older game "Aliens versus Predator". I noticed that the darn thing would run fine, ~30fps, on a PIII-700MHz with i810 graphics. But it ran like a dog, On what I thought was an unrelated note, I added the ability to play the game music from Ogg files instead of off CD. For some weird reason, when the game is playing the music from files, the framerate is at least 50fps. If it can't find the files, the framerate drops again. Totally bizarre, but totally repeatab
    • I remember reading of all kinds of bugs in games running on dual-core processors in Windows. Something to do with the OS providing different amount of power to the two cores. Has that been sorted out, or will Valve be compensating in the game engine code?

      It's mainly been sorted out. Out of all the games I own, the only one with dual core issues is Need for Speed Most Wanted.

      List of my games that work fine with dual core:
      1) Warcraft 3
      2) UT2004
      3) Need for Speed Underground 2
      4) Call of Duty 2
      5) Oblivion
      6) Call

  • by Dr. Eggman (932300) on Monday November 06, 2006 @03:18PM (#16739093)
    With Videos! [bit-tech.net] (on the 4th page.)
  • GPU Death (Score:2, Insightful)

    by Thakandar2 (260848)
    I realize this was brought up in the story, but I really do think that when AMD bought ATI, they were looking forward at stuff like this. If AMD started adding ATI GPU instructions to their cores, and you get 4 of them on one slot along with the memory controller, what kind of frame rates and graphical shenanigans will happen then?

    Of course, the problem is that my AMD 64 3200 will do DirectX 8 with my old GeForce, and will do DirectX 10 if I buy the new ATI card after Vista ships, since it's the video card
    • Ah, you have run into the classic problem of integrating too many features into one piece of hardware. Now, instead of just upgrading your video card, you have to upgrade your entire processor and possibly your entire system (new mobo for new CPU pin configuration and new memory due to a different memory control used in the new CPU/GPU). I've always felt that the components you are most likely to upgrade (CPU, GPU, and Memory) should always be separate so you are not forced to upgrade all at once. But of
      • I am sure that if AMD could make it where you could just upgrade the CPU and get value for that upgrade, they would. They don't sell memory or motherboards, and only recently have they gotten a vested interest in GPUs.
      • by Bert64 (520050)
        Often you find that your new GPU requires a relatively fast processor to feed it data quick enough...
        Not to mention newer versions of AGP/PCIe etc...
        I bought an Athlon64 3200 a while back, and pretty soon thereafter upgraded my video card too, since the old one had poor to non-existent 64-bit drivers. And when I bought that video card originally, for a K6-2/400, it provided virtually no performance improvement over the previous card I had, because the system couldn't keep up with it.
      • by webrunner (108849)
        Of course that problem exists even with completely discrete components when new pipelines come out.

        To update my home machine's processor or video card, I have to replace the motherboard:
        - All faster processors worth the money to upgrade to use a different socket
        - All faster video cards worth the money to upgrade to are PCI Express

        If I get a new video card, I have to get a new motherboard to support it, and then I have to get a new processor to fit in the new slot.

        If I get a new CPU, I have to get a new moth
    • by ClamIAm (926466)
      If AMD started adding ATI GPU instructions to their cores, and you get 4 of them on one slot along with the memory controller, what kind of frame rates and graphical shenanigans will happen then?

      Well, AMD Fusion [tgdaily.com] looks interesting. Of course, I also hope that we'll still be able to buy them separately, because it's cheaper to upgrade parts as you go, rather than all at the same time.

      But the possibilities of a hybrid processor are pretty cool. For example, this type of CPU would be awesome in a home theater
      • by cnettel (836611)
        On the other hand, we already know that GPUs and CPUs separate pretty well. The APIs are rather batch-based, right now out of necessity, so you can separate them. When we already have some airflow within the HTPC system, it's not too obvious that it's a good thing to integrate even more processing power onto the same die. Cooling that one might actually be harder than cooling the discrete components. At the very least, it makes sense to keep integrated graphics in the chipset, rather than the CPU.

        I guess s

  • Big deal... (Score:2, Insightful)

    Sounds like pure hype. Basically they're saying they have a two-year technical advantage because they've been working on better multithreading schemes and avoiding deadlocks to put the multiple cores to better use. How groundbreaking... Never mind that most game companies have obviously been working on that too but simply don't talk about it. They also make it sound like they invented distcc... which reinforces the impression that they're just trying to impress people and show how high-tech they are (not). Almos
  • by Coryoth (254751) on Monday November 06, 2006 @03:40PM (#16739463) Homepage Journal
    Debugging multithreaded code can be relatively easy, you just have to start off on the right foot. The best way to do that is to leave behind older concurrency models like monitors with mutexes, which the inventor of that model rejected back in the '80s, and go with more recent concurrency models like CSP [usingcsp.com] (the newer way to do concurrency from the man who brought you monitors). From a more modern perspective like CSP, reasoning about concurrency is a lot easier, and hence debugging becomes much simpler. In fact there are model-checking tools that can verify lack of deadlocks etc. The downside is that it's much easier if you have a language that supports the model, or get an add-on library to do it for you. You can get CSP add-ons for Java: JCSP [kent.ac.uk], and for C++: C++CSP [twistedsquare.com]. Alternatively, languages like Eiffel, Erlang, Occam, and Oz offer more of what you need out of the box - concurrent programming with those languages is easy to get right. Changing languages is, of course, not an option for most people.
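The comment's actual pointers are JCSP and C++CSP, but the flavor of CSP-style code, processes that share nothing and communicate only over channels, can be sketched with thread-safe queues standing in for channels (an illustrative analogue only, not formally CSP-verified code):

```python
import threading
import queue

def producer(out_ch):
    # A CSP-style process: it owns its data and only sends messages.
    for i in range(5):
        out_ch.put(i)
    out_ch.put(None)  # sentinel: channel closed

def doubler(in_ch, out_ch):
    # Another process: read, transform, write. No shared mutable
    # state means no locks and no data races to reason about.
    while True:
        v = in_ch.get()
        if v is None:
            out_ch.put(None)
            return
        out_ch.put(v * 2)

def run_pipeline():
    a, b = queue.Queue(), queue.Queue()
    threads = [
        threading.Thread(target=producer, args=(a,)),
        threading.Thread(target=doubler, args=(a, b)),
    ]
    for t in threads: t.start()
    results = []
    while (v := b.get()) is not None:
        results.append(v)
    for t in threads: t.join()
    return results

print(run_pipeline())  # [0, 2, 4, 6, 8]
```

Because each value has exactly one owner at a time, the output is deterministic regardless of how the scheduler interleaves the threads.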
  • IANAD... But I do dabble a bit and do want to head in that direction professionally (not as a game developer but more towards general applications).

    Should I start cutting my teeth on the concept of multicore programming? Is there enough of an advantage for this doing smaller generalized apps? How does software written with multicore in mind suffer on single-core systems?

    I've been thinking about this more but currently do not have the proficiency to take this as seriously as the general studies I'm doi
    • I would recommend learning how to write multi-threaded programs now. The way CPUs are heading, pretty much everyone will have a dual or quad core in 3-5 years, so I think it would be a wise decision to learn now rather than later.
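A minimal first exercise in that direction might look like the following: a pool-based map that runs unchanged on one core or many (a generic sketch; the function names are made up for illustration):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def work(n):
    # Stand-in for any independent, parallelizable task.
    return sum(range(n))

def run_all(tasks, workers=None):
    # Sizing the pool to the machine means the same code runs on a
    # single core today and picks up extra cores when they appear.
    workers = workers or (os.cpu_count() or 1)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves input order, so results are deterministic.
        return list(pool.map(work, tasks))

print(run_all([10, 100, 1000]))  # [45, 4950, 499500]
```

Structuring programs as independent tasks from the start is the cheap part; retrofitting it into a large single-threaded codebase (as Valve did) is the expensive part.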
  • The whole article talks about how Valve created tools to ease multithreaded programming, but never actually says what they are. There's very little to the article except the obvious statement about how multithreaded code can run faster.

    And the technical content is questionable - for example, the definitions of multithreading problems are somewhat inaccurate.

    • by dknj (441802)
      they spread tasks across all the cores. Valve has maybe a one-month headstart, but let's look at the bigger picture. The PS3 was designed with this in mind and development kits have been out for a while. Of course, you have to design your game differently for a PS3 compared to, say, an Xbox 360 or PC, since you don't have to traverse a series of CPUs before getting to the one you want.

      This is Valve doing smoke and mirrors to hype up the Steam platform, but in reality we'll see games that are designed in a similar
      • by adz (630844)
        That's not a tool though, that's simply a decomposition strategy!
  • It is nice when software is rewritten to take advantage of multiple cores, and I imagine that most new games will be designed to do this. For older games, however, Apple has announced that programs using the OpenGL APIs will automatically spawn a process that feeds the GPU, using a second thread. This means, theoretically, some programs could see up to 2X the performance when run on OS X 10.5 and a dual-core system, without any changes from the developers.

    It's a nice little optimization and hopefully Vista

    • by ClamIAm (926466)
      You probably meant to say this, so I'll add it for clarification: Leopard has code that splits up the OpenGL load between all usable processors. This means not only the GPU but the second CPU core as well.
    • by abdulla (523920)
      I thought they stated that the games had to be written to take advantage of this, it wasn't so much an automatic boost as one that you need to apply to take full advantage of.
      • I thought they stated that the games had to be written to take advantage of this, it wasn't so much an automatic boost as one that you need to apply to take full advantage of.

        Their literature (sparse as it is) strongly implies otherwise, but I don't have any of the NDA stuff so I could be wrong.

  • by EricBoyd (532608) <mrericboyd@yahoo.com> on Monday November 06, 2006 @04:06PM (#16739961) Homepage
    I found this paragraph from the conclusion really interesting:

    "Newell even talked about a trend he sees happening in the future that he calls the "Post-GPU Era." He predicts that as more and more cores appear on single chip dies, companies like Intel and AMD will add more CPU instructions that perform tasks normally handled by the GPU. This could lead to a point where coders and gamers no longer have to worry if a certain game is "CPU-bound" or "GPU-bound," only that the more cores they have available the better the game will perform. Newell says that if it does, his company is in an even better position to take advantage of it."

    This is almost certainly why AMD has bought out ATI - they see that the future is about integrating everything on the motherboard into one IC, and AMD wants the CPU to be that point of integration. For more, see:

    Computers in 2020
    http://digitalcrusader.ca/archives/2006/02/computers_in_20.html [digitalcrusader.ca] which is my prediction for how the whole field is going to evolve over the next 14 years.
    • That won't work. You can crack DES using dozens of high-end 64-bit PCs in several years; using a $300 DES cracking card, you can do it in 4 days. The DES cracking card has 76,000 transistors; the CPU has hundreds of millions. We're talking about dedicated hardware versus something general purpose here; squashing more crap on the same die generates more heat and creates longer logic paths to try and accomplish the same tasks, leading to slower operation. It also decreases modularity (you can't get a "Hig
    • by StikyPad (445176)
      they see that the future is about integrating everything on the motherboard into one IC, and AMD wants the CPU to be that point of integration

      So everything old is new again, eh?

      I don't see the GPU disappearing anytime soon. It's much easier to create an IC that is good at one specific task than a multi-use IC that is just as good at that particular task. As long as there are companies that can outshine the CPU manufacturers' performance in graphics, there will be a market for GPUs. The only exception is
  • This is lame, I have been making the argument that FPS != game benchmark. Everyone always says "look dual core FPS is the same as single core, maybe 1 or 2 faster," and they don't even understand why it's 1 or 2 faster (because the few cycles used for AI and physics are out of the way...). It takes VALVE to come out and say it?

    Yesterday I blew and made the same statements on kerneltrap [kerneltrap.org]. A little unscientific, but should cover the issue nicely.

  • Too Bad... (Score:3, Insightful)

    by ArcadeNut (85398) on Monday November 06, 2006 @06:14PM (#16742863) Homepage
    Too bad as I'll never buy another Valve game because of Steam.
    • Have it your way.

      Myself, I'm off to enjoy the excellent games made by the excellent people at Valve, automatically updated by the excellent automatic patching system Steam provides. Even if you ignore the convenience of the patches, the built-in IM and server browser capabilities, and the media and game tool content, the fact that similar services are springing up here and there has to mean Valve did SOMETHING right.

      You go shuffle your CDs around and Google for patches, finding 5 different ones and, in

      • by Bert64 (520050)
        What happens when/if steam is shut down? Valve won't run it forever, at which point you'll no longer be able to download games from it, and may no longer be able to run the ones you already have.
        I prefer console games, which, due to the lack of ability to patch them, have to be written properly in the first place. Also, since the hardware is always the same, I can take them anywhere, needing only the appropriate console instead of an x86-compatible computer which meets a huge array of other requirements.
        • What happens when/if steam is shut down? Valve won't run it forever, at which point you'll no longer be able to download games from it, and may no longer be able to run the ones you already have.

          Steam includes an option for backing up a hard copy of your games to convenient CD or DVD-sized files. Then, if Steam ever vanishes, unpack the backed-up .gcf files, use a no-authentication crack on it, and you're good to go.

    • You know, I've heard all of this shit about Steam and how gamers hate it and all that, but when I finally decided to install XP on my MacBook Pro so I could play some games, I installed Steam. Using it to get games, install them, update them, run them - it's probably the easiest software I've used in a while, and it's clean and fast. I don't see what the deal is with people. Everyone bitches and moans about it because it makes it (near) impossible to steal the games.
    • by Reapy (688651)
      I will grudgingly admit Steam is OK. I think my initial problem with it was being required to verify my CD key online in order to play.

      My biggest problem is with Source, which makes me want to throw up all over the place! I never made it through HL2, and I really wanted to. :((
  • by GoatVomit (885506)
    If you are on a budget and want to play games you'll probably get more bang for the buck with a single core proc and a better gpu than with the same amount of cash spent on dual core + slower gpu. I've been waiting for this to change for a while but so far it's been more marketing than anything else. I ended up moving the dual core proc to a linux box and single core to windows after a few weeks of testing. Naturally windows chugs more after a game and isn't as responsive but while playing it was hard to no
