Chip Promises AI Performance in Games

Heartless Gamer writes to mention an Ars Technica article about a dedicated processor for AI performance in games. The product, from a company called AIseek, seeks to do for NPC performance what the PhysX processor does for in-game physics. From the article: "AIseek will offer an SDK for developers that will enable their titles to take advantage of the Intia AI accelerator. According to the company, Intia works by accelerating low-level AI tasks up to 200 times compared to a CPU doing the work on its own. With the acceleration, NPCs will be better at tasks like terrain analysis, line-of-sight sensory simulation, path finding, and even simple movement. In fact, AIseek guarantees that with its coprocessor NPCs will always be able to find the optimal path in any title using the processor." Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?
Comments Filter:
  • hm (Score:2, Insightful)

    sounds like it just speeds up existing AI routines..... and existing AI routines, well, SUCK.

    I don't think we are going to get any good AI until it has some method of "learning"
    • Re:hm (Score:5, Funny)

      by Snarfangel ( 203258 ) on Wednesday September 06, 2006 @02:21PM (#16054052) Homepage
      sounds like it just speeds up existing AI routines..... and existing AI routines, well, SUCK.

      This will suck 200 times faster, though. That's like a straw compared to a fire hose.
      • Re:hm (Score:5, Funny)

        by orasio ( 188021 ) on Wednesday September 06, 2006 @02:34PM (#16054157) Homepage
        This will suck 200 times faster, though. That's like a straw compared to a fire hose.

        Fire hoses don't suck. You need a more visual analogy.
        Maybe something like this:

          "That's like a tick compared to your mother!"

      • Re: (Score:3, Informative)

        by Chas ( 5144 )

        Jeeze. What the hell is with the whole accelerator thing now?

        A physics accelerator which does jack and shit when compared to mid-to-high-end graphics solutions. It's offloading some of the CPU load, sure. But at high-res, the CPU is NOT the bottleneck.

        A network accelerator which is going to do jack and shit. It's offloading some of the network processing from the CPU, sure. See "the CPU is not the bottleneck". Sure, some people are going to build apps for the embedded Linux. Great.

        Now an AI accel

    • Re:hm (Score:5, Interesting)

      by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Wednesday September 06, 2006 @02:25PM (#16054091) Homepage

      I read the blurb this morning. The idea is that it accelerates the basic operations that everything uses (line of sight, path finding, etc.). The more complex AI (actual behavior, planning, etc.) is built the normal way. It simply offloads the CPU and thus allows faster calculations.

      The other real difference is that it is better than current algorithms. So instead of using A* for pathfinding, it works correctly even on dynamically changing terrain. This would mean things like no longer having NPCs getting stuck on rocks or logs or some such (*cough* Half-Life 1 *cough*).

      • by SnowZero ( 92219 )
        it is better than current algorithms

        You do realize that A* is from 1968, and that things have improved a bit since then, right? It is better than algorithms used in current video games, maybe, but that's as far as I'd take it.
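
        For reference, a minimal sketch of the kind of low-level grid pathfinding primitive being discussed, here as plain A* with a Manhattan-distance heuristic (the grid, coordinates, and cost model are made up for illustration, not anything AIseek has published):

          import heapq

          def astar(grid, start, goal):
              """A* over a 4-connected grid; grid[y][x] == 1 marks a blocked cell."""
              def h(p):  # Manhattan distance: admissible, so the returned path is optimal
                  return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

              open_heap = [(h(start), start)]
              came_from = {start: None}
              g_cost = {start: 0}
              while open_heap:
                  _, node = heapq.heappop(open_heap)
                  if node == goal:
                      path = []
                      while node is not None:   # walk parent links back to the start
                          path.append(node)
                          node = came_from[node]
                      return path[::-1]
                  x, y = node
                  for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                      nx, ny = nxt
                      if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                          ng = g_cost[node] + 1
                          if ng < g_cost.get(nxt, float("inf")):
                              g_cost[nxt] = ng
                              came_from[nxt] = node
                              heapq.heappush(open_heap, (ng + h(nxt), nxt))
              return None  # goal unreachable

          grid = [[0, 0, 0, 0],
                  [1, 1, 0, 1],
                  [0, 0, 0, 0]]
          print(astar(grid, (0, 0), (3, 2)))  # (x, y) coordinates

        The point of a coprocessor (or of an incremental, D*-style planner) would be re-running searches like this for hundreds of agents per frame as the terrain changes, which is exactly the part a CPU-bound game budget usually skimps on.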
    • Re: (Score:3, Insightful)

      by Da Fokka ( 94074 )

      sounds like it just speeds up existing AI routines..... and existing AI routines, well, SUCK.


      That's largely because most CPU cycles go to the pretty graphics. More computing power might help the AI in some games (although many AI routines are basically flawed anyway). This chip offers a more powerful tool to the AI programmer. It's still up to him to make an AI that's not totally stupid.

    • I really enjoy letting the marines in Halo 2 do the driving because they kind of understand that it's bad to get near cliffs, but they kind of don't understand it as well. The MC does the driving from here on out.
    • by EatHam ( 597465 )
      Good, now maybe that stupid Dr. Sbaitso will be able to give me some answers that make some god damned sense.
    • Right, this "AI enhancer" thing is bullshit. Specialized offload cards are only useful when the offloaded task is limited by the ability of the processor to perform specific algorithms at a high speed: e.g. 3D video, TCP packet reassembly, and (maybe) game physics.

      AI is not that kind of task. No one has convincingly demonstrated that you can make "smart" human-like AI simply by throwing more processor power at it. Even for a comparatively simple deterministic game like chess, the brute-force approach will
      • Re:hm (Score:4, Interesting)

        by SnowZero ( 92219 ) on Wednesday September 06, 2006 @07:47PM (#16056281)
        The problem with the state of AI today is not that the algorithms are too processor-intensive, it's that they flat-out suck.

        Please don't take what you see in games to be state of the art. Watch this video [youtube.com] of my RoboCup team and tell me that AI still completely sucks. You'll see two teams playing 5 on 5 soccer with zero human input, i.e. fully autonomous. Game AIs may suck, but that's because their AI programmers are graphics people trying to do AI. The result looks about as good as me trying to make a state of the art graphics engine.

        The only reasonable application of hardware AI acceleration that I can think of would be a massively parallel chip that runs thousands or millions of neural net nodes at once... but this would mainly be a benefit for academic AI research, not for games.

        Neural nets died down as a fad in academic circles almost 10 years ago. There's a common saying that "Neural nets are the second best way to do everything." ... meaning that if you analyze a problem, some other approach almost always turns out to work better. They are a reasonable approach for unstructured classification problems that aren't fully understood, but after some analysis other approaches almost always take over. This has been the case with things like OCR and face recognition.

        I'm pretty sure that most games use simple "heuristic" algorithms for AI, rather than anything complicated like neural nets or Bayesian learning or SVM.

        NNs, Naive Bayes, and SVMs are all classifiers (and often slow ones at that). They aren't really directly applicable for defining policies for an agent, so they don't get used much (as well they shouldn't). Most agent decision systems use a combination of hierarchical finite state machines (FSMs), planning on a small set of high level actions, and motion planning.

        Games tend toward the absolute simplest of FSMs with only binary inputs, and yield the expected highly rigid behavior in a given situation. For the most part, they don't even use randomness, which is absolutely necessary in any kind of situation where one player is trying to outguess another. I've heard that non-RTS games only budget about 1% of the CPU to AI, and it shows. Rich FSMs, action-based planning, and proper motion planning get swept aside, and that is unfortunate. However the coming multi-core revolution may offer some hope. Game programmers are having trouble splitting up graphics routines, so it might be that AI can get the core or two that it deserves when we hit quad-core CPUs. Due to the many algorithms, AI benefits from general purpose CPUs, and parallelizes quite well.
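
        As a point of reference, here is a toy version of the kind of flat FSM-plus-randomness being described; the guard states, sensor inputs, and thresholds are invented purely for illustration:

          import random

          # Hypothetical guard NPC: a flat FSM with a dash of randomness so the
          # agent isn't perfectly predictable in identical situations.
          TRANSITIONS = {
              "patrol": lambda s: "chase" if s["enemy_visible"] else "patrol",
              "chase":  lambda s: "attack" if s["enemy_dist"] < 2.0
                                  else ("chase" if s["enemy_visible"] else "patrol"),
              "attack": lambda s: "flee" if s["health"] < 0.25 and random.random() < 0.7
                                  else ("chase" if s["enemy_dist"] >= 2.0 else "attack"),
              "flee":   lambda s: "flee" if s["enemy_visible"] else "patrol",
          }

          def step(state, senses):
              """One AI tick: pick the next state from the current one plus sensor input."""
              return TRANSITIONS[state](senses)

          senses = {"enemy_visible": True, "enemy_dist": 1.5, "health": 0.2}
          state = "attack"
          for _ in range(5):
              state = step(state, senses)
              print(state)

        The randomness is the cheap part; a richer system would layer planning on top of states like these.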

        Whether enough real AI people will ever get hired to do games right remains to be seen. At the moment it seems even primarily systems companies like Google are more interested in AI people than game companies.
        • Re: (Score:3, Interesting)

          by MoxFulder ( 159829 )

          The problem with the state of AI today is not that the algorithms are too processor-intensive, it's that they flat-out suck.

          Please don't take what you see in games to be state of the art.

          Sorry, didn't mean to cast aspersions on AI in general... merely game AI. My point was that game AI algorithms are hopelessly lame, so speeding them up isn't going to help. I'm very much aware of more impressive AI efforts, such as in RoboCup. In games, AI often seems to be an afterthought, whereas in RoboCup AI is prett

    • Re: (Score:3, Interesting)

      by beaverfever ( 584714 )
      "I dont think we are going to get any good AI until it has some method of "learning""

      That would be great, but in the meantime other approaches are available. Going back several years, Unreal Tournament had the capability of customising individual bots' gameplay style with several different parameters beyond mere difficulty levels, and I found that this was very much worth the effort. A large library of custom characters could be saved, and they could be selected randomly to add unpredictability to gameplay
  • Al? (Score:5, Funny)

    by gEvil (beta) ( 945888 ) on Wednesday September 06, 2006 @02:14PM (#16053993)
    Who is Al and why do I want him controlling everything in my games?
  • What may occur is a separate box consisting of the GFX card, physics card, AI card, and PSU for the above, along with supporting memory modules, just to power existing games. Multiple cards consisting of multiple chips with multiple cores will likely overpower the common case. Thus for the hardcore gamer, a separate box wired to the main rig could be the norm. So for the average home user we will get smaller and smaller (Mac mini et al.), but for the gamer we'll see modular systems, with multiple boxes and mu
    • by gEvil (beta) ( 945888 ) on Wednesday September 06, 2006 @02:21PM (#16054056)
      And then they can take everything and put it all in a big case with a monitor and speakers and a special panel with the controls on it. And then all you need to do is put a slot in the front that says $1.
    • by gregmac ( 629064 ) on Wednesday September 06, 2006 @02:25PM (#16054090) Homepage
      What may occur is a separate box consisting of the GFX card, Physics Card, AI card, PSU for the above along with supporting memory modules just to power existing games.

      what an [microsoft.com] interesting [nintendo.com] idea [playstation.com].
      • Re: (Score:3, Interesting)

        by MBGMorden ( 803437 )
        Ok, link speak is annoying. Don't do that ;).

        The systems you mention though are all scaled down computers. They don't really have any extra hardware that a standard computer doesn't have. The GP's comment seemed to be talking about putting all the "extra" hardware out of the case, which doesn't fit your model of just making a smaller and more focused computer.
    • Program the physics, graphics and AI routines into hardware. Offload the processing onto the FPGA. Call it a generic games accelerator. The games developers could then optimise their own libraries of hardware routines for their games rather than trying to optimise the games for general purpose hardware.

       
      • Re: (Score:3, Informative)

        by philipgar ( 595691 )
        First, graphics won't work on an FPGA. I mean technically it's feasible, but the demand for graphics is great enough to make it economical to produce graphic ASICs such as those Nvidia and ATI produce.

        However the FPGA idea is a good one, and is being researched. Actually, what is even more interesting is to utilize transistors on the CPU die to integrate reconfigurable hardware accelerators. The research is being done currently, and will allow for CMP + reconfigurable systems so that custom processors
  • Well.. (Score:4, Funny)

    by geniusj ( 140174 ) on Wednesday September 06, 2006 @02:15PM (#16054003) Homepage
    Well if Chip promises it, I believe him..
  • Not Gonna Work (Score:5, Insightful)

    by TychoCelchuuu ( 835690 ) on Wednesday September 06, 2006 @02:15PM (#16054005) Journal
    The physics card could theoretically work because if the player doesn't have it, you could always leave out some of the eye candy and only calculate fancy physics for objects that affect gameplay. With an AI card, you don't have that luxury. Either the player has it, or you have to just dump all the AI (obviously not) or do it all on the CPU, which raises the question: why program your game for a dedicated AI card if you're just going to have to make it work on computers without one?
    • Re: (Score:3, Interesting)

      by arthurh3535 ( 447288 )
      Actually, they could do something similar to the graphics, just allowing for "weaker" AI routines that can work on a standard system.

      It will be interesting to see if games are "more fun" with smarter AI or if AI really isn't the big and important thing about making interesting games.
      • by mmalove ( 919245 )
        Interesting indeed. I think today's social gamer is more inclined to play against a human opponent, no matter how "humanlike" you can make a bot/AI. There's just no "I one-upped you" thrill from outplaying a computer.
        No warm glow of teamwork from your AI henchmen helping you through a challenging encounter.

        However, in any situation where you do have computer controlled entities, a miserably bad AI can detract very much from the game. I hate to use it as an example - but look at World of Warcraft. If a mob
      • Thing is (Score:4, Insightful)

        by Sycraft-fu ( 314770 ) on Wednesday September 06, 2006 @05:19PM (#16055391)
        That will drastically alter gameplay. You'd literally have to design the game twice, once for dumb AI, once for smart AI. As an example, look at the difference between the original Doom, and Doom 3. While Doom 3's monsters aren't brain surgeons, they are smart enough to sneak around, take cover, etc. If you were to apply those tactics to the massive numbers of monsters in the original Doom, you'd have a near impossible game. Likewise if you put the dumb, "walk straight at the player" AI in Doom 3, the challenge would be gone.

        Now this is just the case with a game where AI is fairly unimportant in the scheme of things. In a game that relies heavily on the AI, say one where squad tactics are used, it'd be a nightmare. With the card you have highly competent teammates that practically complete a mission for you; without it you have guys stepping on their own grenades, things like that.

        AI also has the problem of being different for different games. I'm sure the AI process for an imp in Doom 3 is nothing like the AI process for an enemy civ in Civilization 4. Thus I don't know that there's a way you can provide a more "optimised" kind of chip for it. Graphics accelerators work because you can design a chip that's highly specialized. They'd suck as CPUs, and in fact until very very recently weren't even Turing complete. However, since graphics is always the same kind of thing, they can be optimised to do certain things very fast. I just don't think that's the case with AI, since there are so many kinds of AI one might need.
    • by MBCook ( 132727 )

      But that only affects multiplayer. In single player, you can just use the dumber routines if they don't have the card. This especially applies to small creatures who aren't doing much more than pathfinding. In a GTA-type game you can just put more people/cars on the street. There are circumstances where it would be perfectly possible.

      The biggest problem is multiplayer where you basically have to have everyone require it or force everyone to use software.

    • by lawpoop ( 604919 )
      It sounds to me like the same problem with the physics card. You still have to code default behavior for objects if they don't have a physics card. You don't get default behavior for free, only errors and crashes.

      So if the player doesn't have an AI card, you turn off some of the 'mind candy', and have stupider enemies.
    • Re:Not Gonna Work (Score:5, Interesting)

      by Dr. Spork ( 142693 ) on Wednesday September 06, 2006 @02:48PM (#16054280)
      I think this is the right question, but it may have an interesting answer. Maybe the way they picture future game AI is similar to present-day chess AI: Chess games evaluate many potential what-if scenarios several moves into the future, and select the one with the best outcome. Clearly, the more processing power is made available to them, the more intelligently they can play.

      Maybe future RPG AI could have some similar routines regarding fight/flight decisions, fighting methods and maybe even dialogue. But that would require a pretty universal processor, which really just argues for getting a second CPU. I don't have much hope of this catching on, but I'd welcome it. For one thing, writing AI that can run in a separate process from the rest of the game is something I'd love to see. I want something to keep that second core busy while I'm gaming!

      Plus, it would be pretty cool for hardware manufacturers if AIs really got smarter with better hardware (be it CPU or add-in card). That would require big coding changes from the way AI is written now, but I do think those would be changes for the better.
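
      To make that concrete, here is a stripped-down sketch of the chess-style "evaluate the what-ifs a few moves ahead" idea applied to a fight/flight choice; the action set, damage numbers, and scoring are invented for illustration:

        # Hypothetical turn-based duel: the NPC looks a few plies ahead and picks the
        # action whose worst-case outcome against a best-responding player is highest.
        ACTIONS = ("attack", "defend", "flee")

        def apply(state, actor, action):
            npc_hp, player_hp = state
            if action == "attack":
                return (npc_hp, player_hp - 3) if actor == "npc" else (npc_hp - 3, player_hp)
            if action == "defend":
                return (npc_hp + 1, player_hp) if actor == "npc" else (npc_hp, player_hp + 1)
            return state  # flee: nobody takes damage this turn

        def score(state):
            npc_hp, player_hp = state
            return npc_hp - player_hp  # higher is better for the NPC

        def minimax(state, depth, npc_to_move):
            if depth == 0 or min(state) <= 0:
                return score(state)
            if npc_to_move:
                return max(minimax(apply(state, "npc", a), depth - 1, False) for a in ACTIONS)
            return min(minimax(apply(state, "player", a), depth - 1, True) for a in ACTIONS)

        def choose(state, depth=4):
            return max(ACTIONS, key=lambda a: minimax(apply(state, "npc", a), depth - 1, False))

        print(choose((10, 12)))  # NPC at 10 hp deciding against a player at 12 hp

      Deeper lookahead is exactly the kind of thing that scales with raw compute, whether that compute is a second core or an add-in card.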

      • Re: (Score:3, Insightful)

        by izomiac ( 815208 )
        Computing possible outcomes is only one of the strategies humans use to make a decision. For chess that may be enough for a decently human-acting AI. But for most games such an AI would either be terrible or impossible to beat (at which point it would be programmed to make mistakes and thus go back to being terrible). Throwing more processing power at the problem won't fix it, though it might make the AI slightly better. What's really needed is a better approach to creating AIs.

        A learning neural network
        • Re: (Score:3, Interesting)

          by Dr. Spork ( 142693 )
          If you're advocating that AI only work with the information and "reflexes" available to human players, I wholeheartedly agree. I'm not as scared as you of AI that's too good, because it's always easier to make something worse than better. AI that's too good can be downgraded in many interesting, humanlike ways - like simulated dispositions to panic, or freeze, waste ammo, needlessly conserve ammo, get too close before firing, etc. Basically, you just need to observe what imperfect players do and tell an out
    • by j1m+5n0w ( 749199 ) on Wednesday September 06, 2006 @02:56PM (#16054339) Homepage Journal
      why program your game for a dedicated AI card if you're just going to have to make it work on computers without one?

      Perhaps the card could be most useful not on the client, but in dedicated mmorpg servers. I know WoW could definitely use some smarter mobiles. Sometimes I think whoever designed the AI was inspired by the green turtles from Super Mario 1. I'd like to see games with smarter mobs and NPCs, and any game with a realistic ecology (for instance, suppose mobs don't magically spawn, they procreate the old fashioned way, and must eat food (a limited resource) to survive) would require many more mobs than a WoW-like game in order to prevent players from destroying the environment. Simulating millions of intelligent mobs would likely be very expensive computationally.

      • Stupid question... (Score:4, Interesting)

        by UbuntuDupe ( 970646 ) on Wednesday September 06, 2006 @03:11PM (#16054463) Journal
        ...that I've always wanted the answer to from someone who knows what they're talking about:

        For the application you've described, and similar ones, people always claim it would be cool to be able to handle massive data processing so you could have lots of AI's, and that would get realistic results. However, it seems that with *that many* in-game entities, you could have gotten essentially the same results with a cheap random generator with statistical modifiers. How is a user going to be able to discern "there are lots of Species X here because they 'observed' the plentiful food and came and reproduced" from "there are lots of Species X here because the random generator applied a greater multiple due to more favorable conditions"?

        I saw this in the game Republic: the Revolution (or was it Revolution: the Republic?). It bragged about having lots and lots of AI's in it, but in the game, voter support in each district appeared *as if* it were determined by the inputs that are supposed to affect it, with a little randomness thrown in. The AI's just seemed to eat up cycles.

        Long story short, aren't emergent results of a large number of individual AI's essentially the same that you would get from statistical random generation?
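
        One way to see the question: the two approaches below (all numbers invented) land on roughly the same population for Species X, but one simulates every creature each tick and the other just draws from a distribution driven by the same inputs:

          import random

          def agent_based(food, n0=100, ticks=50):
              """Simulate every creature: each one breeds or dies with food-dependent odds."""
              pop = n0
              for _ in range(ticks):
                  births = sum(1 for _ in range(pop) if random.random() < 0.02 * food)
                  deaths = sum(1 for _ in range(pop) if random.random() < 0.015)
                  pop += births - deaths
              return pop

          def statistical(food, n0=100, ticks=50):
              """Skip the agents: apply the same per-tick rates in aggregate, plus noise."""
              growth = 1 + 0.02 * food - 0.015
              return round(n0 * growth ** ticks * random.uniform(0.9, 1.1))

          print(agent_based(food=1.0), statistical(food=1.0))  # similar answers, very different cost

        The visible difference only shows up when individual agents interact with the player or with each other in ways an aggregate model can't capture; that is the part worth spending cycles (or silicon) on.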
    • So far, those seem really short of AI. Maybe because they have less computing power per player (that server farm must be affordable). With dedicated AI cards for the servers, MMORPGs might be able to catch up to newer single player games that have at least half-decent AI.
    • On games I've worked on in the past, we had a global strategizing algorithm that ran once every few seconds (over the course of a bunch of frames), more localized map sectional AI that ran slightly more frequently, per-unit pathfinding that ran (incompletely) every second, and moment-to-moment movement that ran every frame.

      Now, if we could run all of those AI routines every frame, the game would appear a bit smarter. It wouldn't have a delay upon reacting to stimulus, the pathfinding could run a character
    • Yeah, and all games coming out today will run in software renderers. Even if it is not really plausible today... tomorrow there may be enough penetration to make it like video cards, where you have to have a high-end one to play video games.
    • by gutnor ( 872759 )
      Or it would be the same as for graphics cards. In the old days of 3dfx 3D cards, games used to ship with multiple engines: an accelerated engine and another 100% software engine.

      But the main difference with graphics cards is that there was an obvious difference between not having the card and having it. (At the time that meant jumping from 320*200 to 640*480 with more eye candy. Also, 3D games were the brand new fashion of the time.)
      For physics cards, that's already another matter. In current demos, you can't s
  • by w33t ( 978574 ) on Wednesday September 06, 2006 @02:18PM (#16054028) Homepage
    Something that's always bugged me a bit about expansion boards is that the experience can only be enjoyed by the user with the board.

    For instance, in a multiplayer game, some players will obviously be getting better graphics than the rest - but often the maps are tailored to work equally well (or at least as equally as possible) to low-end and high-end video cards.

    And then there is this new PhysX card - which sounds like a neat idea, but you have the same kind of situation. You can make the physics look a bit better for the player with the card - but all actual physical actions must be reproducible for the non-card-having players.

    Now, here is where I think the AI card could be different: distributed processing.

    Let's take two human players and 4 AI players in a multiplayer game. Normally the server would be responsible for the AI decision-making processing and would pass to the clients only the x,y,z movement and animation data as a network stream. The AI thinking would take place completely free of the client machines. This puts strain on the server's resources.

    Now, imagine rather than the server processing and the clients receiving network info, you were to turn this on its head.

    Have the clients process a subset of the AI - say, 2 AI for player 1's machine, and 2 AI for player 2's machine. Now both clients will send the AI's movement information to the server. From the server's point of view the AI would require the same processing power that a regular human player would require (very little - relatively speaking).

    With the plethora of bandwidth available client-side these days I think this kind of idea is very realistic.
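
    As a rough sketch of what that client-side split could look like, assuming the client "owns" two NPCs and only ships their resulting positions upstream (the message format and the steering are placeholders):

      import json, random

      # Hypothetical split: this client runs the AI for two NPCs locally each tick
      # and sends only the resulting movement to the server, which treats it like
      # any other player's movement update.
      owned_npcs = {7: {"x": 10.0, "y": 4.0}, 9: {"x": -3.0, "y": 0.5}}

      def tick(dt):
          updates = []
          for npc_id, pos in owned_npcs.items():
              pos["x"] += random.uniform(-1, 1) * dt  # placeholder for real steering/AI
              pos["y"] += random.uniform(-1, 1) * dt
              updates.append({"id": npc_id, "x": round(pos["x"], 2), "y": round(pos["y"], 2)})
          return json.dumps({"type": "npc_moves", "moves": updates})  # what goes on the wire

      print(tick(0.05))  # a few dozen bytes per tick -- trivial for client upstream bandwidth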
    • Re: (Score:2, Insightful)

      by Anonymous Coward
      Having worked in the industry on a MMORPG, I agree that it would be nice to use client machines for extra distributed processing, but there are issues.

      First, as a rule of thumb in multiplayer development, never trust the client machines for anything other than controller and view data for that player. The client machine is hackable, unlike (supposedly) your server. They can wrap .dll's so that they can modify and view data, in your case data that they may not legitimately have access to, such as "what's bei
      • by SnowZero ( 92219 )
        We thought about this a fair bit where I worked. We decided that it just wasn't doable.

        One thing I could imagine doing for a game is offloading just the path planning. You could make it a fairly dynamic thing where no particular client has responsibility for an NPC. Each client would receive a source and target position for some NPCs, and the client could plan paths and send back a nearby waypoint for each NPC that will take it partway to its goal. You could assign multiple client computers the same prob
    • That makes it a lot easier to cheat and make the AI you run either super-smart or braindead. If you can do that, you just solo and wipe the floor with brain-dead AIs until you have enough treasure/levels/skillz to take on anyone.
  • by Anonymous Coward on Wednesday September 06, 2006 @02:20PM (#16054050)
    In fact, AIseek guarantees that with its coprocessor NPCs will always be able to find the optimal path in any title using the processor.

    Aren't many problems of that ilk NP-complete?

    • True, finding the optimal path is often NP-complete, but if the AI card decreases the constant factor in the majority of cases, this could still be a win.

      • by 26199 ( 577806 )
        Hmm. Path finding isn't NP complete -- polynomial, at worst. So it's quite solvable.

        It's only NP complete if you have some weird requirement like 'visit all cities' as in the Travelling Salesperson Problem.

        And anyway in general if a problem is NP complete then people can't do it either, so you don't need a full solution...
  • by MaineCoon ( 12585 ) on Wednesday September 06, 2006 @02:22PM (#16054058) Homepage
    The product, from a company called AIseek, seeks to do for NPC performance what the PhysX processor does for in-game physics.


    They want to completely ruin game performance by killing the PCI bus bandwidth and causing the GPU to stall waiting on the position/orientation and generated geometry that it will have to render?

    Physics and AI coprocessors are 2 years too late - with the increasing availability of dual core processors in even midrange consumer systems now, and quad core on the horizon, engineering time is much better spent on making an app multithreaded so that it runs efficiently on hyperthreaded and dual core machines, instead of trying to offload it to a coprocessor that few customers will have. For a consumer, it is a better investment to spend an extra $50 to $100 for a dual core processor than spend $300 on a physics or AI coprocessor.

    I doubt, and openly mock, their claims of '200x' speedup. I imagine it will be more like speeding up the process of $200 leaving foolish consumers' wallets.
    • ya, SMP anyone? (Score:2, Insightful)

      I agree. Intel just released dual core chips, AMD has them already and is about to release quad core chips, plus we have -cheap- dual processor boards available. That'd be eight cores, as soon as AMD releases their new kit.

      Even Windows ships with SMP support; we have processing capability out the wazoo, pretty much. It should be able to handle any AI requirements, I'd think, and have room to balance your checkbook at the same time.

      Some clever lad should be able to design a bot that doesn't do the same t
    • They want to completely ruin game performance by killing the PCI bus bandwidth

      Positional updates to a character in the game are very low bandwidth - I mean, MMOs do this all the time and don't saturate network connections, much less a PCI bus. The calculations are heavy, but the input and end result are just a few numbers, plus a terrain map you would load once and forget until you zone, at which time a little latency is happening anyway.

      causing the GPU to stall waiting on the position/orientation a
      • You either misunderstood what I meant (I was mocking what Ageia PhysX did for physics; the card is trash, and games that utilize it have suffered an average 25% framerate drop due to the extra geometry being handled, for a variety of reasons)... or you aren't aware of the problems with the PhysX card that I just mentioned. Hit up some sites and look at the benchmarks for when the card is used.
    • Re: (Score:3, Informative)

      by j00r0m4nc3r ( 959816 )
      I fully believe their claim is totally realistic. With a dedicated circuit to process A* or Dijkstra's algorithm (or solve generic network traversal problems) you could very easily beat a general-purpose processor by 200x. While computer CPUs are very good at doing a lot of different things, they generally suck at doing specific things extremely fast. A dedicated DSP chip for example can easily outperform a general-purpose processor doing a DSP subroutine by 200x. If they can make these things cheap enoug
    • Re: (Score:2, Interesting)

      by Targon ( 17348 )
      PCI is on its way out; PCI Express is the next stage, or HTX (HyperTransport slot).

      Dedicated co-processors are a good idea, the problem is the costs involved. AMD is pushing for these companies to just make their new co-processors work with an existing socket type so instead of trying to sell cards(which cost more money to make due to the PCB), we will buy just the chip itself.

      To be honest, this is a better way to go since if a GPU were implemented in this way, you could easily just buy a GPU, toss it on
    • Physics and AI coprocessors are 2 years too late - with the increasing availability of dual core processors in even midrange consumer systems now, and quad core on the horizon, engineering time is much better spent on making an app multithreaded so that it runs efficiently on hyperthreaded and dual core machines

      I agree that multithreaded game engines are probably the wave of the future, but I would still love to see these physics and AI co-processors integrated onto video cards. Expecting gamers to pay $20
      • by k_187 ( 61692 )
        well, there will always be people that will buy crap so they can be the 1337est. I'd say that getting MS to put this stuff into DirectX and then getting developers to use it will spur enough adoption of the cards to bring prices down. It wasn't that long ago that GPUs weren't needed. It took Quake and Voodoo together to get people to realize what the difference could be. What are these people doing to illustrate that difference?
  • Will games powered by specific pieces of hardware become the norm?
    Many games have been over the past three decades or so. They're known as console games.

    If things continue in this direction, it looks like we may be buying game consoles to hook to our computers instead of our televisions.
    • it looks like we may be buying game consoles to hook to our computers instead of our televisions.

      Will video game consoles for computers come with the same systematic bias against smaller game developers that video game consoles for televisions have traditionally come with?

  • The product, from a company called AIseek, seeks to do for NPC performance what the PhysX processor does for in-game physics.

    Damn. And I hoped it'd actually be useful for AI.

    The problem with PhysX is that it costs about as much as a mid-range graphics card and yet adds the kind of performance gains of a first-gen graphics card. Whilst GPUs are massively evolved compared to first-gen offerings, PhysX in its first-gen state is a really expensive nice little add-on.

    Don't get me wrong, when physics processors and AI process
  • by Rhys ( 96510 ) on Wednesday September 06, 2006 @02:27PM (#16054101)
    Since the MHz jumps of the past seem to be by and large behind us these days and we're looking at more and more cores, isn't it time that games become multithreaded and offload that nasty pathing work to a second core? Sure, you could buy stupid shiny cards for the game physics and AI and network (some sort of network booster that avoids the OS's TCP stack -- posted a while back, I believe), or alternatively just make use of the extra hardware that /will/ be in the box anyway.

    Now, the decent AI toolkit that folks can license might be worth it anyway, when they figure out they should just run it on the CPU instead of their custom CPU-like-thing.
  • One might think that the future is piecemeal, given the PhysX card, this thing and the even more ridiculous Killer(TM) NIC, but there's a few small things that the would-be bandwagoneers developing these things don't want to think about.

    The first is money. A serious gamer who likes his bells and whistles might be expected to spend several hundred dollars every year or two, in order to make his games run at their prettiest and fastest. He still has a finite budget, though-- asking him to spend a similar am

  • Slap this into the Xbox 720 or PS3/4 and you get a mondo increase in NPC performance, as long as the developers put in some rudimentary "learning" routines to keep things random. All gamers buying games that use that NPC chip get to enjoy the fun. Not so for the PC gamers. For PC gamers, game companies that make games for the most elite configurations - namely, those requiring the PhysX processor and this one - will have a lower percentage of sales among PC owners.
    • Meh, if this became popular, it'd just be another required card for a gamer PC, much like the 3d accelerators.
  • I could see something like this used to lower the costs, and increase the scale of games like Everquest/World of Warcraft. Those games have dedicated server machines running AI's 24/7, for profit.
  • way of the present (Score:2, Interesting)

    by krotkruton ( 967718 )
    Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?

    Short-run, maybe; long-run, no. IMHO, things will consolidate like they always seem to do. Video cards are necessary for more than just games, so they won't be going anywhere. Physics and AI cards seem to be useful for nothing but games. It would be foolish to combine all video cards with physics and AI chips because not everyone plays games, but why not combine the physics and AI chips?
  • by bockelboy ( 824282 ) on Wednesday September 06, 2006 @02:39PM (#16054198)
    One begins to wonder what the "endgame" scenario is for the respective manufacturers of the physics and AI cards we're seeing. I can foresee three distinct situations:

    1) The CEOs, investors, and engineers are complete idiots, and expect all the gamers of the world to buy separate physics, AI, and graphics cards
    2) They're hoping to provide chips to ATI or nVidia for a "game card" instead of a "graphics card", the next generation of expensive purchases for gamers
    3) They're hoping to provide chips for the nextgen xbox / playstation / wii, hoping that their chips will be the ones to make gaming interesting again.
  • by Yvan256 ( 722131 ) on Wednesday September 06, 2006 @02:43PM (#16054239) Homepage Journal
    Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?
    If you look at the Amiga, I think it had a CPU or co-processor for almost everything...

    As for this new thing "doing the same thing as the PhysX processor", we'd have to see this PhysX processor in action (and on the market) first, wouldn't we?

  • I assume you could accelerate A* with a dedicated chip - but that's only a relatively small part of AI. Or you could accelerate neural networks, but most games I know use relatively plain state machines.
    I'd move the pathfinding onto another thread, and with the gaining popularity of multi-core architectures you should get the same effect. That way you'd share most of the resources with the rest of the system, and wouldn't have to worry about sending everything over the bus to another card.
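
    A bare-bones sketch of that "pathfinding on its own thread/core" arrangement, with a stubbed-out search standing in for the real planner:

      import queue, threading

      # Game thread posts path requests; a worker thread answers them asynchronously,
      # and each NPC keeps following its old path until the new one shows up.
      requests, results = queue.Queue(), queue.Queue()

      def find_path(start, goal):
          return [start, goal]  # stand-in for a real A*/D* search over the navmesh

      def path_worker():
          while True:
              npc_id, start, goal = requests.get()
              results.put((npc_id, find_path(start, goal)))
              requests.task_done()

      threading.Thread(target=path_worker, daemon=True).start()

      # Game thread: fire off a request, keep simulating, collect the answer later.
      requests.put((42, (0, 0), (12, 7)))
      requests.join()  # a real game loop would poll instead of blocking here
      while not results.empty():
          npc_id, path = results.get()
          print(f"NPC {npc_id} new path: {path}")

    Nothing here needs special hardware; it just needs a core that isn't already busy drawing the frame.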
  • by ThomasBHardy ( 827616 ) on Wednesday September 06, 2006 @03:07PM (#16054433)
    Ok I do know I should be more tolerant of my fellow man and all that stuff, but really... this is just damned foolish.

    Imagine the conversation that led to this...

    -misty flashback fade-

    Marketing Guy : Oh man, gaming is ready for a revolution!
    Technical Guy : It's called a Wii now

    Marketing Guy : Huh? We now what? -shakes head- I mean these gamers, they buy top end stuff, they have money to burn!

    Technical Guy : Not really, they buy slightly under the curve and tweak up and overclock mostly
    Marketing Guy : No no I read in a magazine that all gamers have more common sense than money
    Technical Guy : -sigh-

    Marketing Guy : These Ageia guys really whipped up a lot of frenzy about a new type of add-on card.
    Technical Guy : Yeah, it's supposed to make the games run better by adding physics processing but the demo..
    Marketing Guy : And they are making money hand over fist!
    Technical Guy : Well, actually...

    Marketing Guy : And it's so easy to make specialty stuff!!
    Technical Guy : But their demo runs the same even without the card!

    Marketing Guy : Wait, Wait, I got it! We'll make a card that adds more CPU power!
    Technical Guy : Well dual cores add lots of CPU power that has yet to be tapped by games
    Marketing Guy : No wait, even better, we'll make it special! That's what made the Ageia guys rich!

    Technical Guy : Listen, the Ageia guys are not selling much, you might not want to...
    Marketing Guy : We'll add better AI! That's IT!

    Technical Guy : Better AI?
    Marketing Guy : Yeah, we'll sell a card that makes the games run better!
    Technical Guy : How's that work?
    Marketing Guy : We'll umm, make it able to process AI commands like a graphics card processes graphics commands.

    Technical Guy : But Graphics Commands are standardized, so they can optimize for that.
    Marketing Guy : We'll get them to standardize AI commands.

    Technical Guy : -twitches- But, every game has different needs from AI
    Marketing Guy : So we'll make it flexible, generic, so it can do anything

    Technical Guy : If it's a generic processor design, it's the same as a regular CPU.
    Marketing Guy : Exactly!

    Technical Guy : But then what is its advantage?
    Marketing Guy : Haven't you been listening? It'll make games play BETTER!
  • by archeopterix ( 594938 ) on Wednesday September 06, 2006 @03:11PM (#16054464) Journal
    I think it won't repeat the success of 3d acceleration, because AI is quite unlike 3D. The key factor in 3d accelerators' success is IMHO a very good set of primitives. If you are fast at drawing large numbers of textured triangles potentially obscuring each other, then you are there (almost) - you can accelerate practically any 3d game. I don't see anything like this in AI. Well, perhaps a generic depth-first search accelerator for brute force algorithms, but the problem I see with that is that the search spaces will vary from game to game, so you probably won't be faster than your current multi-core generic CPU.

    It seems that those guys did what's best under these circumstances - got a specific search space that is common in many games and specialized in that. IMHO, it's not enough to get the snowball rolling, but time will tell.

  • > In fact, AIseek guarantees that with its coprocessor NPCs
    > will always be able to find the optimal path in any title using the processor.

    It has been mathematically demonstrated there is no general pathfinding solution significantly better than trying all possibilities (though pretty much only in degenerate cases could the best path be difficult not to find by a hill-climbing heuristic.)

    Still, it should be trivial to whip up a case that would require these dedicated processors longer than the known
    • by Hast ( 24833 )
      What? A* finds the optimal solution as long as the heuristic is admissible (it never overestimates). There are better algorithms of course, but to claim that you might as well brute force is just silly.

      Or perhaps I just completely misunderstood your statement?
  • Is this the 'way of the future' for PC titles? Will games powered by specific pieces of hardware become the norm?

    Hmm, I thought we had that already... What about all those game consoles with custom video chips and CPU's in them? (PS1, PS2, Xbox, Xbox 360, Gamecube...)

    IMO, this chipset (or at least its functionality) may be more likely to find a home in consoles than as add-on's for PC systems.

    As others have pointed out, the number of people who would drop an extra $100 to get the last erg of performanc

  • AndrAIa [wikipedia.org] is my favorite AI. Making her 200X better is an amazing thought to speculate on.
  • This touts the ability to speed up AI 'thinking' or 'reaction' times. Is this something that really needs to be sped up?

    I believe the current situation of AI is a result of lack of breakthroughs or lazy programming, not that the AI simply reacts too slowly as a result of your CPU not being able to process it fast enough.

    If this is truly the way the market is heading though, and not just a hardware bubble, then I think we will see things like this integrated onto high performance motherboards rather th
  • by ScaryFroMan ( 901163 ) <scaryfroman&hotmail,com> on Wednesday September 06, 2006 @05:03PM (#16055265)
    So for optimal performance, I need two video cards, a physics card, an AI card, a sound card, and a network card. And even then, that's leaving out stuff like a RAID or SCSI controller. Sounds great, but where's a motherboard that can support more than one PCI card with both PCI-E slots filled? Hell, a lot of motherboards can't even handle one.
  • by kickabear ( 173514 ) on Wednesday September 06, 2006 @05:05PM (#16055278) Homepage
    This is just another spoke in the wheel of reincarnation [catb.org]. This too shall pass.
  • by goldcd ( 587052 ) on Wednesday September 06, 2006 @05:32PM (#16055465) Homepage
    entertain this notion? This card is doomed.
    The PhysX card is doomed
    Multicore CPUs are here - no longer some weird expensive ninja-component, they merely cost a few $/£ more than a single core.
    Currently nothing (non-industrial) really takes advantage of multi-core systems - the spiel for them currently seems to be 'Run an AV scan without slowing your game' - that's it.
    *rubs crystal ball*
    What's going to happen is the established middle-ware (i.e those with a product people use now) will develop engines that 'run on a core'. Current core #1 will run the game and core #2 will run the physics and eventually core #3 will be the AI, #4 will run the procedural graphics, #5 will do the 12.1 audio etc.
    If you look at what's being developed for the PS3 (the most insanely multicored CPU so far), the cores are being divided up by function - one for the OS menu, one for the lead character's hair, etc. Threading a single function across multiple cores is not only insanely hard, but hinders cross-platform porting.
    Middleware is just going to be sold to run on one core and ported per platform - and I'm fine with that.
  • If this does for AI what "3D accelerators" did for graphics, then AI is doomed to atrophy.

    Prior to the invention of the 3D accelerator card, 3D graphics was awash with variety and innovation: Duke Nukem 3D's sprite-based engine allowed 3D Realms to simulate real-time mirrored surfaces; Shattered Steel used voxels to create a smooth, contoured landscape oozing atmosphere, then dotted it with metallic polygonal buildings and polygonal enemy vehicles; the Wing Commander games of the time used phong shading t
